Today it was my turn to present in the real estate cadastre course in Bolivia. The subject was oriented toward reflecting on how to choose a software tool for a geomatics development.
This is the graphic I used, and my focus was the analysis of the context in which we hope to implement the solution.
The point is that when choosing a tool for data capture, you should consider not only its ability to draw vectors, but also whether it can be sustained at the scale required: users who access it from different levels, and the number of users who will need licenses.
Among the criteria we considered, whose weight may vary depending on the country context or scope, are:
- OGC Standards
- Learning curve
- Speed vs. number of users
- Modular growth
- Availability of programming interfaces (APIs)
- Integral cost
We then divided the geomatics context into at least six stages and weighed the importance of the above criteria at each one. For each stage, the users or specialists can propose a list of specific features, and these are given weights in order to compare the advantages and disadvantages of different solutions:
1. The construction stage
Here the solution is basically expected to be effective and practical for high-volume production by the technicians who come in from the field, digitize, clean topology, integrate databases, and interact with imagery or map services.
2. The administration stage
Here it is expected that the produced data can meet standards so as to be accepted into a database or a versioned file manager. Aspects such as format sustainability and available APIs are very important. And of course, the database-management solutions sought at this level are expected to have attractive interfaces, perform well in multi-user environments, and be able to store tabular data as well as geometry and raster indexes.
3. The exchange stage. This is a second level of publishing, in which data is expected to be served in XML, GML, or other formats supported by OGC standards — products we hope will be used by other geomatics solutions but also returned modified. And, it must be said, the ability to convert under geofumadas standards, including the option of vector simplification... yes, well geofumados.
4. The publishing stage. At this level it is assumed that the data-construction solutions can be transformed to OGC standards, and that the data-serving tools offer enough customization that data can be served and also look artistically attractive.
5. The maintenance stage. This is a second level of construction, in which the tools are expected to allow customized access for preserving versioned results, historical storage of changes, and again, ease of precise construction. If possible, the option to make graphical annotations through an ActiveX control that works online... even better.
6. The backup stage. I have called it that, but it is actually a stage of access repositories, where users within the institution access and transform data, provide support, and generate new products. Here the requirements on the CAD/GIS solution come down to format stability and the ability to support versioning, while the management tools need broad development availability, security standards, and client-server functionality.
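The stage-by-stage weighting described above can be sketched as a simple weighted decision matrix. This is only an illustration — the criteria weights and the scores for the two anonymous candidate solutions are hypothetical, not the figures used in the actual exercise:

```python
# Hypothetical criteria weights for one stage (e.g. construction), summing to 1.0.
# These values are illustrative only.
weights = {
    "ogc_standards": 0.10,
    "learning_curve": 0.25,
    "speed_vs_users": 0.30,
    "modular_growth": 0.10,
    "apis": 0.10,
    "integral_cost": 0.15,
}

# Scores (1-5) that evaluators assign to each candidate solution per criterion.
scores = {
    "solution_a": {"ogc_standards": 3, "learning_curve": 4, "speed_vs_users": 5,
                   "modular_growth": 3, "apis": 2, "integral_cost": 2},
    "solution_b": {"ogc_standards": 4, "learning_curve": 2, "speed_vs_users": 3,
                   "modular_growth": 4, "apis": 5, "integral_cost": 5},
}

def weighted_score(solution_scores, weights):
    """Sum of score * weight over all criteria."""
    return sum(solution_scores[c] * w for c, w in weights.items())

for name, s in scores.items():
    print(name, round(weighted_score(s, weights), 2))
```

Repeating this per stage, with weights adjusted to each stage's priorities, gives a comparable total per solution.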
Although the principle is to apply a feature test to different solutions at each stage of the process, we must not forget the integral context. So we closed with a quick exercise: suppose an important client, such as a national cadastre institute, wishes to implement a complete system for an environment of some 20 CAD/GIS production technicians, 3 developers, 75 intranet users, and many online consultations (we omitted the cost of Oracle at $30,000 per processor per year, software development, equipment, and implementation):
Doing it with AutoDesk could cost up to $180,000, with limitations in the repository phase that would have to be complemented with other brands, and in the hardware resources required to serve data efficiently under heavy post-processing loads.
Doing it with Bentley could cost up to $210,000, with limitations in the exchange phase, publishing pulled off by the hair, and something of a learning curve.
Doing it with ESRI could cost up to $300,000, with limitations in the construction and repository phases, which would have to be complemented with other brands; apart from that, along the way it could turn out that 10 licenses of a $9,000 extension are required.
Doing it with Manifold could cost $15,000, with limitations in the construction phase, the learning curve, and the need for first-rate developers (although in all cases a lot of development is necessary). I should clarify that there are other low-cost solutions, but I use this one because I have tried it lately and it surprised me.
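The four license estimates above can be put side by side in a quick sketch (figures as quoted in the text; the Oracle, development, hardware, and implementation costs omitted there are omitted here as well):

```python
# Rough upper-bound license costs quoted above, in USD.
estimates = {
    "AutoDesk": 180_000,
    "Bentley": 210_000,
    "ESRI": 300_000,
    "Manifold": 15_000,
}

# Identify the extremes and the gap between them.
cheapest = min(estimates, key=estimates.get)
most_expensive = max(estimates, key=estimates.get)
spread = estimates[most_expensive] - estimates[cheapest]

print(cheapest, most_expensive, spread)  # Manifold ESRI 285000
```

Of course, the raw license price says nothing about the per-stage limitations noted above; that is exactly why the weighted comparison matters.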
In the worst case I have $155,000 left over to hire the writing of good manuals, and if I play with runtime licenses I can tempt the client's ego.
It is curious that almost all of it could be done with FOSS — pure gvSIG/GRASS, PostgreSQL, IntelliCAD and other herbs — if I managed to put together a process-systematization team, geofumadas developers, and the credibility to sell the project. And if the client had budgeted $700,000, I could push harder, because the larger the user base, the easier it is to justify free or low-cost software.
| Vendor | License mix considered |
|---|---|
| AutoDesk | 2 Raster Design, 2 Civil 3D, NavisWorks? + Topobase, development in the wild |
| Bentley | 7 Bentley Map, 13 Bentley Cadastre, 10 Bentley PowerMap, Oracle 10G, Geoweb Publisher + interoperability, ProjectWise ("mmm... to cry has been said") |
| ESRI | One GIS Server on another processor |
| Manifold | 20 Manifold Universal licenses, Manifold Enterprise, Oracle 10G, Universal Runtime, development of extras |
In summary, I hope to have aroused your curiosity about free and low-cost solutions, although time was too short to go further. We came away with several brief conclusions:
- The right technology is "the one that can be sustainable" within the global context of the development
- There can never be one that is "good for everything"
- The "economic" aspect should be thought of in terms of the "technology life cycle" and its interoperability
- Documented processes (systematization) extend the life cycle of technologies
- Not everyone is ready for free software; to start, "commercial" applications are preferable; with experience you can think of "low cost" applications, and with audacity, "free" or "in-house" ones