And here is one more example of how surprisingly easy it turns out to be to use distributed computing (clusters) for analysis and visualization in GIS.
At this year's DevSummit, we announced the GIS Tools for Hadoop project. Included in that project is a low-level geometry Java API which enables spatial operations in MapReduce jobs, or the construction of higher-level functions such as Hive User Defined Functions. However, this geometry library is not restricted to Hadoop MapReduce: it is also used in GeoEvent Processor, and can be used in Storm bolts or other parallel workflows. One such parallel workflow processing engine is Pervasive DataRush, which I demoed at the DevSummit. Using the KNIME visual workflow environment, I was able to process 90 million records (small, by BigData standards) stored in the Hadoop File System into a heatmap visualization for ArcMap.
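The heatmap step comes down to binning point records into a regular grid of cell counts, which a renderer then colors by density. A minimal, self-contained sketch of that aggregation (the class name and grid parameters are my own illustration, not code from the DataRush workflow):

```java
/** Bins (x, y) point records into a regular grid of cell counts --
 *  the core aggregation behind a heatmap layer. */
public class HeatmapBinner {
    private final double minX, minY, cellW, cellH;
    private final int cols, rows;
    private final long[][] counts;

    public HeatmapBinner(double minX, double minY, double maxX, double maxY,
                         int cols, int rows) {
        this.minX = minX;
        this.minY = minY;
        this.cols = cols;
        this.rows = rows;
        this.cellW = (maxX - minX) / cols;
        this.cellH = (maxY - minY) / rows;
        this.counts = new long[rows][cols];
    }

    /** Adds one point; points outside the extent are silently dropped. */
    public void add(double x, double y) {
        int c = (int) ((x - minX) / cellW);
        int r = (int) ((y - minY) / cellH);
        if (c >= 0 && c < cols && r >= 0 && r < rows) {
            counts[r][c]++;
        }
    }

    public long count(int row, int col) {
        return counts[row][col];
    }

    public static void main(String[] args) {
        // A 10x10 grid over a world extent in degrees.
        HeatmapBinner binner = new HeatmapBinner(-180, -90, 180, 90, 10, 10);
        binner.add(0.0, 0.0);      // lands in cell (5, 5)
        binner.add(1.0, 1.0);      // same cell
        binner.add(-179.0, -89.0); // lands in cell (0, 0)
        System.out.println(binner.count(5, 5)); // 2
        System.out.println(binner.count(0, 0)); // 1
    }
}
```

Because each point maps to exactly one cell and counts are additive, this step parallelizes trivially: partial grids computed on different cluster nodes can simply be summed cell by cell.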
So, to recap: you can store data in HDFS and work on it with the new GeoProcessing tools for Hadoop, or compose DataRush workflows with spatial operators, export them from ArcMap for execution on a cluster, and have the result consumed back by ArcMap for further local GeoProcessing or visualization.
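The geometry library supplies the spatial operators those workflows compose. As a self-contained sketch of the kind of predicate such an operator (or a Hive UDF) wraps, here is a classic even-odd ray-casting point-in-polygon test; the class and method names are my own, not the library's API:

```java
/** Even-odd ray-casting point-in-polygon test -- the kind of spatial
 *  predicate a workflow operator or Hive UDF would wrap. */
public class PointInPolygon {
    /** xs/ys hold the polygon vertices in ring order (ring not closed). */
    public static boolean contains(double[] xs, double[] ys,
                                   double px, double py) {
        boolean inside = false;
        int n = xs.length;
        for (int i = 0, j = n - 1; i < n; j = i++) {
            // Does edge (j -> i) cross the horizontal ray cast from (px, py)?
            if ((ys[i] > py) != (ys[j] > py)
                    && px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i]) {
                inside = !inside; // each crossing flips inside/outside
            }
        }
        return inside;
    }

    public static void main(String[] args) {
        // Unit square.
        double[] xs = {0, 1, 1, 0};
        double[] ys = {0, 0, 1, 1};
        System.out.println(contains(xs, ys, 0.5, 0.5)); // true
        System.out.println(contains(xs, ys, 2.0, 0.5)); // false
    }
}
```

A predicate like this is stateless per record, which is exactly what makes it safe to run inside MapReduce tasks, Storm bolts, or DataRush operators in parallel.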
This is very cool!
It's all simple: write a bit of Java code implementing the intermediate computations, compose a workflow in the visual designer, buy a cluster, and enjoy the speed.
Original post: http://vasnake.blogspot.com/2013/07/geoprocessing-bigdata.html