FME is recognized in the mainstream IT industry as a premier Extract, Transform, Load (ETL) tool. It is well known that FME supports nearly 500 different data formats, with a focus on geospatial data. The ability of FME Desktop and FME Server to execute Application Programming Interface (API) calls is integral to integrating and testing cloud-based data formats via a public-facing API. Pivvot uses FME as a test harness and framework for its internal geospatial ETL and Web Map Service (WMS) RESTful API endpoints. This presentation provides an overview of how FME is used to test the veracity of Pivvot's public-facing API endpoints.
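The kind of endpoint check described above can be sketched in a few lines of plain code. The following is a minimal illustration of validating a REST response against expectations; the expected keys and the canned response are invented for the example and do not reflect Pivvot's actual API contract:

```python
# Minimal sketch of a REST endpoint smoke test: validate the HTTP status
# code and response structure against expectations. The expected_keys and
# canned payload below are illustrative assumptions, not Pivvot's API.

def validate_response(status_code, payload, expected_keys):
    """Return a list of failure messages; an empty list means the check passed."""
    failures = []
    if status_code != 200:
        failures.append(f"expected HTTP 200, got {status_code}")
    for key in expected_keys:
        if key not in payload:
            failures.append(f"missing key in response body: {key!r}")
    return failures

# Example run against a canned response (no live endpoint needed):
canned = {"type": "FeatureCollection", "features": []}
print(validate_response(200, canned, ["type", "features"]))  # → []
print(validate_response(404, {}, ["type"]))
```

In a real harness the status code and payload would come from an HTTP call; keeping the validation logic separate, as here, makes it testable without a live endpoint.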
Tony Scalese, Edgewater Ranzal Oracle Financial Data Management (FDM) practice director, presented "Getting the Most Out of FDMEE in a Multiproduct Environment" at KScope14.
When running web application performance tests, we usually present statistics from tools that simulate real users as the test results. This approach works for APIs and for simple sites that do not require additional resources such as JavaScript, CSS, or image files. For most modern websites, measurements collected that way give stakeholders information only about the performance and stability of the first layer of the application, and no information about performance as seen from the user's perspective. To collect such information we need other tools and approaches. In my presentation I would like to share the experience my team gained when designing and implementing a performance testing strategy that aims to give a 360-degree view of application performance. Such a solution gives developers fast feedback about performance changes, provides testers with tools to estimate site performance before going live, and measures the customer experience. In particular, I would like to present:
- rules for building high-performance web sites
- the concepts of synthetic and real-user measurements
- when, and for what, to use load generators
- potential issues with test environments
- metrics that can be collected
- the JavaScript Resource and Performance APIs
- open source and commercial tools that can be used
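One building block behind the metrics mentioned above is summarizing raw timing samples into percentiles, as both synthetic load generators and real-user monitoring tools commonly do. A minimal sketch, with invented sample values:

```python
# Sketch: aggregate page-load timing samples (in milliseconds) into the
# percentile summaries typically reported by performance-testing tools.
# The sample values below are invented for illustration.
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a non-empty list of numbers (pct in 0..100)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

load_times_ms = [120, 135, 150, 180, 210, 260, 340, 520, 890, 1400]

summary = {
    "median": percentile(load_times_ms, 50),
    "p90": percentile(load_times_ms, 90),
    "p99": percentile(load_times_ms, 99),
}
print(summary)  # → {'median': 210, 'p90': 890, 'p99': 1400}
```

Percentiles matter here because a mean hides the slow tail; the p90/p99 values are usually what users actually complain about.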
The webinar will explore how one GIS project enabled MTEMC to transform the way GIS is used throughout the utility.
This document provides an introduction to working with different types of test steps in SoapUI, including protocol-oriented test steps for SOAP and REST requests, flow control test steps, properties, data-oriented test steps using data sources and data sinks, and exercises for practicing with these step types. It covers creating and configuring REST projects using URIs, WADLs, and service discovery in SoapUI.
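The data-source/data-sink pattern described above — read test cases from a source, execute one request per case, assert on the result, and write outcomes to a sink — can be sketched generically. This is an illustration of the pattern, not SoapUI's API; the request function is a stand-in:

```python
# Sketch of a data-driven test loop: a data source feeds cases, each case
# is executed, and outcomes flow to a data sink, mirroring the data-source /
# data-sink test steps described above. fake_request is a stand-in for a
# real SOAP/REST call.

def fake_request(payload):
    # Stand-in for a real request; echoes the input value doubled.
    return {"result": payload["value"] * 2}

def run_data_driven(cases, request_fn, expected_key="result"):
    sink = []  # the "data sink": collected pass/fail outcomes
    for case in cases:  # the "data source": one test case per row
        response = request_fn(case["input"])
        passed = response.get(expected_key) == case["expected"]
        sink.append({"name": case["name"], "passed": passed})
    return sink

cases = [
    {"name": "doubles 2", "input": {"value": 2}, "expected": 4},
    {"name": "doubles 5", "input": {"value": 5}, "expected": 10},
    {"name": "wrong expectation", "input": {"value": 3}, "expected": 7},
]
print(run_data_driven(cases, fake_request))
```

The value of the pattern is that adding a test case means adding a data row, not writing new test logic.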
Join us to discover actionable insights for elevating your airport operations. We've partnered with three airports and contractor teams to share real-world FME applications in airport environments: In Segment 1, hear from Hartsfield-Jackson Atlanta International Airport and Axim Geospatial on how FME streamlined the flow of data between their Cityworks Asset Management System and their GIS, enhancing airport staff's experience by reducing manual effort and paperwork duplication. Segment 2 will take you on Philadelphia International Airport’s geospatial roadmap journey, alongside their Spatial Services group, C&S Companies. Discover how FME is integral in bridging data integration gaps between BIM/CAD/GIS and their asset management platform, overcoming previous hurdles and improving data integration efficiency. In the final segment, join Woolpert in the exploration of the ETL processes implemented at Orlando International and Fort Lauderdale International airports, facilitating the smooth flow of data between BIM, GIS, and CAD systems. Learn the importance of data standards from firsthand experience with the development of automation processes. They’ll also share the significant time and cost-saving benefits reaped, and discuss future applications of this technology in data QA/QC tasks. Don’t miss the chance to discover how FME is transforming airport operations, paving the way for enhanced efficiency and reduced operational costs. Register today!
Pivvot utilizes FME at all levels of the data curation and ingestion process. Pivvot is a location intelligence company offering its customers cloud-driven, web-based siting and optimization software. All of the data that drives the Pivvot applications is geospatial. The data is derived from public sources and kept evergreen in the database. Understanding and documenting the provenance (source) and pedigree (timeliness, concurrency) of each of the 500 million rows of data in the Pivvot database requires detailed attention to process and metadata. FME is used in every phase: data curation (research and cleaning), transformation (standardization), and loading into the platform. The applications on the Pivvot platform provide Pivvot's customers with the latest data they need to decide where to site their renewables projects. None of this would be possible without FME Desktop and FME Cloud. This presentation outlines how Pivvot uses FME to transform the data required to keep the Pivvot applications running.
This document discusses strategies for modernizing applications and moving workloads to Kubernetes and container platforms like Pivotal Container Service (PKS). It recommends identifying candidate applications using buckets based on factors like programming language, dependencies, and access to source code. It outlines assessing applications' business value and technical quality using Gartner's TIME methodology to prioritize efforts. The document provides an overview of PKS and how it can provide benefits like increased speed, stability, scalability and cost savings. It recommends starting projects by pushing a few applications to production on PKS to measure ROI metrics.
The document discusses different tools for monitoring application, network, and infrastructure topologies in an enterprise environment. It recommends tools that can monitor across all three silos and provide out-of-the-box support for most enterprise stacks, both on-premises and in the public cloud. Boundary, AppFirst, LogicMonitor, ScienceLogic, and DataDog are highlighted as tools that meet these criteria. Key selection factors include integration coverage, ease of use, cost, and example enterprise customers.
The document describes several case studies of using FME for spatial data processing and management tasks: 1) The Iowa Department of Transportation uses FME to ingest and publish snow plow data from vehicle trackers and dashcams for public viewing to improve winter driving safety. 2) A Spanish region uses FME to convert and publish spatial data on GitHub for open collaboration. 3) A water authority in Belgium created a comprehensive waterways data quality assurance process using FME and a Django interface to validate and manage results.
The document discusses Nanocomp's development of a manufacturing reporting and monitoring system called the Nanocomp Gateway. It evolved from an initial OQC reporting system in 2012 to include real-time process and production monitoring displays accessible via a status board. The system utilizes a mixed cloud approach with the Microsoft Azure cloud platform and local data processing. It was developed over 2012-2015 in a collaboration between Nanocomp and SoftColor involving 48 work days. Lessons learned included that PaaS saves time and that rich client applications provide the best user experience. Future plans include integrating additional Azure and Office 365 capabilities.
FME Workbench is software, and like any software in the Software Development Life Cycle (SDLC), it must go through a testing phase. A developer creating an FME Workspace is writing code, albeit through a very friendly user interface. A finished FME Workspace generates output data that should be "tested somehow" against requirements to ensure the highest level of quality. The answer to "tested somehow" is to apply the FME Integration Universal Test Framework (FIUTF), or at least the idea and concept behind it. The data testing concept is very simple: compare (test-case assertion) the FME Workspace writer output (actual) against a previously prepared "golden" output (expected) and collect a test report (passed/failed tests). FIUTF is itself mainly an FME Workbench workspace that can be adapted in many ways (even in Jenkins CI/CD) on both FME Desktop and FME Server. FIUTF automatically generates a test report for the most popular input data formats: CSV, GDB, GeoJSON, SHP, GPKG, GML, SQLite, DGN, DXF. The test report is prepared in a very readable .html or .xls form and contains metadata with test cases covering data structure, data values, and data geometry. Additionally, data visualization and potential data defects are presented on an .html map to better identify where an issue exists. During the presentation, the author will demo FIUTF to encourage the audience to automate testing of output data from the FME software they like most. For more details, visit FME Hub, where the framework is deployed.
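The actual-versus-golden comparison at the heart of the concept above can be sketched outside FME. The following is an illustration of the idea for CSV output only, not FIUTF's actual implementation; the field names and sample rows are invented:

```python
# Sketch of golden-output testing: compare actual writer output against a
# prepared "golden" expected output, row by row and field by field, and
# collect passed/failed results. Illustrative only, not FIUTF itself.
import csv
import io

def compare_csv(actual_text, expected_text):
    actual = list(csv.DictReader(io.StringIO(actual_text)))
    expected = list(csv.DictReader(io.StringIO(expected_text)))
    report = []  # (test case, passed, detail)
    if len(actual) != len(expected):
        report.append(("row count", False, f"{len(actual)} != {len(expected)}"))
    for i, (a_row, e_row) in enumerate(zip(actual, expected)):
        for field, e_val in e_row.items():
            ok = a_row.get(field) == e_val
            report.append((f"row {i} field {field!r}", ok,
                           "" if ok else f"{a_row.get(field)!r} != {e_val!r}"))
    return report

golden = "id,name\n1,road\n2,river\n"
actual = "id,name\n1,road\n2,lake\n"
for case, passed, detail in compare_csv(actual, golden):
    print("PASS" if passed else "FAIL", case, detail)
```

A full framework would extend the same assertion idea to geometry and to the other formats listed above, but the pass/fail report structure stays the same.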