GIS Specialist

Ryan Mahoney

Why this role is hard

Finding the right GIS analyst for this role means looking past flashy software skills to find someone who can clean up messy transit electrification data on their own. You want a person who refuses to let poor map quality slide and catches wrong coordinates before they ruin the fleet routing model. The real test comes when a planned utility line runs into an unmarked pipeline. How they handle that situation shows whether they actually care about the final product or just check a box to move on.

Core Evaluation

Critical questions for this role

The competency and attitude questions below are where the hiring decision is made. They run in the live interview rounds and are calibrated to the target level for this role.

16 Competency Questions

1 of 16
  1. Discipline

    Geospatial Data & Systems Architecture

  2. Job requirement

    Geospatial Database Engineering

    Performs routine data imports, executes schema validation checks, and maintains basic database structures under supervision.

  3. Expected at Junior

At this level, the role operates under supervision on routine imports and schema checks, and only basic working proficiency with guidance is expected; independent schema design and optimization are reserved for higher levels.

Interview round: Hiring Manager Technical

Walk me through how you incorporated survey or CAD files into an existing geodatabase for a past project.

Positive indicators

  • Outlines step-by-step import workflow with pre-validation
  • Notes escalation of format or schema mismatches
  • Maintains clear audit trail of changes

Negative indicators

  • Skips coordinate system checks or schema validation
  • Edits core system tables directly
  • Lacks documentation of source data lineage

12 Attitude Questions

1 of 12

Active Listening

Active Listening is the disciplined practice of fully concentrating on, understanding, and retaining both explicit and implicit communication from others, while consciously withholding premature evaluation or response. In technical and collaborative settings, it involves accurately capturing stakeholder requirements and operational constraints, reflecting back key insights to verify mutual understanding, and systematically integrating received feedback into decision-making and model development without defensiveness or cognitive bias.

Interview round: Recruiter Screen

During a handoff meeting, a construction manager describes a complex right-of-way boundary issue using informal terms and sketches. How do you process this information to update the geodatabase?

Positive indicators

  • Uses structured questioning to decode informal descriptions
  • Maps verbal cues directly to geodatabase attribute fields
  • Validates understanding with the manager before proceeding

Negative indicators

  • Makes assumptions about informal spatial terminology
  • Enters unverified data directly into production systems
  • Fails to document the translation process for audit trails

Stage 2 · Resume Screening

Read the resume against fixed criteria

Reviewers score every application that clears the door against the same criteria. Stronger applications advance to live interviews; weaker ones are archived without further screening.

Resume Review Criteria

8 criteria
  • Evidence of processing, cleaning, and validating spatial datasets using industry-standard GIS software and scripting languages, with documented accuracy metrics or coordinate transformations.
  • Evidence of modeling physical constraints and spatial intersections for transit or utility infrastructure, such as turning envelopes, overhead clearances, or underground conflicts.
  • Evidence of ingesting GPS/tracking data and configuring real-time spatial visualizations or dashboards for operational monitoring.
  • Evidence of maintaining spatial database provenance, adhering to standard operating procedures, and documenting data sources for downstream reliability.

Is the resume complete, well-organized, and free from formatting, spelling, and grammar mistakes?

Does the resume show relevant prior work experience?

Does the cover letter or personal statement convey clear relevance and familiarity with the job?

Does the resume indicate required academic credentials, relevant certifications, or necessary training?

Stage 3 · During Interviews

Where the hire is decided

Interview rounds use the competency and attitude questions outlined above, then add tests, work simulations, and presentations that reveal deeper evidence about how the candidate thinks and works.

Coding Test

1 of 2

Live Interview · Coding Test

Without AI

Complete the function to identify overlapping geometries between clearance zones and ROW boundaries. Return a GeoDataFrame of conflicts that exceed the specified tolerance area.

Implement the spatial intersection logic using standard geospatial libraries. Ensure the function handles mismatched CRS inputs safely and filters out 'sliver polygons' (tiny false positives from digitization error) below the tolerance threshold.
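The kind of answer the task is asking for can be sketched as follows, assuming GeoPandas and Shapely as the "standard geospatial libraries"; the function name, default EPSG code, and tolerance value are illustrative choices, not part of the prompt.

```python
# Minimal sketch of a clearance/ROW conflict detector (assumed names throughout).
import geopandas as gpd

def find_conflicts(clearance_zones: gpd.GeoDataFrame,
                   row_boundaries: gpd.GeoDataFrame,
                   tolerance_m2: float = 1.0,
                   working_epsg: int = 26918) -> gpd.GeoDataFrame:
    """Return clearance/ROW overlaps larger than tolerance_m2 square metres."""
    # Reproject both layers to a common projected CRS so area math is metric
    # and mismatched inputs cannot silently intersect in different systems.
    zones = clearance_zones.to_crs(epsg=working_epsg)
    rows = row_boundaries.to_crs(epsg=working_epsg)

    # Drop null/empty geometries before the overlay to avoid runtime errors.
    zones = zones[zones.geometry.notna() & ~zones.geometry.is_empty]
    rows = rows[rows.geometry.notna() & ~rows.geometry.is_empty]

    # Vectorized overlay rather than naive pairwise iteration.
    conflicts = gpd.overlay(zones, rows, how="intersection")

    # Filter sliver polygons (digitization noise) below the area tolerance.
    return conflicts[conflicts.geometry.area > tolerance_m2]
```

Reprojecting before intersecting matters because areas computed in unprojected lat/lon degrees are not comparable to a tolerance expressed in square metres.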

With AI

Use AI to generate baseline intersection code, then critically evaluate and modify it to handle ambiguous municipal boundary overlaps and justify your tolerance strategy.

AI tools will easily produce a basic spatial join. Your task is to architect a robust conflict-detection pipeline that: (1) decides whether to use buffered intersections or direct overlays based on data provenance, (2) explicitly handles cases where ROW boundaries partially overlap clearance zones due to historical survey discrepancies, and (3) documents a defensible tolerance strategy that balances safety compliance with field verification costs. Modify the AI output to reflect your engineering judgment on these tradeoffs.
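One way to make the tolerance strategy explicit and documentable is a small classifier that separates digitization noise from overlaps worth field review; the threshold values here are illustrative assumptions, not compliance figures.

```python
def classify_overlap(overlap_area_m2: float,
                     sliver_max_m2: float = 1.0,
                     review_max_m2: float = 25.0) -> str:
    """Classify an intersection by area (thresholds are assumed examples)."""
    if overlap_area_m2 <= sliver_max_m2:
        return "discard"          # digitization noise below tolerance
    if overlap_area_m2 <= review_max_m2:
        return "flag_for_review"  # plausible historical survey discrepancy
    return "violation"            # clear clearance conflict
```

Keeping the thresholds as named parameters gives the candidate something concrete to defend: each value can be traced to survey accuracy and field-verification cost rather than left arbitrary.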

Response time

20 min

Positive indicators

  • Correctly transforms CRS to a common projected system before intersection
  • Uses efficient spatial join or overlay operations rather than naive iteration
  • Implements an area-based filter to exclude sliver polygons below the tolerance
  • Handles edge cases like null geometries or empty inputs gracefully
  • Explicitly rejects naive AI-generated joins in favor of a buffered overlay or distance-threshold approach justified by survey accuracy
  • Implements a configurable tolerance filter with clear documentation on how partial overlaps are classified (e.g., flag for review vs auto-reject)
  • Adds schema validation and CRS enforcement before spatial operations
  • Explains why specific AI suggestions were modified or discarded based on municipal boundary ambiguity

Negative indicators

  • Performs intersection in unprojected lat/lon coordinates, yielding inaccurate area calculations
  • Fails to filter digitization artifacts, returning noisy false positives
  • Uses inefficient nested loops instead of vectorized spatial operations
  • Lacks error handling for missing or malformed geometries
  • Accepts AI output verbatim, leaving sliver polygons unfiltered and tolerance thresholds arbitrary
  • Fails to address historical survey discrepancies, treating all overlaps as definitive violations
  • Lacks documentation on tolerance tradeoffs or boundary ambiguity handling
  • Does not demonstrate critical evaluation of AI-generated spatial logic

Presentation Prompt

Walk us through your approach to resolving conflicting municipal utility records and fragmented as-built surveys when producing an auditable corridor GIS dataset. Slides are optional; you can talk through your reasoning or bring a few reference artifacts if helpful. Discuss how you maintain strict data lineage, what QA/QC protocols you apply, and how you escalate anomalies to prevent downstream design rework.

Format

approach-walkthrough · 20 min · ~2 hr prep

Audience

GIS team leads and senior spatial data engineers

What to prepare

  • No formal slides required; a verbal walkthrough is sufficient.
  • Optional: 1-2 anonymized examples of past QA/QC checklists, data lineage logs, or conflict resolution workflows.

Deliverables

  • A structured verbal walkthrough of your QA/QC and data reconciliation process
  • Optional: 1-2 anonymized artifacts demonstrating how you track and resolve spatial data conflicts

Ground rules

  • Use only work you are permitted to share or describe hypothetically.
  • Focus on your reasoning, escalation paths, and quality standards, not proprietary client data or sensitive municipal records.

Scoring anchors

Exceeds
Proactively identifies hidden data lineage risks, defines robust multi-stage QA/QC gates, and articulates a clear, defensible escalation framework that protects downstream engineering.
Meets
Outlines a standard QA/QC process with reasonable tolerance thresholds and a basic escalation path for conflicting records.
Below
Fails to address data lineage or anomaly escalation, relies on ad-hoc fixes, or cannot articulate measurable quality standards.

Response time

20 min

Positive indicators

  • Asks clarifying questions about data source reliability before proposing a reconciliation method
  • Surfaces explicit tolerance thresholds and confidence intervals for spatial accuracy
  • Demonstrates a structured escalation path for anomalies that prevents downstream rework
  • Translates technical QA/QC protocols into clear, actionable steps for cross-functional teams

Negative indicators

  • Jumps to a solution without framing the data lineage problem or source conflicts
  • Relies on vague assurances of accuracy without defining measurable QA checkpoints
  • Dismisses the need for stakeholder alignment or escalation protocols
  • Overcomplicates the workflow without considering project scope or timeline constraints

Work Simulation Scenario

Scenario. You are tasked with consolidating three legacy municipal utility datasets into a single enterprise geodatabase to support a new zero-emission corridor project. The datasets use different coordinate systems, have inconsistent attribute schemas, and lack clear metadata lineage.

Problem to solve. Walk us through your approach to diagnosing the data quality issues, aligning the spatial and tabular schemas, and establishing a repeatable ingestion and QA/QC workflow that will survive engineering review.
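A strong answer separates spatial alignment from attribute normalization; that phasing can be sketched in Python with GeoPandas, where the alias map and target EPSG are hypothetical stand-ins for a documented attribute crosswalk and the project's chosen projection.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical crosswalk from inconsistent legacy field names to one schema.
ALIAS_MAP = {"Mat": "material", "MATERIAL": "material",
             "Dia_in": "diameter_in", "DIAM": "diameter_in"}

def consolidate(layers: list[gpd.GeoDataFrame],
                target_epsg: int = 2263) -> gpd.GeoDataFrame:
    """Merge legacy layers: reproject first, then normalize attributes."""
    merged = []
    for layer in layers:
        layer = layer.to_crs(epsg=target_epsg)   # spatial alignment
        layer = layer.rename(columns=ALIAS_MAP)  # attribute normalization
        merged.append(layer)
    return pd.concat(merged, ignore_index=True)
```

In practice each phase would sit behind a validation checkpoint (CRS confirmed against source documentation, unmapped fields escalated rather than guessed) before anything reaches the enterprise geodatabase.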

Format

discovery-interview · 20 min · ~1 hr prep

Success criteria

  • Identifies critical missing information before proposing solutions
  • Articulates a clear, stepwise data reconciliation and validation process
  • Demonstrates understanding of coordinate transformations and metadata standards

What to review beforehand

  • Basic Esri geodatabase architecture and schema design principles
  • Common coordinate system transformation workflows
  • Standard GIS QA/QC protocols for legacy data integration

Ground rules

  • This is a discussion, not a deliverable production exercise.
  • Ask clarifying questions to gather missing context before outlining your approach.
  • You will be evaluated on how you frame the problem, surface assumptions, and sequence your decisions.

Roles in scenario

GIS Data Steward (informed_partner, played by hiring_manager)

Motivation. Needs a reliable, auditable dataset for downstream engineering design but lacks bandwidth to manually clean legacy records.

Constraints

  • Original data sources are partially archived and some field notes are missing.
  • The project has a fixed 48-hour SLA for the initial consolidated layer.
  • Engineering requires strict adherence to FGDC metadata standards.

Tensions to introduce

  • One dataset uses a local state plane projection while the others are in WGS84.
  • Attribute naming conventions are completely inconsistent across sources.
  • Previous attempts at merging resulted in topology errors that stalled permitting.

In-character guidance

  • Answer questions honestly and concisely.
  • Wait for the candidate to ask before providing details.
  • If asked about past workflows, describe them factually without suggesting improvements.
  • Acknowledge time pressure but do not volunteer shortcuts.

Do not

  • Do not volunteer information the candidate did not explicitly ask for.
  • Do not steer the candidate toward a preferred software tool or methodology.
  • Do not solve the data alignment problem for them or provide a step-by-step fix unprompted.

Scoring anchors

Exceeds
Proactively diagnoses data lineage gaps, sequences a robust QA/QC pipeline with explicit fallback protocols, and clearly articulates how the workflow ensures engineering-grade auditability under tight deadlines.
Meets
Asks necessary clarifying questions about projections and schemas, proposes a logical merge-and-validate sequence, and acknowledges metadata requirements without overcomplicating the approach.
Below
Assumes data compatibility without verification, proposes an unsequenced or overly simplistic merge, and fails to address topology, metadata, or SLA constraints.

Response time

20 min

Positive indicators

  • Asks high-information clarifying questions about source CRS, attribute dictionaries, and existing topology rules.
  • Explicitly surfaces assumptions about data lineage and proposes a validation checkpoint before merging.
  • Frames a phased approach that separates spatial alignment from attribute normalization.
  • References metadata standards and QA/QC protocols as non-negotiable baselines.

Negative indicators

  • Guesses projection parameters or schema mappings without asking for source documentation.
  • Freezes or defaults to generic statements like 'I'd just run a merge tool' without sequencing steps.
  • Fails to identify the risk of topology errors or metadata gaps before proposing a workflow.
  • Overlooks the 48-hour SLA constraint when designing a multi-step reconciliation process.

Progression Framework

This table shows how competencies evolve across experience levels. Each cell describes the expected behavior at that level.

Geospatial Data & Systems Architecture

3 competencies

Competency · Junior · Mid · Senior · Principal
Geospatial Database Engineering

Performs routine data imports, executes schema validation checks, and maintains basic database structures under supervision.

Designs and maintains geodatabase schemas, implements automated ETL pipelines, and optimizes storage for complex spatial queries to support independent analytical workflows.

Architects enterprise spatial database solutions, establishes data governance frameworks, and mentors staff on advanced modeling best practices.

Defines long-term geospatial data architecture strategy, aligns database capabilities with organizational digital transformation, and sets enterprise-wide spatial data standards.

Spatial Data Quality Assurance

Executes predefined QA/QC scripts, flags geometric and attribute discrepancies, and documents validation results.

Develops automated validation workflows, establishes spatial accuracy thresholds, and manages formal data certification processes to ensure enterprise reliability.

Designs comprehensive QA/QC frameworks, implements continuous data monitoring, and leads root-cause analysis for spatial defects.

Establishes enterprise geospatial quality governance, integrates automated validation into CI/CD data pipelines, and sets industry-leading accuracy benchmarks.

Utility Network & Linear Asset Management

Maintains linear asset records, performs basic network connectivity checks, and updates attribute tables for routing and tracing.

Configures utility network topologies, validates complex tracing rules, and manages spatial relationships for multi-modal infrastructure to support grid constraint integration.

Designs advanced network data models, optimizes tracing performance, and establishes asset lifecycle integration protocols.

Defines enterprise utility network architecture, aligns spatial topology with engineering standards, and leads digital twin and network analytics strategy.

Spatial Analytics & Operational Business Value

3 competencies

Competency · Junior · Mid · Senior · Principal
Infrastructure Siting & Compliance

Gathers site data, applies basic regulatory overlays, and supports environmental compliance mapping and documentation.

Conducts multi-criteria site suitability analyses, automates compliance reporting workflows, and manages spatial datasets for permitting and environmental overlays.

Develops complex siting models incorporating environmental, regulatory, and operational constraints, and advises on risk mitigation strategies.

Establishes strategic siting frameworks, integrates regulatory intelligence into enterprise GIS, and leads cross-agency spatial compliance initiatives.

Spatial Analytics & Routing Optimization

Executes standard spatial analyses, runs predefined routing models, and generates basic analytical maps for planning support.

Develops custom spatial algorithms, optimizes routing parameters for transit corridors, and integrates analytical outputs into operational dashboards to support fleet planning.

Leads advanced spatial modeling initiatives, validates analytical methodologies, and translates complex location intelligence into actionable business recommendations.

Pioneers next-generation spatial analytics frameworks, integrates predictive routing into strategic fleet planning, and drives cross-departmental adoption of location intelligence.

Telemetry Integration & Lifecycle Tracking

Ingests and cleans telemetry feeds, updates asset status records, and supports basic tracking visualizations.

Integrates real-time IoT streams with spatial databases, automates lifecycle tracking workflows, and develops operational monitoring dashboards for asset visibility.

Architects telemetry-to-GIS integration pipelines, optimizes data refresh rates, and aligns spatial tracking with procurement and TCO financial models.

Defines enterprise telemetry integration strategy, drives real-time spatial analytics adoption, and aligns lifecycle tracking with organizational financial and operational goals.