Salary
60k+ TWD Annually
Job description
RESPONSIBILITIES
- Analyze unstructured data for ingest
- Design and develop ingest processes
- Design and develop Solr schemas for efficiency
- Design and develop interfaces for application searches
- Design network architecture for Solr cluster
- Design and develop incremental update procedures
- Design and build data collection replacement strategy
- Define and configure Solr index schemas for multiple data elements
- Build Solr queries for numerous search requirements
- Design and develop search engine monitoring tools
- Monitor and audit search application performance through load testing, search testing, and query response time testing, and report on these activities on an ongoing basis
REQUIREMENTS
- Bachelor's degree in computer science or a related field, or an equivalent amount of work experience, plus a minimum of 8 years of relevant work experience
- 4+ years of experience using and working with Apache Solr
- 4+ years of experience implementing Solr builds of indexes, shards, and refined searches across unstructured datasets, including architectural scaling
- 5+ years of experience in architectural design and resource planning for scaling Solr capabilities
- 3+ years of experience with automated techniques and processes for bulk indexing of large-scale datasets residing in databases or unindexed systems
- 2+ years of experience working with Scrum or other agile development methodologies
- 2+ years of experience leading development efforts
ADDITIONAL EXPERIENCE
- Hadoop-specific experience
- Familiarity with Entity Resolution technology
- Familiarity with architecture, both logical and physical design
- Experience with distributed queuing with technologies like Kafka
- Engineering experience in Java or Scala is a plus

Infocast
At Infocast, we offer value-driven data analysis and processing, providing insight and guiding decision-making.