Job Description
THIS IS A 100% REMOTE ROLE. WE ARE OPEN TO MID-LEVEL TO ARCHITECT-LEVEL CANDIDATES.
About NTENT
We are a unique group of brilliant minds intent on discovering, learning and building. This is an opportunity to tackle complex problems usually reserved for a handful of large companies in the search industry and to build cutting-edge Machine Learning and AI based applications.
NTENT provides a Platform-as-a-Service (PaaS), allowing industry partners to customize, localize and integrate Search technologies directly into their business-to-consumer offerings. NTENT utilizes advanced machine learning to decipher meaning and surface the most relevant answers.
About the Opportunity:
We are looking for a mid- to architect-level Java Developer with the ability to deliver world-class search engine technologies. You'll be working with a smart team of machine learning scientists and software developers on both full applications and the tools behind NTENT's powerful search platform.
Duties and Responsibilities:
- Building the backend engine that runs our product. This includes extending our existing Machine Learning and Big Data pipelines and building entirely new capabilities, including:
- Big Data cluster, workflows and applications: data pipelines at scale, and real-time processing
- Machine learning and Data Scientist support: used in linguistics, ranking, classification, and other artificial intelligence applications
- Ingestion Pipeline: process data that comes from our web crawler which discovers and fetches content from the web and other sources
Skills and Qualifications:
- Bachelor's degree or equivalent years of experience
- Solid experience with Java programming (we also use Spring, Spring Webflux, Reactor, Netty)
- Multi-threading experience is a must
- Experience in scalable architectures and high-throughput application design
- Experience working with microservices/REST APIs (HTTP, XML, JSON)
- Comfortable in Linux and Windows environments
- Some experience with Big Data technologies is preferred (at least one of the following):
- Hadoop ecosystem (HDFS, Hadoop, Hive)
- Spark
- Samza
- Kafka
- Aerospike
- Lucene (Solr or Elasticsearch)
- The following experience is a plus:
- Machine Learning
- Continuous Integration and Deployment
- Tool usage: Git/Gitlab and IntelliJ
- Functional programming experience (Scala)
- Gradle
- Avro
- Docker/Kubernetes
Apply to this job