This is a remote position.
Our Customer is a US-based company in the cybersecurity space. They have received numerous awards and established a solid reputation in their market using cutting-edge AI technology. We are currently building a team to develop and maintain the Customer's web crawling infrastructure, using Python, Selenium, and other web-scraping technologies. Our Customer currently crawls hundreds of sites daily and needs to scale up their crawling capabilities.
- Bachelor's or Master's degree in IT, Computer Science, or an equivalent discipline
- At least 1 year of experience developing or maintaining Python-based applications
- Good knowledge of Linux (CentOS, Ubuntu, etc.)
- Hands-on experience with Selenium and an understanding of automation
- Experience with Amazon Web Services or Azure is a plus
- Good command of version control (Git)
- Fluent in English, written and spoken, with good communication skills
- Monitor crawler execution and raise tickets for failed or non-functional crawlers and parsers
- Resolve crawler and parser issues (defects, site changes, etc.) and verify fixes
- Inspect websites and devise a strategy and implementation for extracting data
- Develop, test, and execute site-specific crawler and parser code and selectors
- Collaborate with other developers and QA to resolve issues
- Maintain crawler and parser code using git source control
- Proactively suggest improvements to the code base and infrastructure (AWS) for scalability and reliability
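To illustrate the kind of site-specific parser work described above, here is a minimal sketch using only Python's standard library (no Selenium driver required). The HTML snippet, the `h2` tag, and the `title` class name are purely hypothetical assumptions for the example; real target sites would need their own selectors.

```python
# Hypothetical sketch of a site-specific parser using only the
# standard library. The markup and class names are illustrative.
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text inside <h2 class="title"> elements."""

    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        # Enter "collecting" state when a matching tag opens.
        if tag == "h2" and ("class", "title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles.append(data.strip())


# Sample page standing in for HTML fetched by a crawler.
sample_html = """
<html><body>
  <h2 class="title">First product</h2>
  <p>Details...</p>
  <h2 class="title">Second product</h2>
</body></html>
"""

parser = TitleParser()
parser.feed(sample_html)
print(parser.titles)  # ['First product', 'Second product']
```

In day-to-day work the raw HTML would come from a Selenium-driven browser session rather than a string literal, and the parser's selectors would change whenever the target site does, which is why the monitoring and fix-verification duties above exist.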
- Full-time employment with a competitive salary and benefits
- Medical, dental, and vision insurance coverage
Please do not hesitate to apply. Applicants may email their details and resume to [email protected] or apply online via our website.