What will you do?
You will participate in projects focused on Big Data systems. The technology stack you will work with includes Hive, Impala, Spark, NiFi, HBase, HDFS, Kafka, Kudu, Ranger, and others.
What we expect:
- A technical university degree, preferably in IT, Telecommunications, or Mathematics
- Knowledge of SQL and Python
- Experience with Big Data systems would be an asset
- Knowledge of Linux and Git
- Availability and openness to learning
What we offer:
- Local and global cross-sector projects with results visible on the market
- Clearly defined career path
- Training and development programme (certification)
Experience level: Mid
- Graduates or final-year students of technical programmes such as computer science, telecommunications, electronics and information systems, mathematics, or quantitative methods
- Knowledge of SQL.
- Experience with Big Data technologies, especially tools such as Spark, NiFi, and Kafka in combination with the Hadoop stack (HDP 2.6, HDP 3.x): Hive, HDFS, Spark.
- Basic knowledge of Python and/or Scala.
- Ability to work in a Linux environment.
- Availability and willingness to learn and develop.
Necessary on this position:
- Spark in Scala
- Hadoop (Yarn, HDFS)
- Hortonworks Data Platform stack
Project you can join
- On your first assignment, you will join a team that provides the business with powerful analytics tools supporting sales activities. These tools are built on top of the Hortonworks Data Platform and can be deployed in Docker containers.
- You will be responsible for implementing data flows that ingest data from different sources (e.g., Kafka topics, SAP, S3) and store it in Hadoop for final processing by Spark jobs.
- As a Big Data Engineer, you will work closely with the Data Science team on the implementation of analytical models and machine learning algorithms.
- During the project you will work closely with the local team and with experts from across Europe.
- You will use Slack or Skype to communicate.
- You will work mostly in an AWS environment.
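The data-flow work described above follows a common ingest–transform–aggregate pattern. As an illustration only (plain Python standard library, no Kafka or Spark dependencies; the record fields and function name are made up, not project code), a toy version of such a processing stage might look like:

```python
from collections import defaultdict

# Toy stand-in for records ingested from a source such as a Kafka topic.
# The fields ("region", "amount") are illustrative, not from the project.
raw_events = [
    {"region": "EU", "amount": 120.0},
    {"region": "US", "amount": 80.5},
    {"region": "EU", "amount": 40.0},
    {"region": "US", "amount": -10.0},  # e.g., a refund
]

def aggregate_by_region(events):
    """Sum amounts per region, mirroring the kind of group-and-sum
    a Spark job would perform at the end of the flow."""
    totals = defaultdict(float)
    for event in events:
        totals[event["region"]] += event["amount"]
    return dict(totals)

print(aggregate_by_region(raw_events))
# {'EU': 160.0, 'US': 70.5}
```

In the actual project, the same shape of computation would be expressed as Spark transformations in Scala over data already landed in Hadoop.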
Work time division
- Bug fixing 5%
- New features 60%
- Documentation 15%
- Self-development 10%
- Meetings 10%
How we code?
- Version control: Git, Bitbucket
- Code review: experienced Team member
- IDE: Eclipse or other tools of your choice
How we test?
- Unit testing
- Manual testing
- Test automation
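To give a flavour of the unit-testing side, here is a minimal sketch using Python's built-in unittest module; the function under test (`normalize_region`) is a hypothetical helper, not project code:

```python
import unittest

def normalize_region(code: str) -> str:
    """Hypothetical helper: trim whitespace and upper-case a region code."""
    return code.strip().upper()

class NormalizeRegionTest(unittest.TestCase):
    def test_trims_and_uppercases(self):
        self.assertEqual(normalize_region("  eu "), "EU")

    def test_clean_input_unchanged(self):
        self.assertEqual(normalize_region("US"), "US")

if __name__ == "__main__":
    unittest.main(exit=False)
```

Tests like these run in the automated pipeline alongside manual checks.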
How we manage our projects?
- Methodology: Scrum
- Who makes architectural decisions? Team and product owner
- Who makes technology stack decisions? Team
- Project management software: JIRA
- Team line-up: Developers: 15
- Additional monitor
- Operating system: Windows, macOS, or Linux
- Tech supervisor
- Open space
- Separate rooms: meeting rooms
- Office hours: 8:30 – 17:30