Overall Responsibility
To support the presales and solutioning team in building and demonstrating technical solutions for customers and prospects. The Solution Engineer is responsible for hands-on technical tasks such as setting up demo environments, configuring software components, deploying to servers or the cloud, and performing technical validation. The role serves as a bridge between technical solutions and business requirements, ensuring solutions are practical, demonstrable, and aligned with client expectations.
Key Responsibilities
- Provision, configure, and maintain technical environments for demos, proof-of-concepts (POCs), and pilot deployments.
- Set up software stacks using containers (e.g., Docker), virtual machines, and Linux-based systems.
- Support Solution Architects in creating and customizing demo scenarios based on real-world use cases.
- Automate environment setup and basic workflows using scripting (e.g., Bash, Python, SQL, PySpark).
- Troubleshoot and resolve technical issues during demo preparation or client engagements.
- Work closely with the Sales and Solution teams to understand customer needs and tailor the technical solution accordingly.
- Assist in preparing technical documentation, diagrams, system architecture overviews, and commercial costing.
- Provide technical input during proposal, tender, and RFI/RFP processes.
- Keep up to date with emerging technologies relevant to data platforms, integration, analytics, and cloud infrastructure.
- Occasionally support onsite or remote client sessions for technical demos and workshops.
Skills and Attributes
- Solid understanding of Linux administration, networking basics, and shell scripting.
- Hands-on experience with Docker, containerized environments, or other deployment automation tools.
- Programming experience, preferably with Python, shell scripting, and SQL in data and ETL ecosystems.
- Ability to read and understand technical documentation and software architecture diagrams.
- Exposure to data platforms, databases, or integration tools (e.g., PostgreSQL, Apache Superset, KNIME, Airflow) is a plus.
- Experience working with public cloud platforms (e.g., AWS, Azure, GCP) is an advantage.
- Strong troubleshooting, problem-solving, and analytical thinking.
- Good interpersonal and communication skills to work with both technical and non-technical stakeholders.
- Self-motivated and able to manage multiple priorities under limited supervision.
- Self-driven, with the initiative to learn and explore independently.
- Able to pick up new technologies and practices quickly.
Minimum Qualifications
- Degree or Diploma in Computer Science, Software Engineering, Information Technology, or a related field.
- At least 3 years of experience in a similar role or in ETL/ELT development, or a strong portfolio of personal projects or freelance work.
Preferred (Optional but Advantageous)
- Certifications such as Cloudera, CompTIA Linux+, Docker Certified Associate, AWS Cloud Practitioner, or equivalent.
- Hands-on experience with Hadoop, Cloudera, Trino, Neo4j, and ETL development.
- Experience with CI/CD pipelines or basic DevOps practices.
- Exposure to business intelligence, data engineering, or ETL concepts.
- Familiarity with Git and collaborative development workflows.