For over 35 years, we’ve been making history. Now, we’re creating the future.
Techs on the beach
On April 14, 1981, in Honolulu, Hawaii, William “Bill” Melton incorporated his new tech company. The original intention was to provide a recourse for retailers who’d been swindled by people handing out bad checks. Bill's product was a kind of verification device using phone lines—a verification phone.
But then a pivot. The credit card industry was relatively new at the time and processing was still being done with carbon paper. Bill saw a way to reduce costs for companies like Visa and MasterCard, while capitalizing on this new technology. Instead of verifying checks, we would provide local businesses a means to electronically process credit card payments. The Verifone payment device was born.
Now, Verifone is one of the world’s largest POS terminal vendors and a leading provider of payment and commerce solutions. We operate in more than 150 countries and employ nearly 6,000 people globally. Our steady growth has come organically, through a dedication to innovation and strategic partnerships, as well as from savvy acquisitions.
The main purpose of the Big Data Architect role is to undertake the definition, design, and development of an enterprise-wide data strategy and its delivery. The architect works closely with the Operations, Engineering, and Product Management teams to oversee solution design, prototyping, delivery quality assurance, and operational handover for data models, transformations, and reports; lays out and embeds the foundations for strategic data/business intelligence and management reporting; and embraces agile delivery methodologies, organizing the data delivery methodology accordingly.
Your Essential Responsibilities:
- Architectural oversight for multiple components including systems providing engineering services to the enterprise.
- Examples of these services include the big data platform and the DevOps frameworks and tools used to manage it.
- Leads the analysis, definition, and design, and provides oversight for construction, testing, installation, and modification across multiple large, interdependent systems.
- Applies data governance processes.
- Partners with all stakeholders on large projects to provide architectural oversight throughout the life cycle of a project.
- Regularly shares knowledge with colleagues on technical topics.
- Participates in design reviews and signs off on any new data solution proposed by engineering.
- Able to quickly experiment with new technologies when required in order to select the right ones.
- Hands-on technical expertise in key technologies such as Kafka and MongoDB and, ideally, ELK and AppDynamics.
- Investigates and evaluates new technologies.
Your Knowledge and Experience:
- Kafka, MongoDB, ELK, and Greenplum/Snowflake/Vertica.
- Proven Kafka skills (PaaS for shared services).
- Proven track record implementing complete end-to-end data programs.
- Deep understanding of visualization and SQL database technologies and cloud-based analytics products.
- Understanding of the SDLC.
- Strong customer and stakeholder experience.
- Experience delivering within an agile Scrum team.
- Successful delivery track record in a challenging, complex, and dynamic environment.
- Ability to act and react positively in a dynamic, fast-moving commercial and competitive environment.