
Commit

more datalake exp
gr211 committed Sep 4, 2024
1 parent 2191739 commit cb0aab4
Showing 2 changed files with 35 additions and 35 deletions.
52 changes: 26 additions & 26 deletions examples/resume/experience.tex
@@ -12,86 +12,86 @@
\begin{cventries}
\cventry
{Scala$\sqbig$java$\sqbig$go$\sqbig$api$\sqbig$openid$\sqbig$opentelemetry$\sqbig$sso$\sqbig$databricks}
{\logo{Disney streaming}{../../../images/disney.png} - Senior Scala developer}
{\logo{Disney streaming}{../../../images/disney.png} - Scala engineer}
{Luxembourg}
{October 2022 - present}
{Disney+ provides online content worldwide. I worked in the security and authentication team, providing a single sign-on openid seamless experience to users on this large estate, improving the performance and scaling, from handling 300M to 800M accounts.}
{Disney+ provides online content worldwide. I worked in the security and authentication team, providing a seamless single sign-on experience to users, and improving performance and scaling from 350M to 800M user accounts.}
{
\begin{cvitems}
\item{Architected apis handling over 800 million users, scaling up to 250k requests per second.}
\item{Engineered a single sign-on openid provider along with openapi and smithy, and kinesis and dynamodb streams.}
\item{Developed a pairwise pseudonymous identifiers (ppid) system with http4s and dynamodb.}
\item{Implemented security and observability metrics for deployment pipelines and applications.}
\item{Developed a single sign-on openid provider along with openapi and smithy.}
%\item{Implemented and improved safety and observability of deployment pipelines and applications.}
\end{cvitems}
}

%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$etl$\sqbig$kafka-streams$\sqbig$ksql$\sqbig$spark}
{\logo{Depop}{../../../images/depop.jpg} - Senior Scala developer}
{\logo{Depop}{../../../images/depop.jpg} - Scala\,|\,Data engineer}
{Luxembourg}
{March 2022 - September 2022}
{Depop is the UK and US leading second-hand shop for upcoming clothes designers. I was re-engaged to consult and implement streaming data pipelines and APIs feeding back into mobile apps in real-time.}
{Depop is the leading UK and US second-hand shop for up-and-coming clothing designers. I was re-engaged to design and implement streaming data strategies and APIs feeding back into mobile apps in real time.}
{
\begin{cvitems}
\item{Engineered enriched unified datasets over ksql and spark streaming pipelines.}
\item{Engineered enriched unified datasets over ksql, kafka-streams and spark streaming pipelines.}
\item{Designed materialised streamed datasets with sub-15-minute views (delta tables) for olap queries utilising dbt, spark, and athena.}
\item{Implemented security and observability metrics for deployment pipelines and applications.}
\item{Implemented and improved safety and observability for deployment pipelines and applications.}
\end{cvitems}
}

%---------------------------------------------------------

\cventry
{Scala$\sqbig$java$\sqbig$api$\sqbig$etl$\sqbig$grpc$\sqbig$akka-streams$\sqbig$websockets$\sqbig$iot}
{\logo{Zego}{../../../images/zego.jpg} - Senior Scala developer}
{\logo{Zego}{../../../images/zego.jpg} - Scala\,|\,Data engineer}
{Luxembourg}
{Nov 2021 - Aug 2022}
{Zego is a drivers insurance company disrupting the market offering covers on a day-to-day basis. I designed and deployed flink and akka-streams based ingestion processors, applying ml-driven models to measure the risks (of claiming) levels and appetite for cover.}
{Zego is a driver insurance company disrupting the market by offering cover on a day-to-day basis. I designed and deployed flink and akka-streams based ingestion processors, applying ml-driven models to measure claim-risk levels and appetite for cover.}
{
\begin{cvitems}
\item{Integrated third-party vehicle (iot) and insurance data over rest apis, grpc and websockets, leveraging kinesis and akka-streams for data flows.}
\item{Designed and implemented spring\,boot and akka-http micro-services and data processors.}
\item{Orchestrated ci/cd pipelines with kubernetes, argocd, and buildkite for the deployment and scaling of microservices and stream processors.}
\item{Implemented security and observability metrics for deployment pipelines and applications.}
\item{Integrated third-party vehicle (iot) and insurance data over rest apis, grpc and websockets, leveraging kinesis and akka-streams.}
%\item{Designed and implemented spring\,boot and akka-http micro-services and data processors.}
\item{Implemented ci/cd pipeline orchestration with kubernetes, argocd, and buildkite.}
\item{Integrated iot data with tableau and snowflake into the datalake.}
\end{cvitems}
}

%---------------------------------------------------------

\cventry
{Scala$\sqbig$java$\sqbig$etl$\sqbig$ksql$\sqbig$kafka-streams$\sqbig$confluent}
{\logo{Slice}{../../../images/slice.png} - Senior Scala developer}
{\logo{Slice}{../../../images/slice.png} - Scala\,|\,Data engineer}
{Luxembourg}
{Apr 2021 - Oct 2021}
{Slice provides the tech and services to grow the online presence of independent pizza vendors. I architected and implemented streaming pipelines to offer real-time feedback on sales and revenues to those vendors.}
{
\begin{cvitems}
\item{Spearheaded migration to a high-throughput, stream-oriented data processing architecture.}
%\item{Provided guidance and led the development of stream processors on aws and confluent platforms.}
\item{Designed ci/cd pipelines for streaming data (kafka-streams, spark streaming, ksqldb) from databricks to confluent.}
\item{Integrated schema registries and schema evolutions into the datalake.}
\end{cvitems}
}

%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$api$\sqbig$etl$\sqbig$emr$\sqbig$spring boot$\sqbig$kafka-streams}
{\logo{Compare The Market}{../../../images/ctm.png} - Senior Scala developer}
{\logo{Compare The Market}{../../../images/ctm.png} - Scala\,|\,Data engineer}
{London}
{Jan 2020 - Mar 2021}
{Compare The Market is a comparison site that helps aggregate quotes from various insurance providers. I came in to tune kafka clusters and stream processors. I also developed systems to ingest and provide up-to-date prices from live third-party feeds.}
{Compare The Market is a comparison site that helps aggregate quotes from various insurance providers. I came in to tune kafka clusters and stream processors. I also developed systems to ingest and provide up-to-date prices from third-party feeds.}
{
\begin{cvitems}
\item{Designed etl ci/cd pipelines for streaming data flows over spring\,boot, kafka-streams, akka-streams, and spark streaming.}
\item{Designed the architecture and strategy for a streaming infrastructure, presenting it to design forums (adf) and stakeholders.}
%\item{Defined and implemented timelines and workflows for the deployment of proposed technical solutions.}
\item{Designed data governance and retention strategies in the datalake.}
\end{cvitems}
}

%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$etl$\sqbig$kafka-streams$\sqbig$ksql$\sqbig$redshift$\sqbig$rabbitmq}
{\logo{Depop}{../../../images/depop.jpg} - Senior Scala developer}
{\logo{Depop}{../../../images/depop.jpg} - Scala\,|\,Data engineer}
{London}
{June 2019 - Dec 2019}
{I joined the data engineering team to provide guidance and implement streaming data pipelines to maintain a redshift cluster in sync with real-time data.}
@@ -106,22 +106,22 @@
%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$api$\sqbig$etl$\sqbig$emr$\sqbig$kafka-streams}
{\logo{The Guardian}{../../../images/guardian.png} - Senior Scala developer}
{\logo{The Guardian}{../../../images/guardian.png} - Scala\,|\,Data engineer}
{London}
{Oct 2018 - May 2019}
{The Guardian is a leading UK newspaper. I designed systems handling hundreds of millions of events, ingesting data from kinesis streams into elasticsearch via logstash, with kibana for monitoring and observability. I designed and deployed spark streaming processors to extract real-time insights.}
{
\begin{cvitems}
\item{Implemented spark etl pipelines on aws emr, processing readers' revenue and subscription datasets.}
\item{Designed and built alternative streaming pipelines with msk, eks, and kafka-streams.}
%\item{Implemented in-memory data stream uploads using http4s and fs2 to converse with rest apis.}
\item{Built a datalake and catalog around emr and athena.}
\end{cvitems}
}

%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$etl$\sqbig$kafka-streams$\sqbig$cloudera$\sqbig$kerberos$\sqbig$hive}
{\logo{Sainsbury's}{../../../images/sainsburys.jpg} - Senior Scala developer}
{\logo{Sainsbury's}{../../../images/sainsburys.jpg} - Scala\,|\,Data engineer}
{London}
{Dec 2017 - Sep 2018}
{Sainsbury's is one of the largest UK supermarket chains. I designed the architecture and built the platform and pipelines to enhance the loyalty program and offer customers rewards and refunds in real time.}
@@ -137,7 +137,7 @@
%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$akka$\sqbig$api$\sqbig$etl$\sqbig$spring boot$\sqbig$akka-streams}
{\logo{Home Office (HMPO \& IPT)}{../../../images/ho.png} - Senior Scala developer}
{\logo{Home Office (HMPO \& IPT)}{../../../images/ho.png} - Scala engineer}
{London}
{June 2016 - Nov 2017}
{HMPO and IPT are government bodies looking after border security and immigration policies. I worked on sensitive subjects such as asylum claims and passport issuance. I implemented systems handling data where security, audit and accountability were paramount to the quality of the processes I designed.}
@@ -152,7 +152,7 @@
%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$akka$\sqbig$fullstack$\sqbig$play$\sqbig$etl}
{\logo{HMRC (Valuation Office Agency)}{../../../images/voa.png} - Senior Scala developer}
{\logo{HMRC (Valuation Office Agency)}{../../../images/voa.png} - Scala engineer}
{London}
{Nov 2015 - May 2016}
{The VOA is a government body that sets and enforces tax rates for private dwellings and business properties across the UK. I redesigned the ui and data input processing, and interfaced data flows with legacy back-ends.}
@@ -166,7 +166,7 @@
%---------------------------------------------------------
\cventry
{Scala$\sqbig$java$\sqbig$akka$\sqbig$play$\sqbig$rabbitmq}
{\logo{Home Office (Border Force)}{../../../images/ho.png} - Senior Java$\sqbig$Scala developer}
{\logo{Home Office (Border Force)}{../../../images/ho.png} - Java$\sqbig$Scala engineer}
{London}
{March 2014 - Oct 2015}
{Border Force is a government agency looking after the UK borders. I developed a real-time flight arrival prediction engine to anticipate queues and staffing needed at airport gates.}
18 changes: 9 additions & 9 deletions examples/resume/skills.tex
@@ -10,33 +10,33 @@

%---------------------------------------------------------
\cvskill
{Programming} % Category
{scala {} java {} groovy {} rust {} go {} python {} cats {} fs2 {} http4s {} play {} spring}
{Programming}
{scala {} java {} groovy {} rust {} go {} python {} cats {} fs2 {} http4s {} play {} spring {} html {} css {} javascript}

%---------------------------------------------------------
\cvskill
{Backend} % Category
{kafka {} kinesis {} ksqldb {} mongodb {} dynamodb {} rdbms {} amqp {} jms}
{Backend}
{kafka {} kinesis {} ksqldb {} mongodb {} dynamodb {} oracle {} amqp {} jms}

%---------------------------------------------------------
\cvskill
{Observability} % Category
{Observability}
{contract testing {} opentelemetry {} datadog {} smithy {} openapi {} swagger}

%---------------------------------------------------------
\cvskill
{Devops} % Category
{Devops}
{aws {} docker {} kubernetes {} terraform {} cloudformation {} jenkins {} circleci {} argocd}

%---------------------------------------------------------
\cvskill
{Data engineering} % Category
{etl {} spark {} spark\,streaming {} kafka\,streams {} akka\,streams {} uml {} flink {} pii {} gdpr}
{Data engineering}
{big\,data {} etl {} spark {} spark\,streaming {} kafka\,streams {} akka\,streams {} lineage {} flink {} pii {} gdpr}

%%---------------------------------------------------------
%
% \cvskill
% {Networks} % Category
% {Networks}
% {http {} dns {} ntp {} qemu {} ssl {} routing {} firewall }

%---------------------------------------------------------
