As we did in our previous bootcamp dedicated to React.js (if you missed it, see our dedicated Blog post!), with this event too we wanted to give young girls and women a concrete opportunity to get closer to the world of programming, and to bridge the skills mismatch that is - unfortunately - still high in the sector.
This time, the event was open to anyone: no technical prerequisites were required to join the bootcamp, just a true passion for technology and the desire to learn something new.
The questions from the participants were numerous, and all of them were answered by Bitrock and SheTech’s Mentors through concrete examples, use cases and in-depth explanations. After almost five hours of programming (including many coffee breaks, a super-tasty lunch and several games of foosball), the bootcamp ended with an interesting Q&A and feedback session, characterized by an open discussion of all the topics covered. The general feedback from the participants was enthusiastic, and many proposals arose for future events to continue exploring the world of Front-End and User Experience Engineering.
Keep reading our Blog and follow us on our social media channels to discover what other initiatives we have in store in partnership with SheTech!
Last month we had the chance to attend the amazing Kafka Summit 2022 event organized by Confluent, one of Bitrock’s key Technology Partners.
Over 1500 people attended the event, which took place at the O2 in east London over two days of workshops, presentations, and networking.
There was plenty of news regarding Kafka, the Confluent Platform, Confluent Cloud, and the ecosystem as a whole. It was an incredible opportunity to meet so many enthusiasts of this technology and discuss what is currently happening and what is on the radar for the near future.
Modern Data Flow: Data Pipelines Done Right
The opening keynote of the event was hosted by Jay Kreps (CEO @ Confluent). The main topic (no pun intended :D) of the talk revolved around modern data flow and the growing need to process and move data in near real time.
From healthcare to grocery delivery, many of the applications and services we use every day are based on streaming data: in this scenario, Kafka stands as one of the main and most compelling technologies. The growing interest in Kafka is confirmed by the numerous organizations that are currently using it (more than 100,000 companies) and by the amount of interest and support that the project is receiving. The community is growing year after year: Kafka meetups are very popular, and interest keeps rising, as proved by the countless questions asked daily on StackOverflow and the large number of Jira tickets opened on the Apache Kafka project.
Of course, this success is far from accidental: if it is true that Kafka is a perfect fit for the requirements of modern architectures, it is also important to remember how many improvements were introduced in the Kafka ecosystem to help establish it as a very mature, reliable tool when it comes to building fast, scalable, and correct streaming applications and pipelines.
This can be seen, for instance, in the new features introduced in Confluent Cloud (the Confluent solution for managed Kafka) to enhance the documentation and monitoring of the streaming pipelines running in the environment: the new Stream Catalog and Lineage system. These two features provide an easy way to identify and search the different resources and data available in the environment, and to see how this data flows inside the system, improving the governance and monitoring of the platform.
The near future of Kafka - Upcoming features
Among the numerous upcoming features in the ecosystem presented during the event, there are some that we really appreciated and had been awaiting for quite some time.
One of these is KIP-516, which introduces topic IDs to uniquely identify topics. As you may know, since the very beginning - and this still holds today - the identifier for a topic has been its name. This has some drawbacks: for instance, a topic cannot be renamed (say, when you want to update your naming strategy), since doing so requires deleting and recreating the topic, migrating its whole content, and updating all the producers and consumers that refer to it. An equally annoying issue arises when you want to delete a topic and recreate one with the same name but different configurations, dropping the old content. Here too we can currently face issues, since Kafka will not delete the topic immediately, but will schedule a deletion that needs to be propagated through the cluster, with no certainty about when the operation will actually be completed. This makes the operation, as of today, impossible to automate (our consultants have often faced this limitation in some of our client projects).
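To make the problem concrete, here is a small, purely illustrative Python sketch - a toy model, not Kafka's actual implementation - of why an immutable topic ID decouples a topic's identity from its name: after a delete-and-recreate with the same name, a client holding the old ID can detect that it points at a stale topic.

```python
import uuid

class TopicRegistry:
    """Toy registry that identifies topics by an immutable ID, not by name."""

    def __init__(self):
        self._by_name = {}  # topic name -> topic ID

    def create(self, name):
        # Every (re)creation mints a brand-new, globally unique ID.
        topic_id = uuid.uuid4()
        self._by_name[name] = topic_id
        return topic_id

    def delete(self, name):
        del self._by_name[name]

    def id_of(self, name):
        return self._by_name[name]

registry = TopicRegistry()
old_id = registry.create("orders")

# Delete the topic and recreate one with the very same name...
registry.delete("orders")
new_id = registry.create("orders")

# ...and a client still holding the old ID can tell it refers
# to a topic that no longer exists, even though the name matches.
assert new_id != old_id
```

With name-only identity (today's behavior), the two incarnations of "orders" would be indistinguishable; with IDs, stale references become detectable and operations like automated delete-and-recreate become safer.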
The second long-awaited feature is the possibility to run Kafka without Zookeeper. At first, it was very useful and practical to take advantage of the distributed configuration management capabilities provided by Zookeeper (especially in processes like controller election or partition leader election). Over the past years, however, Kafka has incorporated more and more of these functionalities, and maintaining a Zookeeper cluster in addition to the Kafka one now feels like an unnecessary effort, risk and cost. As of today, this feature is not yet production-ready, but it is pretty close: Confluent has shared the plan, and we are all waiting for this architectural simplification to arrive.
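For reference, a Zookeeper-less (KRaft mode) broker is configured roughly as follows; this is a sketch, and exact property names and values may differ across Kafka versions and deployments:

```properties
# Minimal KRaft-mode configuration sketch: the node acts as both
# broker and controller, with no Zookeeper connection at all.
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://localhost:9092,CONTROLLER://localhost:9093
```

The key difference from a classic deployment is the absence of any `zookeeper.connect` setting: controller election happens inside the Kafka quorum itself.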
The third upcoming feature that we found extremely valuable is the introduction of modular topologies for ksqlDB. ksqlDB is relatively recent in the Kafka ecosystem, but it is gaining momentum thanks to its ability to express stream transformations with minimal effort in an SQL-like syntax, without the need to create dedicated Kafka Streams applications that require a good amount of boilerplate which then has to be maintained.
ksqlDB will not cover every use case that a hand-written Kafka Streams application can, but for a good number of them it will be an excellent solution. The introduction of modular topologies will simplify the management of the streams inside ksqlDB, and it will improve its scalability (which is currently limited in some scenarios).
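As a taste of what this SQL-like approach looks like, here is a hypothetical ksqlDB transformation; the topic, stream, and field names are invented for illustration and not taken from a real project:

```sql
-- Declare a stream over an existing Kafka topic (names are illustrative).
CREATE STREAM orders (order_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- Derive a filtered stream, continuously materialized to a new topic.
CREATE STREAM large_orders AS
  SELECT order_id, amount
  FROM orders
  WHERE amount > 1000;
```

Two statements replace what would otherwise be a full Kafka Streams application with its own build, packaging, and deployment lifecycle.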
Our Insights from Breakout Sessions & Lightning Talks
The inner beauty of tech conferences lies in the talks, and Kafka Summit was no different!
Indeed, it was not only the feature announcements that caught our attention during the event, but also what was presented during the various breakout sessions and talks: an amazing variety of topics gave us plenty of options to dig deeper into the Kafka world.
One of the sessions that we particularly enjoyed was, for sure, the one led by New Relic (“Monitoring Kafka Without Instrumentation Using eBPF”). The talk focused on an interesting way of monitoring Kafka and Kafka-based applications using eBPF, without the need for instrumentation. The speaker, Antón Rodríguez, ran a cool demo of Pixie, in which it was very easy to see what is going on in our applications. It was also easy to get a graphical representation of the actual topology of the streams, with all the links from producers to topics and from topics to consumers, making it easy to answer questions like “Who is producing to topic A?” or “Who is consuming from topic B?”.
Another session that we particularly enjoyed was the talk by LinkedIn (“Geo-replicated Kafka Streams Apps”): Ryanne Dolan outlined some strategies to deal with geo-replicated Kafka topics - in particular in the case of Kafka Streams applications. Ryanne gave some precious tips on how to manage the replication of Kafka topics in a disaster recovery cluster to guarantee high availability in case of failure, and on how to develop our Kafka Streams applications to work almost transparently in both the original cluster and the DR one. The talk was also a great opportunity to highlight the high scalability of Kafka in a multi-datacenter scenario, where different clusters can coexist, creating a kind of layered architecture composed of a scalable ingestion layer that can fan out the data to different geo-replicated clusters in a way that is transparent to the Kafka Streams applications.
Undoubtedly, the event has been a huge success, bringing the Apache Kafka community together to share best practices, learn how to build next-generation systems, and discuss the future of streaming technologies.
For us, this experience has been a blend of innovation, knowledge, and networking: all the things we missed from in-person conferences were finally back. It was impressive seeing people interact with each other after two years of social distancing, and we could really feel that “sense of community” that online events can only partially deliver.
If you want to know more about the event and its main topics - from real-time analytics to machine learning and event streaming - be sure to also check the dedicated Blog post by our sister company Radicalbit. You can read it here.
Last week we organized the first coding Bootcamp in collaboration with SheTech, after the strategic partnership signed in May 2021 with the common goal of bridging the gender gap in STEM and supporting women in the world of technology, entrepreneurship and digital.
More specifically, with this event we wanted to offer girls and women from the digital and technology world a real opportunity to get closer to the world of programming, and to bridge the skills mismatch. As the recent Women’s Forum barometer on gender equity showed, women still play a secondary role in STEM, especially in the tech industry: in G7 Countries, the female presence in Data & AI is around 31%, and in Engineering only 19%.
After an initial briefing on the front-end scenario and Bitrock’s value proposition for Front-end Engineering, the bootcamp entered the battle zone. The participants split up into four different groups, each supervised by a Mentor, to start working on concrete exercises based on React.js.
What did we learn?
It’s the leading Front-End tool. React.js is more than just one of the most fascinating development tools to learn: it has climbed above Vue.js and Angular as the most in-demand front-end development tool.
It has a thriving community of users. Many developers have a true love for the tool, which has turned into close-knit communities of loyal users. This explains why it is possible to find plenty of React.js resources everywhere, from YouTube to GitHub.
React.js allows for immutability. Every component built with React.js has two ways of working with data: immutable props and mutable state. This means you can build both stateless and stateful components; the choice lies in what you are aiming to achieve with them.
The event turned out to be a great opportunity to network and meet new people with a strong passion for technology. The questions were numerous, all answered by Bitrock and SheTech Mentors through concrete examples, use cases and in-depth explanations. After almost five hours of programming, the event concluded with an interesting Q&A and follow-up session, characterized by an open discussion of all the topics covered. The general feedback from the participants was enthusiastic - many proposals arose for future events, continuing to explore the world of Front-end and User Experience Engineering with workshops and labs.
Keep reading our Blog and follow us on our social media channels to discover all future events in partnership with SheTech.
If you want to access the bootcamp presentation deck and workshop material, send an email to email@example.com. To find out more about Bitrock's mission to bridge the gender gap in STEM, tech and digital, and promote a workplace culture based on inclusion and gender equality, please visit https://bitrock.it/blog/equality-in-stem.html
Last week we hosted an exclusive event in Milan dedicated to the exploration of modern tools and technologies for the next-generation enterprise. The first event of its kind, it was held in collaboration with HashiCorp, US market leader in multi-cloud infrastructure automation, after the Partnership we signed in May 2020.
HashiCorp's open-source tools Terraform, Vault, Nomad and Consul enable organizations to accelerate their digital evolution, as well as adopt a common cloud operating model for infrastructure, security, networking, and application automation. As companies scale and increase in complexity, enterprise versions of these products enhance the open-source tools with features that promote collaboration, operations, governance, and multi-data center functionality. Organizations must also rely on a trusted Partner that is able to guide them through the architectural design phase and to grant enterprise-grade assistance when it comes to application development, delivery and maintenance. And that’s exactly where Bitrock comes into play.
During the Conference Session, the Speakers had the chance to describe to the audience how large companies can rely on more agile, flexible and secure infrastructure thanks to HashiCorp’s suite and Bitrock’s expertise and consulting services. Especially when it comes to the provisioning, protection and management of services and applications across private, hybrid and public cloud architectures.
“We are ready to offer Italian and European companies the best tools to evolve their infrastructure and digital services. By working closely with HashiCorp, we jointly enable organizations to benefit from a cloud operating model.” – said Leo Pillon, Bitrock CEO.
After the Keynotes, the event continued with a pleasant Dinner & Networking night at the fancy restaurant by Cascina Cuccagna in Milan. Take a look at the pictures below to see how the event went on, and keep following us on our blog and social media channels to discover what other incredible events we have in store!
On June 16th, 2021 we held our virtual HashiCorp Vault Hands-On Workshop, an important event in collaboration with our partner HashiCorp, during which attendees had the opportunity to get a thorough presentation of the HashiCorp stack before starting a hands-on labs session to learn how to secure sensitive data with Vault.
Do you already know all the secrets of HashiCorp Vault?
HashiCorp Vault is an API-driven, cloud agnostic Secrets Management System, which allows you to safely store and manage sensitive data in hybrid cloud environments. You can also use Vault to generate dynamic short-lived credentials, or encrypt application data on the fly.
Vault was designed to address the security needs of modern applications. It differs from the traditional approach by using:
Identity based rules allowing security to stretch across network perimeters
Dynamic, short lived credentials that are rotated frequently
Individual accounts to maintain provenance (tying actions back to entities)
Credentials and Entities that can easily be invalidated
Vault can be used in untrusted networks. It can authenticate users and applications against many systems, and it runs in highly available clusters that can be replicated across regions.
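The "dynamic, short-lived credentials" idea above can be illustrated with a tiny Python toy model - this is a conceptual sketch, not the Vault API: every credential is issued with a TTL (a lease), and it simply stops being valid once the lease expires, so frequent rotation is the default rather than an afterthought.

```python
import time
import secrets
from dataclasses import dataclass

@dataclass
class Lease:
    """A credential bound to an expiry time, mimicking a Vault lease."""
    secret: str
    expires_at: float

    def is_valid(self, now=None):
        # A lease is only usable before its expiry timestamp.
        current = now if now is not None else time.time()
        return current < self.expires_at

def issue_credential(ttl_seconds):
    """Issue a fresh, random credential valid only for ttl_seconds."""
    return Lease(secret=secrets.token_hex(16),
                 expires_at=time.time() + ttl_seconds)

lease = issue_credential(ttl_seconds=60)
assert lease.is_valid()                          # fresh lease is usable
assert not lease.is_valid(now=lease.expires_at)  # expired lease is rejected
```

Because each caller gets its own freshly generated secret with its own lease, invalidating a single compromised credential does not affect anyone else - which is exactly the provenance and revocation property described above.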
Thanks to our experts Gianluca Mascolo (Senior DevOps Engineer at Bitrock) and Luca Bolli (Senior Solution Engineer at HashiCorp) for the overview of the HashiCorp toolset and the unmissable labs session.
If you'd like to learn more about our enterprise offerings or if you want to receive the presentation slides, please reach out to us at firstname.lastname@example.org
To access the workshop recording, simply click here
We look forward to seeing you at a future Bitrock event!
Leveraging A.I., IoT and Stream Processing to enable the Smart Manufacturing paradigm
On June 3rd we hosted our webinar “An advanced Data Architecture for Manufacturing 4.0” in collaboration with our Partner Radicalbit.
During the event, attendees had the chance to learn more about the potential of combining A.I., IIoT and Stream Processing: a perfect blend of cutting-edge technologies that can enable the Smart Manufacturing paradigm, allowing companies to forecast production behavior at run-time, instantly predict economic and timing impacts, analyze the good functioning of equipment and, last but not least, track predictive maintenance status in (near) real-time.
More specifically, attendees discovered how it is possible to transform data into information as quickly as possible and interpret it correctly thanks to RNA, a DataOps & MLOps enterprise-grade end-to-end platform designed to combine streaming event analysis and A.I., simplifying and accelerating developments in Advanced Analytics projects and Machine Learning enabled Decision Support Systems.
With the adoption of these cutting-edge technologies, companies in the Manufacturing industry can become more efficient, less resource-consuming, service-oriented (instead of merely product-oriented), quick to adapt to business needs, and able to act autonomously, reducing the need for human interaction.
Thanks to Roberto Mariotti (Technical Presales Manager at Radicalbit), Davide Fiacconi (Data Scientist at Radicalbit) and Cosma Rizzi (International Business Development at Bitrock) for sharing with us all your expertise in the field, and giving useful insights on the main challenges and trend topics of Manufacturing 4.0.
If you didn’t have the chance to attend the webinar live, or if you want to go back through the slides that were shown, you can access the presentation at the following link: https://bit.ly/3g4TR2v
If you want to access the webinar recording, or simply know more about RNA and Bitrock technology offering, send an email to email@example.com
Byteconf React is a 100% free conference with the best React speakers and teachers in the world.
The first edition was launched by Bytesized Code (whose mission is to innovate developer education online) in March of 2018. The conference, which represented a great resource for React developers of all experience levels, was streamed online on Twitch to over 900 people across the world.
Since then, the event has proved to be increasingly successful and highly appreciated by the audience, thanks to its ability to make the global dev community a small village and provide an awesome experience to everyone, in every corner of the world.
Bitrock was present during this year’s edition thanks to Claudia Bressi, a brilliant Frontend Developer on our Team, who was one of the official Speakers, presenting her React Bandersnatch experiment project.
A great spotlight for Claudia and a source of pride for our whole frontend Team, whose expertise could be shared on one of the major stages for the global dev community.
The conference was streamed on YouTube, for free, so anyone and everyone could attend.
Codemotion is a platform devoted to developers that connects IT professionals, tech communities, and IT companies. As a hub of innovation, it shares the latest tech information and best practices among the tech community worldwide. Its activities include:
- the coolest tech conferences in EMEA: 7 countries, 8 conferences, 570,000 developers!
- tech hackathons
- training for IT professionals
- a school of technology for kids... and more!
As Silver Sponsor of Codemotion 2018 (November 29-30 - Milan - Italy) we invite all developers and technology lovers to come and discover our company and meet our Front-end unit to explore relevant Job Opportunities.
Lambda World is the largest Functional Programming event in Spain, carefully crafted for you by 47 Degrees and the Spanish Scala and Java communities. The event takes place in Cádiz, one of the most beautiful cities in the country, and includes workshops, hands-on experience, hacking, and debugging.
Our colleagues Andrea Bessi and Alberto Adami will be our representatives in Spain.
Jenkins World brings together the DevOps community in two locations, providing opportunities to learn, explore, network and help shape the future of DevOps and Jenkins. DevOps World | Jenkins World is designed specifically for IT executives, DevOps practitioners, Jenkins users and partners.
This year, 2,500 attendees from all over the globe will get access to 100+ workshops, training opportunities and sessions covering software automation, DevOps culture, performance measurement, security and more.
Bitrock is present with Matteo Gazzetta, Simone Ripamonti, and Andrea Simonini, members of Bitrock's DevOps Team. But not only DevOps: our Backend Developer Simone Esposito is attending too.
Nice, France | Palace of Congresses and Exhibitions Nice Acropolis | October 22-25
Our Team at the Event