WeAreDevelopers

WeAreDevelopers World Congress 2023, the world's flagship event for developers, took place on 27-28 July 2023 in Berlin, Germany. A group of enthusiastic Bitrockers attended the conference to catch up on the latest trends in software development and connect with like-minded professionals in the industry. They arrived a few days early to fully immerse themselves in the vibrant city of Berlin.

Kicking off

The day before the conference, the group went to the venue for check-in, to receive their welcome backpacks, and for some members to attend a workshop.

The workshop, led by Miško Hevery, the creator of AngularJS and of Qwik, was informative and engaging. The group was so impressed that they decided to rebuild their personal websites from scratch to incorporate some of the new and exciting features they had learned about.

During the workshop, the group had the opportunity to implement a project that gathered data from their GitHub accounts, using authentication and CRUD operations. They used Qwik, a new framework focused on speed and performance. Although they were initially put off when Qwik's positioning took aim at React, of which they are big fans, they were still impressed with Qwik and plan to use it in upcoming projects.
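For a flavor of what that looked like, here is a minimal sketch in the spirit of the workshop project (our reconstruction, not the actual workshop code; the user name and token are placeholders):

```tsx
import { component$, useResource$, Resource } from '@builder.io/qwik';

export const RepoList = component$(() => {
  // Lazily fetch the repository list; Qwik only resumes this work
  // on the client when it is actually needed.
  const repos = useResource$<string[]>(async () => {
    const res = await fetch('https://api.github.com/users/octocat/repos', {
      // The workshop used GitHub authentication; a real token goes here.
      headers: { Authorization: 'Bearer <GITHUB_TOKEN>' },
    });
    const data: { name: string }[] = await res.json();
    return data.map((r) => r.name);
  });

  return (
    <Resource
      value={repos}
      onPending={() => <p>Loading…</p>}
      onResolved={(names) => (
        <ul>
          {names.map((n) => (
            <li key={n}>{n}</li>
          ))}
        </ul>
      )}
    />
  );
});
```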

After kicking off the workshop with excitement, the group rested well, ready to face the challenges and novelties the conference would bring.

The first day

Upon arriving at the conference, the group was amazed by the number of people from all over the world waiting to be allowed in. The attendees included journalists, entrepreneurs, sponsors, developers, and speakers, all eagerly waiting for the event to begin.

The conference started with a remarkable opening speech by Tim Berners-Lee, a computer scientist best known as the inventor of the World Wide Web. He discussed the birth of the web and presented his vision for Web 3.0 with Solid, a specification that allows people to store their data securely in decentralized data stores called pods. The specification aims to give people control over all their data, with a single sign-on (SSO) authentication system and universal APIs that are easy to implement. Shortly after this speech, the group joined another talk that proposed a different solution to the same problem using typical Web5 tools.
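To make the idea concrete, here is a minimal sketch of reading data from a pod using Inrupt's client libraries, one common way to talk to Solid (the issuer URL is a placeholder, and redirect handling is omitted for brevity):

```ts
import { login, getDefaultSession } from '@inrupt/solid-client-authn-browser';
import { getSolidDataset, getThing, getStringNoLocale } from '@inrupt/solid-client';

async function readProfileName() {
  const session = getDefaultSession();
  if (!session.info.isLoggedIn) {
    // One login at the identity provider serves every Solid app: the SSO idea.
    await login({ oidcIssuer: 'https://login.inrupt.com', redirectUrl: window.location.href });
    return;
  }
  const webId = session.info.webId!; // the user's identity URL
  // The profile document lives in the user's pod and is read via a universal API.
  const dataset = await getSolidDataset(webId, { fetch: session.fetch });
  const profile = getThing(dataset, webId);
  const name = profile && getStringNoLocale(profile, 'http://xmlns.com/foaf/0.1/name');
  console.log('Name from pod:', name);
}
```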

After these two different views on the same topic, the group attended a talk by John Romero, co-creator of Wolfenstein 3D and Doom. He shared the story of the development of his games, including the difficulties and some funny anecdotes.

The group then attended a talk on CSS Houdini. It is a set of low-level APIs that extend CSS by giving developers direct access to the CSS Object Model. Although the group was initially impressed, they learned that the most substantial APIs, such as the worklets for layouts and animations, have not been integrated and probably never will be.
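The part of Houdini that did ship in Chromium-based browsers is the Paint API. As an illustration of what those low-level hooks look like (our example, not from the talk), a paint worklet can draw a background that CSS then references:

```ts
// checker.ts — compiled to /checker.js and run in the worklet scope,
// where registerPaint is a global.
declare function registerPaint(name: string, painterClass: unknown): void;

registerPaint(
  'checker',
  class {
    paint(ctx: any, size: { width: number; height: number }) {
      // Draw a simple checkerboard through the CSS Paint API.
      const cell = 16;
      for (let y = 0; y * cell < size.height; y++) {
        for (let x = 0; x * cell < size.width; x++) {
          if ((x + y) % 2 === 0) ctx.fillRect(x * cell, y * cell, cell, cell);
        }
      }
    }
  },
);

// main.ts — feature-detect, load the worklet, then use it from CSS:
//   .box { background-image: paint(checker); }
if ('paintWorklet' in CSS) {
  (CSS as any).paintWorklet.addModule('/checker.js');
}
```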

The conference also showcased a talk called "You click, you lose: a practical look at VSCode's security". This talk highlighted how hackers are targeting developers and the risks of compromising a developer's tools. It also explored Visual Studio Code's security and its vulnerabilities through real-world examples.

The day ended with a talk by Jakub Oleksy, Vice President of Software Engineering at GitHub, who discussed the opportunities in the open-source community and the tools available to improve daily work. The main message that emerged from the talk was to recognize the work of others and contact them to show appreciation.

No party, no fun

The event organizers arranged various activities to keep the attendees engaged and entertained. 

During the day, attendees participated in quizzes that tested their tech knowledge in a fun and interactive way. As the sun began to set, the atmosphere shifted to a more festive one, and a karaoke session brought out the hidden talents of many developers.

The excitement grew as the night progressed with the final round of the Code100 competition, where young developers showcased their programming skills.

The DJ set curated by Sam Feldt, a popular disc jockey from the Netherlands, was the perfect way to conclude a long day with the right party vibes.

The second day

After the first day and the night party, the group was excited to immerse themselves in the technology world once again.

The event featured many speeches and seminars, but it was impossible to attend all of them, so the group decided to spend some time walking around the stands and talking with people. They realized that meeting people and confronting new problems made them feel part of something bigger, able to make a significant contribution to the evolution of technology.

Later, another talk caught the group's attention: "Lies we Tell Ourselves as Developers", which focused on TypeScript and its type-checking system. Stefan Baumgartner, author of popular books about the JavaScript and TypeScript languages, presented some problems that developers can encounter and how to solve them. The central idea of the talk was that developers' task is not just writing code but also making decisions, building architecture, and evaluating trade-offs between different solutions.
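A classic example of such a "lie", in the spirit of the talk (our own example, not Stefan's): an `as` cast tells the compiler a shape that the runtime never checks.

```ts
interface Config {
  retries: number;
}

const raw = '{"retris": 3}'; // the typo is in the data, not in the types

// The cast silences the compiler, but nothing validates the shape.
const config = JSON.parse(raw) as Config;
console.log(config.retries + 1); // type-checks fine, prints NaN at runtime

// The decision the talk points at: spend the effort to validate at the boundary.
function isConfig(x: unknown): x is Config {
  return (
    typeof x === 'object' &&
    x !== null &&
    typeof (x as { retries?: unknown }).retries === 'number'
  );
}
```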

One of the talks that stood out was "The Quantum Leap: Redefining Computing and Its Applications", presented by Tomislav Tipurić, Chief Technology Officer at Nephos. This talk explored the potential of quantum computing: these revolutionary machines can perform complex calculations in parallel, solving problems beyond the reach of classical computers in a fraction of the time. The potential of quantum computing is huge, and the group is still curious about what it could do for the future.

Afterward, they attended "React from Another Dimension", a talk by Dan Abramov, the co-author of Redux and one of the key figures behind React. In this captivating session, he took the audience on a journey beyond the conventional realm of React, exploring new paradigms and pushing the boundaries of what this popular library can achieve. The talk began with an overview of React's evolution over the years, showcasing how the library has adapted to meet the ever-changing needs of modern web development. The true essence of the talk, however, lay in its exploration of uncharted territory, imagining what could have happened if JavaScript and React had been designed as server-side technologies instead of client-side ones.
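A rough sketch of that server-first direction, in the style of React Server Components (names and the data layer are illustrative, not from the talk): the component runs on the server, reads data directly, and ships only markup to the client.

```tsx
import { db } from './db'; // hypothetical server-side data access

// An async component evaluated on the server: no client-side fetching,
// no loading spinners; the rendered HTML is streamed to the browser.
export default async function PostList() {
  const posts: { id: number; title: string }[] =
    await db.query('SELECT id, title FROM posts ORDER BY id');
  return (
    <ul>
      {posts.map((p) => (
        <li key={p.id}>{p.title}</li>
      ))}
    </ul>
  );
}
```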

The closing keynote took all attendees on a journey from the MS-DOS era to the frontiers of generative AI and quantum computing. It summarized all the main takeaways from the conference, emphasizing the progress made by technology as well as encouraging the audience to continue embracing innovation and the collaborative spirit of the developer community.

Goodbye WeAreDevelopers…for now!

The group of Bitrockers had an amazing time at the WeAreDevelopers World Congress 2023 in Berlin. The event was not just another developer conference; it was a unique and amazing experience that celebrated developers and provided a platform for knowledge sharing, networking, and fun. They recognized that sharing is an important part of their job and, during their time in Berlin, took advantage of every possible moment to do so. They are eager to keep growing their skills as developers and are already looking forward to attending next year's edition!

Author: Erik Fazio, Frontend Web Developer at Bitrock

Read More
Bitrock Smart Hackathon 2023

The Bitrock Smart Hackathon 2023, organized in partnership with our sister company ProActivity, ended with an exclusive Pitching & Award Event at Museo Nazionale della Scienza e della Tecnica in Milan. It was the perfect opportunity to meet the participants and spend some quality time with our colleagues, partners and clients.

The challenge

The hackathon challenge consisted of developing a smart tool to help employees who work remotely maintain social relations, stay motivated and collaborate with their colleagues.

The topic, i.e. how technology can enable and strengthen social bonds, attracted great interest. We are really satisfied with the results achieved: more than 150 registered participants, 17 competing teams and a lot of inspiring and interesting ideas on the topic.

The Winners

The final event was the icing on the cake, and we have to thank the hackathon’s organizing team, Mentors and Judges, who all contributed to the success of the event.

A special mention goes to our external Jury, composed of Michela Bianchi, Chief People and Sustainability Officer at Moneyfarm; Paolo Zilioli, Director IT Eyecare Systems at Luxottica; and Federico Cella, journalist for Corriere della Sera and our special host of the award ceremony.

For all those who couldn't be there, we would like to describe the winning projects that best captured the hackathon's aim: recovering human balance in a digital world.

The first-place team, TATEAM (Niko Zarzani, Costanza Pollastrelli and Daniele Mariotto), won the hackathon with their app Workie-Talkie, a dynamic workspace where employees can effortlessly collaborate on projects, exchange knowledge, and foster strong relationships.

The mobile app offers several features that promote and reward positive behavior: it suggests breaks with colleagues who are available at that moment, lets you create and join thematic rooms for voice-chat conversations, and awards points for interacting and receiving reactions. The app also offers advice on managing your workspace and concentration, and promotes physical health with notifications for active breaks, stretching exercises and daily quizzes.

The second team on the podium was Team JM, with their innovative platform Meetrock: a solution that combines social, machine learning and AI technologies to let you and your colleagues organize, join and enjoy unique experiences both inside and outside the office.

With Meetrock you can easily create any type of event you want and let other people join with zero effort and no spam emails. Once the event is finished, every participant can send a report about the organizer and other participants to let Bitrock HR know if anything went wrong, guaranteeing continuous improvement and making the event always better than the previous one.

The last project we want to talk about is Social Buddy Bot, developed by 42 Monkeys, which ranked third at the Bitrock Smart Hackathon.

Social Buddy Bot is an application that helps remote coworkers socialize. It integrates with the most widely used communication platforms and uses AI to encourage coworkers to do activities together.

Users' activity is analyzed and, based on their interactions, the system reaches out to them: in case of a low rate of interaction, it asks if everything is ok and tries to get the participant to interact with others. The aim of this bot is, given the appropriate amount of time, to make people more comfortable with remote-work interaction.

Taking stock of our first virtual hackathon, we are really happy to have organized this type of event… we don’t want to spoil too much, but stay tuned to discover what we have in store for you next!

Read More
Kafka Summit 2023

The Kafka Summit 2023, held recently, brought together a diverse group of professionals, enthusiasts, and experts in the field of data streaming and event-driven architectures. This year’s summit was an exceptional gathering, filled with insightful discussions, cutting-edge demonstrations, and valuable networking opportunities. Of course, Bitrock’s Engineering team couldn’t miss it, and here are the key insights from the event.

During the keynote presentation, Jay Kreps, Confluent Co-founder & CEO, presented a rundown of enhancements coming to Kafka over the next year and beyond.

After the ZooKeeper removal in favor of KRaft (KIP-866), available from Confluent Platform 7.4.0, another big surprise announced was KIP-932, Queues for Kafka, which allows many consumers to read from the same partition, enabling use cases like classic pub/sub queues. This will be made possible thanks to the introduction of share groups and the acknowledgment of single records in the Kafka consumer protocol.
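To see why this matters, here is today's model in a nutshell, sketched with the kafkajs client (broker addresses and names are placeholders; the share-group API itself is not released yet): within one consumer group, partitions are split among consumers, so each partition is read by at most one member.

```ts
import { Kafka } from 'kafkajs';

async function main() {
  const kafka = new Kafka({ clientId: 'demo', brokers: ['localhost:9092'] });
  const consumer = kafka.consumer({ groupId: 'orders-workers' });

  await consumer.connect();
  await consumer.subscribe({ topics: ['orders'] });
  await consumer.run({
    // Each partition is delivered to exactly one consumer in the group;
    // share groups (KIP-932) will instead let several consumers share a
    // partition, acknowledging individual records.
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}] ${message.value?.toString()}`);
    },
  });
}

main().catch(console.error);
```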

Jay also unveiled Confluent’s Kora Engine, the Apache Kafka engine built for the cloud. Kora is the engine that powers Confluent Cloud as a cloud-native, 10x Kafka service, bringing GBps+ elastic scaling, guaranteed reliability, and predictable low latency to 30K+ clusters worldwide.

Another important announcement made at Kafka Summit 2023 in London is the upcoming Apache Flink-powered stream processing offering in Confluent Cloud, expected in winter 2023. The recent acquisition of Immerok by Confluent has positioned this data streaming giant to offer both streaming storage (via Apache Kafka) and streaming computation (via Apache Flink) capabilities.

After the keynote, the Bitrock Engineering team attended several talks during the summit. A very interesting one presented and explained the benefits of the new consumer rebalance protocol: KIP-848, The Next Generation of the Consumer Rebalance Protocol.

The current rebalance protocol has several issues. One is that most of the logic lives client side (a "fat client"): session timeouts and intervals, for example, are defined on the client. Its main pain point, however, is that the protocol stops processing new messages ("stop the world") while executing a rebalance, and faulty group members can cause issues for the whole consumer group. The new protocol was designed with three goals in mind: server-side logic, a new consumer protocol, and incremental rebalancing. The new reconciliation protocol has three main phases (a simplified sketch follows the list):

  • The group coordinator, on the server side, receives the partition assignments from the members and computes a new target assignment, for example when a new member joins.
  • The group coordinator communicates which partitions should be revoked, and the consumers acknowledge.
  • The freed partitions can be assigned to the new member of the consumer group.
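Here is that reconciliation loop as a simplified sketch (our reading of the KIP, in TypeScript pseudocode, not actual broker code; phase 1, computing the target assignment, is assumed to have already happened):

```ts
type Assignment = Map<string, Set<number>>; // memberId -> owned partitions

declare function revoke(member: string, partitions: number[]): void;
declare function assign(member: string, partitions: number[]): void;

function reconcile(current: Assignment, target: Assignment): void {
  // Phase 2: each member is asked to give up only the partitions that
  // moved; it keeps processing the rest, so there is no stop-the-world.
  for (const [member, owned] of current) {
    const wanted = target.get(member) ?? new Set<number>();
    const toRevoke = [...owned].filter((p) => !wanted.has(p));
    if (toRevoke.length > 0) revoke(member, toRevoke);
  }
  // Phase 3: once revocations are acknowledged, freed partitions are
  // handed to their new owners (e.g. the member that just joined).
  for (const [member, wanted] of target) {
    const owned = current.get(member) ?? new Set<number>();
    const toAssign = [...wanted].filter((p) => !owned.has(p));
    if (toAssign.length > 0) assign(member, toAssign);
  }
}
```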

During the evening party, our colleagues enjoyed a beer and international food while watching the performance by Sam Aaron, live-coding musician and creator of Sonic Pi, whose futuristic music sets are improvised by manipulating live code.

The Kafka Summit 2023 was an outstanding event that showcased the advancements and future directions of the Kafka ecosystem, which continues to be a driving force in enabling real-time data streaming and event-driven architectures in an increasingly data-centric world.

For sure we will not miss next year's edition! If you’d like to get more details about the Kafka Summit 2023, we invite you to also read the article from our sister company, Radicalbit.

Authors: Matteo Gazzetta, Team Lead Engineering at Bitrock and Daniele Bonelli, Team Lead Engineering at Bitrock

Read More

On May 16, we had the great pleasure of attending the 10th edition of Cloud Conf 2023, held in Turin: a gathering of cloud computing enthusiasts with the opportunity to network, engage and establish connections with like-minded professionals from around the globe.

Throughout the conference, a wide range of topics, from Scalability, IoT, Machine Learning, Containers, Microservices, Automation and Serverless Architecture to Cloud Security, was covered in interactive workshops and technical sessions. In this report, we will delve deeper into the key takeaways and notable sessions from the conference, highlighting insights and emerging trends shaping the cloud landscape.

The first keynote was ‘Developer Joy – How great teams get s%*t done’ by Sven Peters, Developer Advocate @ Atlassian. The main goal was to answer the question: what makes a team of developers more productive and happier? All considerations led to the idea that "developer joy" lies at the intersection of quality, progress, and values.

In the second keynote ‘From complexity to observability using OpenTelemetry’, Danilo Poccia, Chief Evangelist (EMEA) @ Amazon Web Services, showed through an end-to-end example how to use OpenTelemetry to instrument and collect telemetry data such as traces and metrics to build observable applications.

Another talk focused on cloud security was the one by Rob Barnes, Senior Developer Advocate @ HashiCorp: “Migrating your security mindset to the cloud”. Barnes showed how to implement symmetric and asymmetric encryption of your application data.

In the following sessions, many other topics were discussed. Just to name a few, Anahit Pogosova, Lead Cloud Software Engineer @ Solita, presented in her interesting talk ‘The Hidden Hero Behind the Serverless Superstar or Top 5 Cool Things Lambda Can Do For You’ how to implement and tune asynchronous architectures built in the Amazon cloud using Kinesis and Lambda.

Ruben Casas, Staff Engineer @ Postman, in his speech ‘Micro-Frontends: The Evolution of Frontend Architecture’, guided us through the evolution of frontend architecture at scale, analyzing the differences between monoliths, monorepos and micro-frontends.

We must highlight other contributions from Alberto Massidda, Production engineer @ Meta (Facebook) with his talk on ‘ChatGPT: Large Language Models explained well’ and Abdel Sghiouar, Cloud Developer Advocate @ Google Cloud on how to ‘Secure your software supply chain from dependencies to deployment’.

Once again, Cloud Conf 2023 in Turin was undoubtedly a resounding success, providing a platform for learning, collaboration, and inspiration. It showcased the incredible potential of cloud computing and its transformative impact on industries worldwide, offering a comprehensive overview of the current state and future prospects of cloud computing.

Are you interested in learning more about the rich lineup of keynote speakers and the captivating Cloud Conf 2023 program? We invite you to visit their website https://2023.cloudconf.it/ for all the details and to learn from the best!

Keep following us on our Blog and social media channels to discover our upcoming events!

Author: Danilo Ventura, Senior Software Engineer at Bitrock.

Read More

Here we are, excited after joining the CloudConf 2022 event in Turin. This year’s edition, finally organized as an in-person conference after the Covid-19 pandemic, saw hundreds of professionals and developers from all around Italy gathering at the local Lingotto Congress Center.

Now: what is CloudConf? To put it briefly: a day full of engaging activities, networking, and talks dedicated to Cloud Computing. Among the different sub-topics (from development to operations), Sustainability & Green Computing were definitely in the spotlight: the main goal was to evaluate strategies, actions and design choices in order to achieve better energy efficiency in the solutions adopted when dealing with the Cloud.

Going into detail, our day started with the opening keynote by F. Bertani, Associate Solution Architect @ AWS. In his talk “Good old serverless, now in sustainable green”, he showed the audience different techniques to improve performance and reduce costs within the AWS Lambda serverless ecosystem. An interesting starting point is the use of Rust to reduce the footprint of applications, as well as the use of good old profilers, especially AWS CodeGuru, where the use of JVM-based languages is unavoidable.

Another talk focused on energy efficiency was the one by A. Vivaldi, DevOps Architect @ Vista Technology: “Observability and security in the days of containers”. Vivaldi showed the audience how the use of eBPF can enormously broaden horizons when dealing with system observability, while not compromising performance.

In the following sessions, many other topics were discussed. Just to name a few, A. Massidda, Production Engineer @ Meta, presented in his interesting talk “Similarity Detection in Online Integrity” the evolution of solutions to moderate content (especially visual content) within online communities. For instance, Massidda showed the Machine Learning and Artificial Intelligence solutions developed by Microsoft and Meta to fight child sexual abuse, revenge porn and many other illegal activities.

Equally interesting was the brilliant contribution by W. Dal Mut, Solution Architect @ Corley. In his talk “How to build a realtime system with AWS IoT Core”, Dal Mut described how to implement a realtime system (such as a messaging system) using an AWS solution designed for IoT but equally applicable to use cases very different from Industry 4.0 and home automation. The use of AWS IoT Core as a websocket server completely managed by the vendor, in particular, leads to interesting integration possibilities.

Among the contributions purely dedicated to architecture and development, we must highlight the one by U. Lattanzi, Software Services Team Leader. Lattanzi showed the foundations for developing scalable microservice and cloud-oriented architectures, pointing out that a good architecture is the result of a team (architect, cloud engineer, data engineer, security and dev leaders) working together on a continuously evolving project. The speaker also shared the personal checklist he uses to evaluate the design of the microservices implemented by his team. Of course, this list is not a master key that can be used unconditionally in all situations; nevertheless, we should take it as good advice.
Last but not least, we were impressed by F. Sciuti, CEO @ Devmy, even though his talk was more focused on front-end topics. More specifically, Sciuti delved into the inner workings of modern browsers: from the complexity hiding behind their ease of use to their ever-increasing need for hardware resources. What about the connection to the Cloud? In this case we must talk about “browser isolation”: an upcoming technology in which the final user interacts with the Web through “light browsers” that only display content and send interactions to a “centralized” browser (which may be in the cloud). The latter does the heavy lifting, granting a higher level of security within company intranets and less waste of resources.

To find out more about the event’s key sessions and top speakers, please visit https://2022.cloudconf.it/

Keep following us on our Blog and social media channels to discover our upcoming events!

Authors: Simone Ripamonti, Team Lead Engineering @Bitrock - Danilo Ventura, Sr Software Engineer @ProActivity

Read More

Last month we had the chance to attend the amazing Kafka Summit 2022 event organized by Confluent, one of Bitrock’s key Technology Partners.

Over 1500 people attended the event, which took place at the O2 in east London over two days of workshops, presentations, and networking.

There was lots of news regarding Kafka, the Confluent Platform, Confluent Cloud, and the ecosystem altogether. An incredible opportunity to meet so many enthusiasts of this technology and discuss what is currently happening and what is on the radar for the upcoming future.

Modern Data Flow: Data Pipelines Done Right

The opening keynote of the event was hosted by Jay Kreps (CEO @ Confluent). The main topic (no pun intended :D) of the talk revolved around modern data flow and the growing need to process and move data in near real time.

From healthcare to grocery delivery, a lot of applications and services we use everyday are based on streaming data: in this scenario, Kafka stands as one of the main and most compelling technologies. The growing interest in Kafka is confirmed by the numerous organizations that are currently using it (more than 100,000 companies) and by the amount of interest and support that the project is receiving. The community is growing year after year: Kafka meetups are very popular and numerous people express a lot of interest in it, as proved by the countless questions asked daily on StackOverflow and the large number of Jira tickets opened on the Apache Kafka project.

Of course, this success is far from accidental: if it is true that Kafka is a perfect fit for the requirements of modern architectures, it is also important to remember how many improvements were introduced in the Kafka ecosystem that helped create the image of a very mature, reliable tool when it comes to building fast, scalable, and correct streaming applications and pipelines.

This can be seen, for instance, in the new features introduced in Confluent Cloud (the Confluent solution for managed Kafka), such as the new Stream Catalog and Lineage system, which enhance the documentation and monitoring of the streaming pipelines running in the environment. These two features provide an easy way to identify and search the different resources and data available in the environment, and to see how this data flows inside the system, improving the governance and monitoring of the platform.

Kafka Summit 2022 - Keynote (London O2)

The near future of Kafka - Upcoming features

Among all the numerous upcoming features in the ecosystem presented during the event, there are some that we really appreciated and had been awaiting for quite some time.

One of these is KIP-516, which introduces topic IDs to uniquely identify topics. As you may know, since the very beginning (and this still holds today) the identifier for a topic has been its name. This has some drawbacks, such as the fact that a topic cannot be renamed (for instance, when you would like to update your naming strategy), since renaming would require deleting and recreating the topic, migrating its whole content, and updating all the producers and consumers that refer to it. An equally annoying issue arises when you want to delete a topic and recreate another one with the same name, with the goal of dropping its content and applying different configurations. In this scenario too we currently face issues, since Kafka does not immediately delete the topic, but schedules a deletion that must propagate through the cluster with no certainty about when it will actually complete. This makes the operation, as of today, not automatable (our consultants have often faced this limitation in client projects).
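The racy pattern looks like this when sketched with the kafkajs admin client (broker and topic names are placeholders):

```ts
import { Kafka } from 'kafkajs';

async function recreateTopic() {
  const admin = new Kafka({ clientId: 'admin', brokers: ['localhost:9092'] }).admin();
  await admin.connect();

  // Deletion is only scheduled: it propagates through the cluster
  // asynchronously, so the createTopics call below can fail or briefly
  // resolve to the old, not-yet-removed topic. Name-based identity is
  // the root cause that KIP-516's topic IDs address.
  await admin.deleteTopics({ topics: ['events'] });
  await admin.createTopics({
    topics: [{ topic: 'events', numPartitions: 12 }],
  });

  await admin.disconnect();
}

recreateTopic().catch(console.error);
```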

The second long-awaited feature is the possibility to run Kafka without ZooKeeper. At first, it was very useful and practical to take advantage of the distributed configuration management capabilities provided by ZooKeeper (especially in processes like controller election or partition leader election). Over the past years, however, Kafka has incorporated more and more of this functionality itself, and maintaining a ZooKeeper cluster in addition to the Kafka one now feels like an unnecessary effort, risk and cost. As of today, this feature is not yet production-ready, but we can say that it’s pretty close. Indeed, Confluent has shared the plan, and we are all waiting for this architecture simplification to arrive.

The third upcoming feature that we found extremely valuable is the introduction of modular topologies for ksqlDB. ksqlDB is relatively recent in the Kafka ecosystem, but it has good momentum, given its ability to express stream transformations with minimal effort in SQL-like statements, without the need to create dedicated Kafka Streams applications full of boilerplate that must then be maintained.

ksqlDB will not cover the most elaborate Kafka Streams use cases but, for a good number of them, it is an excellent solution. The introduction of modular topologies will simplify the management of streams inside ksqlDB as well as its scalability (which is currently limited in some scenarios).

Our Insights from Breakout Sessions & Lightning Talks

The inner beauty of tech conferences lies in the talks, and Kafka Summit was no different!

Indeed, it was not only the feature announcements that caught our attention during the event, but also what was presented during the various breakout sessions and talks: an amazing variety of topics gave us plenty of opportunities to dig deeper into the Kafka world.

One of the sessions that we particularly enjoyed was, for sure, the one led by New Relic (“Monitoring Kafka Without Instrumentation Using eBPF”). The contribution focused on an interesting way of monitoring Kafka and Kafka-based applications using eBPF, without the need for instrumentation. The speaker, Antón Rodríguez, ran a cool demo of Pixie, in which it was very easy to see what is going on with our applications. It was also easy to get a graphical representation of the actual topology of the streams and all the links from producers to topics and from topics to consumers, making it easy to answer questions like “Who is producing to topic A?” or “Who is consuming from topic B?”.

Another session that we particularly enjoyed was the talk by LinkedIn (“Geo-replicated Kafka Streams Apps”): Ryanne Dolan outlined some strategies to deal with geo-replicated Kafka topics, in particular in the case of Kafka Streams applications. Ryanne gave some precious tips on how to manage the replication of Kafka topics in a disaster recovery cluster to guarantee high availability in case of failure, and on how to develop Kafka Streams applications that work almost transparently in both the original cluster and the DR one. The talk was also a great opportunity to highlight Kafka's high scalability in a multi-datacenter scenario, where different clusters can coexist, creating a sort of layered architecture composed of a scalable ingestion layer that fans out data to different geo-replicated clusters transparently for the Kafka Streams applications.

Conclusions

Undoubtedly, the event has been a huge success, bringing the Apache Kafka community together to share best practices, learn how to build next-generation systems, and discuss the future of streaming technologies.

For us, this experience has been a blend of innovation, knowledge, and networking: all the things we missed from in-person conferences were finally back. It was impressive seeing people interact with each other after two years of social distancing, and we could really feel that “sense of community” that online events can only partially deliver.

If you want to know more about the event and its main topics - from real-time analytics to machine learning and event streaming - be sure to also check the dedicated Blog post by our sister-company Radicalbit. You can read it here.

Authors: Simone Esposito, Software Engineer @ Bitrock - Luca Tronchin, Software Engineer @ Bitrock

Read More

A Joint Event from Bitrock and HashiCorp

Last week we hosted an exclusive event in Milan dedicated to the exploration of modern tools and technologies for the next-generation enterprise.
The first event of its kind, it was held in collaboration with HashiCorp, the US market leader in multi-cloud infrastructure automation, following the partnership we signed in May 2020.

HashiCorp's open-source tools Terraform, Vault, Nomad and Consul enable organizations to accelerate their digital evolution, as well as adopt a common cloud operating model for infrastructure, security, networking, and application automation.
As companies scale and increase in complexity, enterprise versions of these products enhance the open-source tools with features that promote collaboration, operations, governance, and multi-data center functionality.
Organizations must also rely on a trusted Partner that is able to guide them through the architectural design phase and who can grant enterprise-grade assistance when it comes to application development, delivery and maintenance. And that’s exactly where Bitrock comes into play.

During the Conference Session, the Speakers had the chance to describe to the audience how large companies can rely on more agile, flexible and secure infrastructure thanks to HashiCorp’s suite and Bitrock’s expertise and consulting services, especially when it comes to the provisioning, protection and management of services and applications across private, hybrid and public cloud architectures.

“We are ready to offer Italian and European companies the best tools to evolve their infrastructure and digital services. By working closely with HashiCorp, we jointly enable organizations to benefit from a cloud operating model.” – said Leo Pillon, Bitrock CEO.

After the Keynotes, the event continued with a pleasant Dinner & Networking night at the fancy restaurant by Cascina Cuccagna in Milan.
Take a look at the pictures below to see how the event went on, and keep following us on our blog and social media channels to discover what other incredible events we have in store!

Read More
Corporate Event

Last week our Team gathered for our first corporate event of 2021, which turned out to be a great success for at least three different reasons.

To begin with, this was the first live event after the lengthy Covid-19 emergency. With proper precautions and respecting social distancing norms, we were able to meet in person at the event location - a cool and fancy restaurant in Milan - laughing, eating, and drinking together (as every proper event requires).

This occasion allowed many people to finally get together face to face: as we all know, seeing each other via a computer screen may be fun and necessary these days, but meeting colleagues in the “real” world, shaking hands, sharing laughs and jokes is another story!

Secondly, this was the first official Fortitude Group event, with all team members from Bitrock, Radicalbit and ProActivity participating. A great opportunity to mark the new Fortitude era, after the 2021 Group rebranding.

Last but not least, events of this kind are also important since many colleagues who seldom have the chance to meet, due to allocation on different projects or different geographical locations, can finally spend some time together. During this evening, we finally had all people from Treviso, Lugano, Milano (and many other cities around Italy) together in one spot.

The event started with a welcome aperitif followed by a tasty dinner (typical Milanese cuisine...what else?!). Our CEO Leo Pillon took the chance to greet all participants and deliver a brief talk, addressing the challenges this period has meant for the Group, but also all the great results and success we were able to achieve while working remotely. It is a distinct corporate culture, a sense of togetherness and a clear direction that have fuelled the passion emerging in our daily work.

Curious to know more about the Bitrock world? Look at the pics below to get a taste of our event, and visit our Instagram page to discover much more!

We are now ready to start planning our next big event. Will you join us? 🙂

Read More
React Bandersnatch Experiment

Getting Close to a Real Framework

Huge success for Claudia Bressi (Bitrock Frontend developer) at Byteconf React 2020, the annual online conference with the best React speakers and teachers in the world.

During the event, Claudia had the opportunity to talk about her experiment called “React Bandersnatch” (the name comes from an episode of the Black Mirror TV series, where freedom is represented as a sort of well-designed illusion).

The goal of this contribution is to give anyone who could not join the virtual event the chance to delve into her experiment and main findings, which represent a high-value contribution to the whole front-end community.

Here are Claudia's words, describing the project in detail.

(Claudia): The project starts with one simple idea: what if React were a framework instead of a library for building user interfaces?
For this project, I built some real applications using different packages that are normally available inside a typical frontend framework. I measured a few major web application metrics and then compared the achieved results.

The experiment’s core was the concept of framework, i.e. a platform where it is possible to find ready components and tools that can be used to design a user interface, without the need to search for external dependencies.
Thanks to frameworks, you just need to add a proper configuration code and then you’re immediately ready to go and code whatever you want to implement. Developers often go for a framework because it’s so comfortable to have something ready and nothing to choose.

Moving on to a strict comparison with libraries, frameworks are more opinionated: they can give you rules to follow in your code, and they handle for you the order in which things are executed behind the scenes. This is the case with lazy loading of modules in a big enterprise web application.
On the other hand, libraries are more minimalistic: they give you only the necessary to build applications. But, at the same time, you have more control and freedom to choose whatever package in the world.

However, this can sometimes lead to bad code: it is thus important to be careful and follow all best practices when using libraries.


The Project

As the initial step of my project, I built a very simple web application in React implementing a weekly planner. It consisted of one component showing the week, another showing the details of a specific day, and a few buttons to update the UI, for instance to add events and meetings.

I used React (the latest available release) and TypeScript (in a version that finally lets users employ optional chaining). To style the application I used only .scss files, so I included a Sass compiler (while writing the components, I styled them using the CSS modules syntax).

Then I defined a precise set of metrics, in order to be able to measure the experimental framework. More specifically:

  • bundle size (measured in kilobytes, to understand how much weight each build could reach);
  • loading time (the amount of time needed to load the HTML code in the application);
  • scripting time (the actual time needed to load the Javascript files);
  • render time (the time needed to render the stylesheets inside the browser);
  • painting time (the time for handling media files, such as images or videos).

The packages used for this experiment can be considered the ingredients of the project. I tried to choose both well-known tools from the React scene and some less famous packages that nonetheless have features that can improve overall performance on medium-size projects.

The first implementation can be considered the classic way to structure a React project: Redux for state management and Thunk as the middleware solution. I also used the well-known React router and, last but not least, the popular Material UI for some ready-to-use UI components.
The second application was more sophisticated: Redux combined with the redux-observable package for the middleware part. As for the router, I applied a custom solution, to let me play more with React hooks. As icing on the cake, I took the Ant library to build some UI components.

As for the third application, I blended together the MobX state manager with my previous hook-based custom router, while for the UI part I used the Ant library.
Finally, for the fourth experimental application, I created a rather simple solution with MobX, my hook-based custom router (again) and Material UI to complete the overall framework.


Main Findings

Analyzing these four implementations, what I found is that, as a state manager, Redux has a cleaner and better organized structure due to its functional paradigm. Another benefit is the immutable store, which prevents inconsistencies when data is updated.

On the other hand, MobX allows multiple stores: this can be particularly useful if you need to reuse some data (say, the business-logic part of your store) in a different but similar application with shareable logic.
Another advantage of MobX is the benefit of having a reactive paradigm that takes care of the data updates, so that you can skip any middleware dependency.
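As a minimal illustration of that multi-store, middleware-free style (an illustrative sketch, not the experiment's code):

```ts
import { makeAutoObservable } from 'mobx';

class PlannerStore {
  events: { day: string; title: string }[] = [];

  constructor() {
    // Every field becomes observable and every method an action:
    // mutations trigger reactive updates with no middleware involved.
    makeAutoObservable(this);
  }

  addEvent(day: string, title: string) {
    this.events.push({ day, title });
  }
}

// Multiple independent stores sharing the same business logic,
// reusable across similar applications.
export const workPlanner = new PlannerStore();
export const personalPlanner = new PlannerStore();
```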

Talking about routing, a built-in solution (such as the react-router-dom package) is pretty easy and smooth to apply in order to set up all the routes we need within the application. A custom solution, however, such as our hooks-based router, lets us keep our final bundle lighter than a classic dancer.
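A hooks-based router of that kind can be surprisingly small; here is a minimal illustrative sketch (not the experiment's actual code):

```tsx
import { useEffect, useState } from 'react';

// Track the current pathname and re-render when the history changes.
export function usePathname(): string {
  const [path, setPath] = useState(window.location.pathname);
  useEffect(() => {
    const onPop = () => setPath(window.location.pathname);
    window.addEventListener('popstate', onPop);
    return () => window.removeEventListener('popstate', onPop);
  }, []);
  return path;
}

// Programmatic navigation: push a history entry and notify listeners.
export function navigate(to: string): void {
  window.history.pushState(null, '', to);
  window.dispatchEvent(new PopStateEvent('popstate'));
}

// Usage sketch:
//   const path = usePathname();
//   return path === '/week' ? <WeekView /> : <DayView />;
```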

Moving on to the UI side of the framework, we can say that Material is a widespread web styling paradigm: its rules are easy to apply and the result is clean, tidy and minimalistic.
However, from my personal point of view, it is not so ‘elegant’ to pollute JavaScript code with CSS syntax; this is why I preferred to keep things separated. I then searched for another UI library and found Ant, which is written in TypeScript with predictable static types, a real benefit for my applications. However, in the end, we can say that Material UI is lighter than Ant.
Anyway, both packages allow you to import only the necessary components for your project. So, in the end, which library to use for the UI components is a simple matter of taste (but if you care more about performance, go for Material!).


Comparing the Results

As the final step, I compared the achieved results for each of the above-mentioned metrics.
From the collected results, it is quite clear that the most performant application is the one with the lightest bundle size: using Redux coupled with Thunk, Material UI for the UI components and our custom router, the resulting application has a smaller output artifact and optimized values for loading, scripting and render times.

To cut a long story short, the winner of this experiment is application no. 4.

However, for bigger projects the results may vary.


Ideal Plugins

Frameworks sometimes offer useful auxiliary built-in plugins, such as a CLI to easily run commands or automate repetitive tasks, which can in turn improve developers' lives. That's why some of my favorite tools available for React (which I would include in an ideal React framework scenario) are:

  • the famous React and Redux extensions for Chrome: I found these as essential and useful as a ruler is for an architect;
  • the eslint-plugin written specifically for React, which makes it easier to stay consistent with the rules you want to keep in your code;
  • Prettier, another must-have plugin if you use VS Code, which helps a lot in getting better formatted code;
  • React Sight, a Chrome extension that shows a graph of the overall structure to help you analyze your components;
  • last but not least, the popular React extension pack for VS Code, which offers lots of automated developer actions, such as hundreds of code snippets, IntelliSense and file search within node modules.


If you want to listen to Claudia’s full speech during the event, click here and access the official video recording available on YouTube.

Read More
Bitrock DevOps Team joining HashiCorp EMEA Vault CHIP Virtual Bootcamp

Another great achievement for our DevOps Team: the opportunity to take part in the HashiCorp EMEA Vault CHIP Virtual Bootcamp.

The Bootcamp, coming for the first time to the EMEA region, involves highly skilled professionals who already have experience with Vault and want to become Vault CHIP (Certified HashiCorp Implementation Partner) certified for delivering on Vault Enterprise.

Our DevOps Team will be challenged with a series of highly technical tasks to demonstrate their expertise in the field: three full days of training that will get them ready to deliver in customer engagements.

This comes after the great success of last week, which saw our DevOps Team members Matteo Gazzetta, Michael Tabolsky, Gianluca Mascolo, Francesco Bartolini and Simone Ripamonti successfully obtaining the HashiCorp Vault Associate certification. A source of pride for the Bitrock community and a remarkable recognition of our DevOps Team's expertise and know-how worldwide.

With the Virtual Bootcamp, the Team is now ready to raise the bar and take on a new challenge, proving that there's no limit to self-improvement and continuous learning.


HashiCorp EMEA Vault CHIP Virtual Bootcamp

May 5–May 8, 2020

https://www.hashicorp.com/

Read More