The 6 Most Important Things I Have Learned in My 6 Months Using Serverless



Finding the right tools is of utmost importance in the world of serverless. October was an eye-opener for me and my company: after the Serverlessconf tour, I decided there and then that my company would go serverless.

The first two months were a nightmare as we struggled to familiarize ourselves with the new approach. Six months down the line, we are deploying a major project serverlessly, the fourth one in the company.

This article explains how we did it and provides the six most important lessons we learned in the process.

  1. The middle layer has to go

We have been developing web apps forever. This is one of the reasons it took us so long to recognize a very obvious advantage of serverless.

Some of our first web apps still had a Node Express layer. That layer held session state, either by accident or by a tragedy of design in which we misused DynamoDB to remember sessions.

During the first phase of the transition, the middle layer simply served as a web server running on Lambda. We later discovered this was wrong and unmaintainable: we had lifted the old architecture into a new environment. The better end state is static HTML pages full of JavaScript calling REST APIs directly. In the end, we got rid of the middle layer entirely, and that is what should happen in serverless.
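With the Express layer gone, each endpoint becomes a small Lambda function behind API Gateway returning JSON. A minimal sketch (the handler name and query parameter are illustrative, not from our codebase):

```javascript
// Hypothetical Lambda handler for an API Gateway proxy integration.
// The static HTML/JS front end calls this endpoint directly; there is
// no Express server and no session state anywhere in the middle.
async function handler(event) {
  // API Gateway sets queryStringParameters to null when absent
  const name = (event.queryStringParameters || {}).name || 'world';
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}` }),
  };
}

// In a real project this would be wired up as `exports.handler = handler`.
```

The client-side JavaScript simply fetches this URL and renders the JSON, which is the whole point: the "middle" is reduced to stateless functions.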

  2. Love DynamoDB

Getting to grips with DynamoDB is arguably the hardest part of going serverless. You hit a few difficulties in your first interactions, and most of the time you want to retreat to the more familiar RDS. Structured Query Language (SQL) has served me well for as long as I can remember, and I have put a lot of business logic into databases. RDBMS systems, however, do not scale well, and they do not suit agile systems that evolve organically.

DynamoDB is on a whole new level. The benefits of getting this NoSQL database right include massive scale, very fast performance, and zero administrative overhead. But there are sharp edges: table fields may not contain empty strings, getting the partition and sort keys wrong forces you to start over, and emulating SQL queries too closely leaves you with far too many tables.

After several attempts and finally succeeding with DynamoDB, I learned that:

  • Do not just jump into it without getting the facts right. Understand how it works first, lest you get disappointed and move back to an RDBMS.
  • It has extremely powerful tools. You can use streams to attach code to table events.
  • It can feed other storage systems. It can also be used to protect other databases from enormous data volumes.
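The last two points rest on DynamoDB Streams: code attached to table events can fan data out to other systems. A sketch of such a stream consumer, with made-up table attributes (`total` as a numeric field on inserted items):

```javascript
// Hypothetical DynamoDB Streams consumer. Lambda receives batches of
// INSERT/MODIFY/REMOVE records; here we collect totals from newly
// inserted items, e.g. to feed them into another storage system.
async function streamHandler(event) {
  const totals = [];
  for (const record of event.Records) {
    if (record.eventName !== 'INSERT') continue;
    const image = record.dynamodb.NewImage; // DynamoDB JSON format
    totals.push(Number(image.total.N));     // numbers arrive as strings
  }
  return totals;
}
```

Because the handler only reads the event payload, it is trivial to test locally with a hand-built event before wiring the real stream to it.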
  3. Authorization is key

Traditionally, you authenticated a user once and then tracked them via a session ID, which controlled access. This was time-saving, as you only did the hard work once. But the approach has problems. First, it only works with a server in the middle, and in serverless that server has been burned to the ground. Second, it exposes you to malicious attacks like CSRF. It also makes passing identity on to other services very difficult.

Nobody likes CSRF attacks. This is where the JWT comes in. Here is an illustration of how it works:

  • Step 1: get a JWT. You are issued a JWT after authenticating.
  • Step 2: use it to communicate with the services you write.

    Below are some reasons why you should use JWT:

    • The client can talk to more than one serverless service
    • Every request is authenticated
    • It is secure
    • It is anti-monolith
    • It is CSRF-free

    The only thing your serverless code needs is a custom authorizer to determine whether the token in the header is valid.

    However, JWT makes all other types of auth look complex, so we shifted to Auth0, which issues the tokens for us. Serverless auth turns out to be extremely simple and very effective.

  4. Bye, Python

While Flask was a nice framework in the traditional server world, it is not suitable for the new one.

To support rich interactions, you have to move more work to the client side, which leaves you no option but JavaScript. The result is JavaScript inlined into your Python templates.

Our Flask solutions had increasingly become a clumsy mess of two languages. After turning to Node, things were much more convenient: one language everywhere. A simple Node configuration lets you use ES6, which does away with JavaScript's clumsier constructs. And that is how we ditched Python and joined the JavaScript bandwagon.

People using Python will boast about its impressive language features, but that is nothing compared to the charms of JavaScript.

  5. Enjoy the Vue while it lasts

I was introduced to React when I entered the world of Single Page Applications. React is the most popular approach to building user interfaces. It is great, but its drawbacks include a steep learning curve, Webpack setup, and the introduction of JSX. We chose to explore alternatives, since it was too heavyweight for our immediate needs.

I soon discovered Vue.js, and my serverless life turned around. You can learn the essentials of Vue in just a few days. Other significant advantages of Vue include:

  • Everything is a component that manages its own content, design, and code. This makes it easy to manage our clients' projects, of which there are quite a number.
  • The open-source framework provides powerful debugging tools, a Webpack-based build, and great project organization.
  • Vue lets you build a desktop-like application experience within the browser, which improves the user experience.
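To make the first point concrete, here is a Vue component sketched as a plain options object (the component name and fields are illustrative). Everything the component needs, its state, behavior, and markup, lives in one place:

```javascript
// Illustrative Vue component as an options object. In a real project
// this would typically live in a .vue single-file component.
const ProjectCard = {
  name: 'ProjectCard',
  props: ['title'],
  // data() returns a fresh state object per component instance
  data() {
    return { expanded: false };
  },
  methods: {
    toggle() {
      this.expanded = !this.expanded;
    },
  },
  template: '<div class="card" @click="toggle">{{ title }}</div>',
};
```

Because the options object is plain JavaScript, its state and methods can be unit-tested without mounting anything in a browser.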


In the old world, we deployed apps through Elastic Beanstalk, then monitored their utilization and managed the infrastructure. With SPAs, deploying an application means copying index.html, bundle.js, and a few file dependencies to an S3 bucket fronted by a CloudFront distribution. This gives you steady loading and distribution behavior, and it enables multi-version management.

There’s zero app infrastructure management.

  6. The Serverless Framework

In my first days with Lambda, I coded directly in the AWS console. I was often stressed, as it took a lot of work and mistakes to get tasks done. It took me quite some time to see what was missing: a bridge between my IDE and the production environment.

All along, the solution had been the Serverless Framework. A single sls deploy bundles up your precious code and ships it straight to Amazon. If your code is misbehaving and you need to check the logs, type sls logs -f functionname -t.

This is a wonderful experience, and the Serverless Framework team's contribution should be applauded.


Cheers to new beginnings!

After attending A Cloud Guru’s serverless conference, we felt that this was clearly an unexplored area with limitless potential.

During our first experiments, we had our fair share of failures, and the results weren't satisfying. A few months after getting the right stack in place, we started officially delivering projects in a 100% serverless way. All the migration difficulties and failures along the way were worth it.

We have begun a journey of building SPA apps that run on serverless infrastructure and scale effortlessly. The apps also cost 70-90% less, and the payoff is incredible. Serverless technology is going to revolutionize how applications are delivered in the cloud.


Reasons Why DynamoDB is Not for Everyone



Amazon Dynamo was created in 2004 to cope with the growth of Amazon's Oracle database infrastructure. The aim was to meet the business's requirements for scalability, performance, and reliability.

In 2012, AWS announced the availability of DynamoDB as a fully managed NoSQL database service, promising seamless scalability.

Why choose DynamoDB?

I interviewed a number of developers and engineers about their experience using DynamoDB. Even though this database service has many success stories, it has also left behind many failed implementations. To fully understand why DynamoDB succeeds in some areas and fails in others, you first have to understand the tension between two of its greatest promises: scalability and simplicity.

DynamoDB is simple to use until it refuses to scale

Throwing data into DynamoDB is the easiest thing you can do. There is no logging in to servers and no cluster to set up; AWS handles all of that. To start using the service, you just turn a knob, grab an SDK, and sling JSON.

However, as simple as DynamoDB is to interact with, designing a data architecture for it is difficult. It works beautifully for retrieving individual records by key lookup. Where complex scans and queries are involved, you need to design your indexes carefully, even if the amount of data isn't huge and you are familiar with NoSQL design principles.
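To illustrate what "designing carefully" means: if the keys are chosen well, a question like "this customer's orders in March" becomes a single efficient Query rather than a table Scan. A sketch of the request parameters in DocumentClient style, with made-up table and attribute names:

```javascript
// Hypothetical key design: partition key = customerId, sort key = ISO
// timestamp. Date-range questions then map directly onto the sort key.
function ordersInRange(customerId, fromIso, toIso) {
  return {
    TableName: 'Orders',
    KeyConditionExpression: 'customerId = :c AND createdAt BETWEEN :from AND :to',
    ExpressionAttributeValues: {
      ':c': customerId,
      ':from': fromIso,
      ':to': toIso,
    },
  };
}
```

If instead you had modeled orders the relational way, in their own table keyed by order ID, the same question would require a Scan or an extra index, which is exactly the trap the article describes.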

Most developers know a lot about classic relational database design but not much about NoSQL. A combination of inexperienced developers, no clear plan for modeling the dataset in DynamoDB, and a hands-off managed service is a recipe for failure.

First Law of DynamoDB

The first law of DynamoDB: assume that implementing it will be harder than using a relational database you are well-versed in. At a small scale, a relational database will accomplish everything you need. It may initially take longer to set up than DynamoDB, but the well-established SQL conventions will save you a lot of time in the long run. This is not to say DynamoDB is awful technology; it is simply new to you.

DynamoDB can be scaled – until it’s not simple

For this article, I interviewed a few happy DynamoDB customers. DynamoDB promises great performance at a nearly infinite scale, limited only by the size of the AWS cloud. These customers stay right in its sweet spot: key-value lookups on well-distributed records, avoiding complicated queries, and limiting hot keys.

Hot keys are a well-known problem, explained in detail in the DynamoDB developer guide. Although DynamoDB can scale indefinitely, your data is not stored on one ever-expanding server. As your data grows, it is divided into chunks, and each chunk lives on a different partition, with the table's capacity split among them.

If you have a hot key in your dataset, you must ensure that the allocated capacity on your table is set high enough to handle all the queries.

With DynamoDB, you can only provision capacity at the level of the entire table, not per partition. That capacity is divided among the partitions by a fairly wonky formula, so the read and write capacity available for any single record shrinks as the table grows. If your application drives too many RCUs at one key, you have three options: over-provision every other partition (expensive), accept throttling errors, or reduce access to that key.
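A rough sketch of that "wonky formula" as the AWS documentation described it around the time of these interviews (partition counts driven by both throughput and storage, with capacity split evenly); treat the constants as period-specific assumptions, since AWS has since changed this behavior with adaptive capacity:

```javascript
// Partitions are the max of what throughput requires (~3000 RCU or
// ~1000 WCU per partition) and what storage requires (~10 GB each).
function estimatePartitions(rcu, wcu, sizeGB) {
  const byThroughput = Math.ceil(rcu / 3000 + wcu / 1000);
  const bySize = Math.ceil(sizeGB / 10);
  return Math.max(byThroughput, bySize, 1);
}

// Provisioned reads are split evenly, so this is all one hot key gets.
function perPartitionRcu(rcu, wcu, sizeGB) {
  return rcu / estimatePartitions(rcu, wcu, sizeGB);
}
```

For example, a 100 GB table provisioned at 6,000 RCU and 1,000 WCU lands on ten partitions, so a single hot key can only consume about 600 RCU even though you are paying for 6,000. That is the over-provisioning trap.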

One thing to note, however, is that DynamoDB is not suited to datasets that mix hot and cold records. At a large scale, though, every dataset becomes such a mixture. You can split the data into separate tables, but then you lose DynamoDB's scalability advantage.

A recently published article, "The Million Dollar Engineering Problem," showed how Segment decreased their AWS bill by fixing DynamoDB over-provisioning. The article included heat-map graphics showing which partitions were troublesome.

The graphics came from AWS's internal tools, and Segment's strategy for blocking the troublesome keys was to wrap DynamoDB calls in a try/catch. Segment had to battle the hot-key problem head-on, and this is where the tension between simplicity and scalability shows.

Designed as a black box, DynamoDB exposes very few user-accessible controls. When you are starting out, this is what makes it easy to use. But at production scale, you need more insight into how your data is misbehaving.

Second Law of DynamoDB

The second law of DynamoDB: at massive scale, DynamoDB's usability is limited by its own simplicity. The problem lies in what AWS has chosen to expose, not in Dynamo's underlying architecture. The inability to back up 100 TB of DynamoDB data was the leading reason Timehop moved off the service altogether.

What if you can’t use DynamoDB?

First, let us look at the advantages and disadvantages of using DynamoDB.

Advantages:

  • Handles huge workloads.
  • Enables you to redesign many applications.
  • Enables you to store state in a K/V table.
  • Enables event-driven architectures that suit your needs.

Disadvantages:

  • Not well suited to small scale, where a relational database serves you better.
  • You can't redesign all applications.

Just because you can use DynamoDB does not mean you should. Using Dynamo without fully understanding it will leave you spinning your wheels through several code rewrites before you land on a solution that works.

Third Law of DynamoDB

The third law of DynamoDB: business value trumps architectural design every time.

This is the reason why the various developers I interviewed abandoned NoSQL to provide solutions for both small and middle-sized businesses. It is also the same reason why Timehop moved from DynamoDB to Aurora. This also explains why DynamoDB has lots of case studies from happy customers globally.

The Introduction of WhynamoDB

Amazon will at some point announce the release of WhynamoDB, along with a decision tree to guide you through the new service.


Practical Tips and Tricks on How to Use Typography in UI Design




Typography is one of the most challenging parts of UI design, despite having existed in various forms since time immemorial. Owing to its long history, there are theories, rules, and practices we must keep up with. This article presents some practical typography tips and tricks to use in your projects.


Practical examples instead of theories

I will not dwell much on the theoretical part of typography though it is a fascinating subject. Instead, I will go directly into practice.

Consider Your Users

You're not designing for high-resolution screens alone but for your users, who are your main focus.

One of the most important aspects for users is the font. Use a flexible font that offers different weights and special symbols, and that catches the eye. Good typography is appealing to the reader, so always pay close attention to these aspects.

If you know what makes letters readable, you will have a better overall understanding of which fonts to use for your UI. Let’s take a closer look at this!


Legibility is one of the most crucial factors. It refers to the ease with which one letter can be distinguished from another in a particular typeface; it is micro-typography, focused on the typeface, letters, and details. Note that not all typefaces were designed with legibility as a core design function. The most common problem is the lack of distinction between an uppercase "I" and a lowercase "l". Avoid such fonts to prevent legibility issues, especially on small displays.


Counters are the white spaces enclosed within letters, for instance in "d", "o", and "u". Typography professionals believe that the more open the counters, the better letter recognition becomes.


95% of the letters people read are lowercase. Typefaces with generous lowercase proportions relative to uppercase are more legible.

Spacing of Letters

There is no definite formula for letter spacing, but in most cases, bigger text sizes need less letter spacing. Always adjust the spacing manually when the type appears too open. In UI design, this applies mostly to headers.


Lighter typefaces are normally more readable than heavier weights. This is related to counters: lighter weights leave character shapes unmodified.

Wide proportion

Proportion refers to the width of a character in relation to its height. A wide letter is easier to recognize compared to a condensed one, and this improves legibility.


Readability is about the overall reading experience and plays a great role in encouraging people to read. You could have very good content in your UI, but the way it is presented affects how people read it. Is the text layout easy to scan? Can the reader differentiate headings, subheadings, paragraphs, and blocks? Macro-typography is about making text appealing enough to encourage reading, through contrast, size, color, and other small details that improve the reading experience.


Though it depends on the project, I suggest tinting your grays with your main color instead of using pure grays.

Width of Text Block

If the text block is too wide, your users will have difficulty finding the next line. If it is too narrow, the reader's eyes will have to jump lines too often, breaking the reading rhythm.

As long as it is not too frequent, jumping to the next line actually energizes our subconscious mind.

Keep your readers energized and engaged by using 50 to 75 characters per line.

Serif vs. Sans Serif

Historically, serifs were considered more legible than sans serifs. They were used in print for a long time and improved the reading experience a great deal: serifs allow the eye to flow more easily over the text. The web and mobile are different, however. Many sans serifs are perfectly readable, and the modern state of visual design prefers simpler letterforms. In fact, sans serifs now dominate on the web, and especially on mobile.

Whether to use a serif depends on your project and on how users read your content. Serifs suit long-form content; for anything else, they may be left out.


No serifs in the Twitter app, but they are appropriate for Medium.

Height of Line

I highly suggest you use the golden ratio for line height.

Generally, to get a good line height, multiply your font size by 1.618.

You can also adjust the height manually if you're more experienced. However, there are some exceptions to this rule.
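The golden-ratio rule is simple enough to express in one line; the rounding to whole pixels is my own choice for clean CSS values, not part of the rule:

```javascript
// Golden-ratio line height: font size multiplied by ~1.618,
// rounded to the nearest whole pixel for use in CSS.
function lineHeight(fontSizePx) {
  return Math.round(fontSizePx * 1.618);
}
```

So 16px body text gets roughly a 26px line height, which you can then nudge up or down for the exceptions mentioned above.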

The picture shows a small difference but it can have a huge impact on readability.

White space

The major role of white space in typography is to limit the amount of text visitors take in at a glance.

The white space makes your design more scannable and reduces congestion with content. It directs the user’s eyes and creates a sense of order, elegance, and complexity.


A separator is a good way to divide your work into sections, and the most popular separator is a simple line. Though subtle, it really enhances readability. An alternative is to use cards, which are very popular right now. They work especially well with unrelated content and certainly improve layout scannability.

Repetition and rhythm

I must confess that this part of UI design really takes time. Done well, however, any repeated element promotes unity in a design. Repetition can apply to positioning, padding, text size, margins, colors, backgrounds, rules, and boxes. Repetition produces rhythm.


Hierarchy dictates the order in which content is consumed. It also guides how to differentiate a header from a sub-header and body text, usually through text sizes, paddings, margins, and contrast, among other tools. It is a technique worth practicing in UI typography to improve readability.


How AI, VR, and Big Data Will Transform the Real Estate Industry by 2020



I first learned of AI through my colleagues. I heard them speaking of a new feature that was about to be rolled out by our company, and throughout the conversation I was blown away: it sounded perfect. There were a lot of details about AI, and it had been in the news already. It made me happy to learn that our company was creating a next-generation, AI-powered rental platform.

Excited about our company's great move, I asked my colleagues to explain every detail of the feature to me. I came away with the following:

  • Rentberry was planning to launch lots of new features.
  • The future of the real estate industry is bright as these features will change how people rent and rent out.
  • Recent advancements in the technical field will save people money, effort and time.

I am the kind of person who is not easily excited by technological advancements. Maybe it is because I fear that one day technology will take over and human beings will be deemed unimportant. But this information didn't scare me at all. To say the least, it gave me hope for the future.

Let’s have a look at the details.


VR and AR Complementing Open House

Most landlords will argue that open houses can't be replaced, but the new technologies are here to complement traditional methods, not replace them. While an open house is still the best way to find out whether a house suits your needs, it has its drawbacks. Attending one can be expensive, time-consuming, and irrational, especially when the property is miles away from where you reside.

How will it benefit you?

You can find yourself in a scenario whereby you’re planning to move to a completely different city and have no place to live. You obviously search rentals online and come across a few listings. Traditionally, you would spend money visiting the new city and attending open houses. However, in the world of AR and VR, all you do is put on a VR headset and get immersed into another reality. While VR serves as an alternative to open house, Augmented Reality can help potential customers visualize homes fully furnished.


Smart Search Tool for Tenants

Moving to a new place can be very hectic, especially when you don’t know the area. You’re not familiar with the traffic patterns or the available eateries, shopping malls, health centers, and so many other things. However, with this new system, algorithms will analyze your habits and preferences and suggest parts of the city that would be ideal for you. In fact, there are algorithms which can analyze your routine and find rental units that are perfect for you.

How Will You Benefit?

If rental search is powered by AI, things get a lot easier on your side. The feature will search through thousands of property listings and provide you with a list of the ones you're most likely to like. The algorithm's suggestions will be based on your workplace, your housing budget, and your preferred activities. Over time, the system will keep track of your preferences, making its suggestions more accurate.


Listings Powered by Big Data

In real estate, it is all about making calculated risks. Success in investment is never a result of sheer luck. When setting prices for properties, an analysis of the market is key. Real estate relies heavily on analysis and the more data you have, the more accurate the predictions.

This is where big data comes in. With big data, it is possible to set a rental price, sell a property, or buy to let while relying on statistical evidence.

Rentberry is working on a feature that will help homeowners and landlords define the true value of their properties by using big data assets. This will help them keep their prices in line with the current market trends.

How Will You Benefit?

Imagine you are new to the real estate industry. You want to rent out a property but have no experience in the market and, to make matters worse, no time to study it. Without big data analysis, you would have to survey similar rental units in the neighborhood to calculate average prices. AI will search the properties for you and suggest the most reasonable price for your property.


The Future is Bright

As someone with a solid background in both IT and the rental business, I can confidently say there is a lot ahead of us. The future of real estate lies in improvements and advancements, in quick, efficient results, and in solving problems with smart algorithms. So if you still say nobody knows about tomorrow, catch up, young scholar.


10 Big Data Trends to watch in 2018



Open Source

The future of big data will be dominated by open-source applications like Apache Spark and Hadoop. Forrester Research claims this tendency is growing at 32.9 percent per year.

The Internet of Things (IoT)

The Internet of Things (IoT) gives access to data from various devices. Unsurprisingly, this year promises to become a revolution in the interconnectedness of smart home technology. With the majority of people in the developed world routinely operating multiple devices, IoT attempts to fill the void between data collection and communication between different sources.

Cloud computing

Businesses across the globe have started using cloud computing for their IT activities, primarily to run their apps. Predictions show that the percentage of companies relying on cloud computing will only continue to increase over the next few years. Big data analytics is another important trend in the IT world, influencing not only how the cloud is used but also addressing growing demand and providing opportunities for innovation.

Machine Learning and AI

These technologies are in the spotlight, and more companies have begun investing in them. Machine learning, a branch of AI, analyzes big data without being explicitly programmed.


Chatbots

The arrival of messaging apps furthered the development of chatbots. A chatbot is a software program that conducts a conversation with a user via text. With the help of chatbots, you can carry out a number of tasks, such as chatting with users, helping with purchases, or pushing social media activities, much like the popular app Crowdfire.

Predictive Analytics

Big data helps businesses predict future behavior. In a few words, predictive analytics gives companies the chance to "know their customers" and, consequently, to maximize profit and choose the right marketing strategy.


Cybersecurity

The last, and arguably most important, field where changes must occur is cybersecurity. Today we live in two worlds, the second of which plays out inside our computers. As our lives become ever more digital, the demand for cybersecurity and policing of the Internet will continue to expand.

As cybersecurity adapts and develops to meet society's needs, its demand, function, and role will be forced to evolve. Anti-piracy companies will help by blocking illegal content, and new technologies will protect our data and computers from attacks and other kinds of damage.

Business Intelligence

Company decision-making will increasingly be based on big data. Even in a small company, ignoring data science leads to regressive management and loss of profit. The use of business intelligence from the cloud will drastically increase, and all decisions about market growth or expansion into new regions and markets will be based on big data.

Big Data Focused Intelligent Apps

Such applications incorporate big data analytics to provide personalization and improved service. AI and machine learning will be implemented in almost every app.

General Data Protection Regulation (GDPR)

The EU's General Data Protection Regulation (GDPR) has already taken effect, prompting many organizations and companies to ensure they comply with the regulation.

When asked about the role of big data in their respective organizations, about 66 percent of respondents described it as either "strategic" or "game-changing." Only 17 percent said big data was still "experimental" at their companies, and another 17 percent described their efforts as "tactical."

These perspectives make it clear that big data is one trend that is not going to fade away anytime soon.



Predictions were given about the future of big data and what companies would face if they see big data as a strategy. Our company, MicroMoney, uses Big Data to provide our customers with the best service, and banks with the most accurate information to serve customers they couldn’t serve before.

Big data and analytics will drive modern business operations and will certainly be reflected in overall performance; corporations will take an encompassing approach to data and analytics.

Organizations will create end-to-end architectures that allow data management and analytics from the center to the edge of the organization, and company executives will make big data and analytics part of their enterprise strategy, permitting big data professionals to take up new roles and drive business growth.
