How To Build A Scalable Web Application


Latest revision as of 20:50, 20 July 2023

Learn about a trending, easy-to-learn tech stack that powers top enterprises and handles massive amounts of data in large-scale, real-time web applications. Almost everybody has used large-scale platforms such as Medium, Netflix, Coinbase, and Uber, and many of us interact with these technologies every day. Many languages, frameworks, and database systems power these platforms. These apps perform flawlessly: millions of people access them daily, yet they serve each customer without friction. The stack behind many of these modern enterprises is Node.js, MongoDB, Redis, and Nginx, and programmers with these skills are in high demand. This article gives a brief overview of how to build a scalable web application with this stack and shows that these technologies are relatively easy to learn.

Estimated traffic handled by these platforms

Some of these platforms handle petabytes of data transfer, especially OTT platforms like Netflix. These applications are real-time and scale to meet demand; a user today will not tolerate even one extra second of load time. They must also be redundant and able to recover from hardware and platform failures. A single user streaming a TV show consumes roughly 1 GB of data per hour, and Netflix has over 200 million subscribers. If 200 million users each stream about 5 hours per week, Netflix serves roughly 200,000,000 users x 5 GB per week x 4 weeks, which is about 4,000,000,000 GB, or 4,000 petabytes, every month. Enormous, isn't it? Hosting such an application on AWS or GCP and paying for that much outbound data transfer would likely cost on the order of 100 million dollars a month. This gives you a rough picture of the scale these platforms operate at and why skills in this stack are in such demand.
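The back-of-the-envelope arithmetic above can be checked in a few lines of JavaScript (the 200 million users, 5 hours per week, and 1 GB per hour figures are the article's own assumptions):

```javascript
// Rough estimate of Netflix-scale monthly data transfer.
// Assumptions (from the article): 200 million users, ~5 streaming hours
// per week, ~1 GB transferred per streaming hour, 4 weeks per month.
function monthlyTransferGB(users, hoursPerWeek, gbPerHour, weeksPerMonth = 4) {
  return users * hoursPerWeek * gbPerHour * weeksPerMonth;
}

const gb = monthlyTransferGB(200_000_000, 5, 1);
const petabytes = gb / 1_000_000; // 1 PB = 1,000,000 GB (decimal units)
console.log(gb, "GB =", petabytes, "PB per month");
// → 4000000000 GB = 4000 PB per month
```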

Consider an application like Coinbase, which has to provide live market data: you do not want to see a price change for your favourite crypto a second late when you have placed an options or futures contract or are making regular buy/sell operations. This is where MongoDB, one of the most rapidly evolving NoSQL databases, comes in: an update made in Australia can be reflected in America in under a second, without the user refreshing the app. That is the power of MongoDB change streams and replica sets. MongoDB was developed in the late 2000s, when the number of database operations and the volume of data per website were growing dramatically. Video and image content was on the rise on platforms like YouTube and Facebook, producing large amounts of unstructured, loosely related data. The media file itself is not stored in the database; rather, the database holds metadata about it: where the file lives on disk, its label, creation and modification dates, its tags, its owner, who it was shared with, the relationship between viewer and owner, any tagged location, and so on. Hundreds of such related and unrelated pieces of information must be handled per item. SQL databases can model this kind of data, but the schema becomes far more complex and the required queries far harder to write. 
With this rising complexity, developers became increasingly concerned about managing large unstructured datasets and the cost of doing so.
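A minimal sketch of the change-stream idea described above, assuming the official `mongodb` npm driver and a running replica set; the connection string, database, and collection names are illustrative only:

```javascript
// Pure helper: summarize an update event from a change stream
// (testable without a database connection).
function describeChange(change) {
  if (change.operationType !== "update") return null;
  const fields = change.updateDescription?.updatedFields ?? {};
  return Object.entries(fields)
    .map(([field, value]) => `${field} -> ${value}`)
    .join(", ");
}

// Watch a hypothetical "prices" collection and react to updates in
// real time. Change streams require a replica set or sharded cluster.
async function watchPrices() {
  const { MongoClient } = require("mongodb"); // npm install mongodb
  const client = new MongoClient(process.env.MONGO_URL);
  await client.connect();
  const prices = client.db("market").collection("prices");
  for await (const change of prices.watch([], { fullDocument: "updateLookup" })) {
    const summary = describeChange(change);
    if (summary) console.log(`price update: ${summary}`);
  }
}

// Only start watching when a connection string is provided, e.g.
//   MONGO_URL="mongodb://localhost:27017/?replicaSet=rs0" node watch.js
if (process.env.MONGO_URL) watchPrices().catch(console.error);
```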

Mapping-based services such as Uber and food-delivery platforms need to update data continuously: a rider's location changes constantly, and even when the vehicle is moving at 50 miles per hour the precise position must keep updating in real time. A REST API and backend powered by Node.js can deliver this kind of performance with less code than many other backend frameworks.

Backed by turbocharged caching abilities

A Redis layer can cache responses generated by MongoDB queries, dramatically improving application performance. An application with a high volume of I/O operations per second (IOPS) will need to cache some of its query responses; otherwise every request means a round trip to the database and extra load on the server. Even a well load-balanced backend wastes work if data queried a few minutes ago is fetched from the database again. For example, the phone number of your taxi driver is effectively static data: it will not change within a few minutes or during an existing ride. You called the driver when the taxi was booked, and four or five minutes later you want to call again to ask where they are. That is the same query, and if the result is cached in Redis there is no unnecessary load on the database.
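The pattern described above is commonly called cache-aside. Here is a sketch with a plain `Map` standing in for Redis so the logic is self-contained; with the real `redis` npm client you would use `GET` and `SET` with an `EX` expiry instead, and `queryDb` would be an actual MongoDB lookup:

```javascript
// Cache-aside: check the cache first; on a miss, run the database query,
// store the result with a time-to-live, then return it.
const cache = new Map(); // key -> { value, expiresAt }

async function cachedLookup(key, ttlMs, queryDb) {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
  const value = await queryDb(key); // e.g. a MongoDB findOne()
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}
```

Looking up the driver's phone number twice within the TTL, as in the taxi example, then hits the database only once.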

Node.js as a real-time, scalable video communications server

WebRTC (Web Real-Time Communication), developed to support peer-to-peer video communication over the internet, is straightforward to implement with Node.js given the ecosystem's strong WebRTC support. With WebSockets in Node.js you can build a platform supporting multiple simultaneous video calls in just a few hours. People can meet over video regardless of whether they are on Android, iOS, or a Windows PC, and you can build an optimised server for these needs in Node.js with far less code than in many other frameworks.
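In WebRTC the peers exchange media directly; the Node.js server's job is signaling, i.e. relaying session offers, answers, and ICE candidates between peers in the same room. A sketch follows; the `ws` npm package and the `{ type, room, from, payload }` message shape are assumptions of this example, not part of the WebRTC spec:

```javascript
// Pure routing helper: everyone in the room except the sender should
// receive a relayed signaling message (offer, answer, or ICE candidate).
function recipients(rooms, roomId, senderId) {
  return [...(rooms.get(roomId) ?? [])].filter((id) => id !== senderId);
}

// Sketch of the signaling server itself.
function startSignalingServer(port) {
  const { WebSocketServer } = require("ws"); // npm install ws
  const wss = new WebSocketServer({ port });
  const rooms = new Map();   // roomId -> Set of peer ids
  const sockets = new Map(); // peer id -> WebSocket

  wss.on("connection", (socket) => {
    socket.on("message", (raw) => {
      const msg = JSON.parse(raw); // { type, room, from, payload }
      if (msg.type === "join") {
        sockets.set(msg.from, socket);
        if (!rooms.has(msg.room)) rooms.set(msg.room, new Set());
        rooms.get(msg.room).add(msg.from);
      } else {
        // Relay offers/answers/ICE candidates to the other peers.
        for (const id of recipients(rooms, msg.room, msg.from)) {
          sockets.get(id)?.send(JSON.stringify(msg));
        }
      }
    });
  });
  return wss;
}
```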

The harmony in synchronization

You can launch a cluster of Node.js instances for your server, and Node.js can handle large numbers of concurrent requests. You can combine a cluster of Node.js instances with Nginx as a load balancer and a MongoDB replica set. Consider a database in Miami, Florida, with replicas in Mumbai, Singapore, and Australia: with change streams on the replica set, a write made against one region is propagated to the others. On a 16-core, 32-thread CPU, Nginx can balance incoming requests across 32 server instances; with 32 logical cores you can launch one Node.js instance per core, matching the concurrency of the incoming traffic.

A promising technology for upcoming needs

Because of the performance this combination of four technologies delivers, it is among the most in-demand skill sets for a programmer, comparable to the demand for skills such as AI, ML, and deep learning. And the learning curve is not steep at all: one can easily start a programming career with this stack, and the time and effort needed to learn it can be less than for a traditional first programming language.

