Data deduplication is a technique for solving a common problem in distributed software architecture, where a specific data record is unintentionally duplicated in a distributed, non-locking table such as DynamoDB.
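As a minimal sketch of one common deduplication pattern, a conditional write lets DynamoDB atomically reject a record whose key already exists. The table name `Events`, key `record_id`, and the `put_once` helper below are hypothetical names for illustration:

```python
import boto3
from botocore.exceptions import ClientError

# Hypothetical table: "Events" with partition key "record_id".
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Events")

def put_once(record_id, payload):
    """Write a record only if no record with the same key exists yet."""
    try:
        table.put_item(
            Item={"record_id": record_id, "payload": payload},
            # The conditional write is what deduplicates: DynamoDB rejects
            # the put atomically if the key is already present.
            ConditionExpression="attribute_not_exists(record_id)",
        )
        return True   # first writer wins
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return False  # duplicate detected, record already stored
        raise
```

Because the check-and-write happens server-side in one step, concurrent writers cannot both succeed, which is exactly what a non-locking table otherwise cannot guarantee.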

The stateless function-as-a-service is a compelling architecture for the future of the software development industry. Well-architected, it…

In Design Thinking, testing your solution with real users is the only way to evaluate your idea. If your system cannot serve different users different content, it is not ready for A/B testing.

From Wikipedia — “A/B testing is a way to compare two versions of a single variable, typically by testing a subject’s response to variant A against variant B, and determining which of the two variants is more effective.”
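A minimal sketch of how a system might serve different users different content: hash each user into a stable bucket per experiment, so the same person always sees the same variant. The `assign_variant` helper and the experiment name are hypothetical:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into variant A or B."""
    # Hashing (experiment, user_id) gives every user a stable bucket
    # per experiment, with no state to store.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to [0, 1]
    return "A" if bucket < split else "B"

# Example: a 50/50 split for a hypothetical "checkout-button" experiment.
print(assign_variant("user-42", "checkout-button"))
```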

At a startup, you are running an e-commerce website that sells clothes…

The random number generator (RNG) is the core component used to generate prime numbers in OpenSSL. Understanding how it works in a real-life implementation is very important from a security standpoint.
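OpenSSL's internals are beyond a short snippet, but the security-relevant distinction carries over to any language. A minimal Python sketch contrasting a seeded, predictable PRNG with the OS-backed CSPRNG (the same kind of entropy source OpenSSL's RAND seeds from):

```python
import random
import secrets

# random.Random is a Mersenne Twister: fast, but fully reproducible
# from its seed, so it must never be used for keys or tokens.
weak = random.Random(1234)
print(weak.getrandbits(128))  # anyone who knows the seed gets the same value

# secrets draws from the OS CSPRNG, which is what security code should use.
print(secrets.randbits(128))  # unpredictable
print(secrets.token_hex(16))  # 16 random bytes as hex, e.g. a session token
```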

In computer security, crypto libraries are widely used at every level, from password protection algorithms to encrypted communication channels and data encryption methods. Understanding the weaknesses of algorithms is important for choosing the right crypto configuration when developing software or building a network system. In asymmetric cryptography…
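As an illustration of asymmetric cryptography and why configuration matters, here is a minimal sketch using the `cryptography` Python package: RSA-2048 with OAEP padding, a common modern baseline (short moduli and the legacy PKCS#1 v1.5 padding are examples of the weak configurations to avoid):

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate an RSA key pair; 2048 bits is a common modern baseline.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone can encrypt with the public key...
ciphertext = public_key.encrypt(b"secret message", oaep)

# ...but only the private-key holder can decrypt.
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"secret message"
```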

A wicked problem changes its characteristics as you try to eliminate them; this is the causal effect. The more precisely you define the characteristics, the less impact there is on the derived problem. Understanding a wicked problem is therefore important for reducing that causal effect.

Horst Rittel (1930–1990), a design theorist and university professor, first coined the term "wicked problem" in 'Dilemmas in a General Theory of Planning' (1973). In the paper, Rittel details ten characteristics that describe a wicked problem. These are the ten characteristics of wicked problems defined in that paper:

Scalability and elasticity are among the five essential characteristics of cloud computing. The National Institute of Standards and Technology (NIST) defines cloud computing as it is known today through five particular characteristics: on-demand self-service, broad network access, multi-tenancy and resource pooling, rapid elasticity and scalability, and measured service.

What is Scalability?

In Data Science, a production-ready model is remarkable work. Suppose you want to sell your model to a financial business or a promising startup through an API subscription: AWS Lambda and API Gateway are a great choice. Deploying your model into production and scaling it out to thousands upon thousands of users is nearly effortless on a serverless architecture. Unlike deploying on an EC2 instance, it costs you nothing when there is no usage.
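A minimal sketch of such a Lambda handler behind API Gateway; the model file name, the payload shape, and bundling the model with joblib are assumptions for illustration:

```python
import json
import joblib

# Hypothetical setup: a scikit-learn model serialized as model.joblib
# and bundled with the deployment package (or a Lambda layer).
# Loading at module scope means warm invocations reuse the model.
model = joblib.load("model.joblib")

def handler(event, context):
    """API Gateway -> Lambda entry point for predictions."""
    body = json.loads(event["body"])  # e.g. {"features": [1.0, 2.5, 0.3]}
    prediction = model.predict([body["features"]])
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": prediction.tolist()}),
    }
```

Since Lambda bills per invocation, this API genuinely costs nothing while idle, which is the pay-per-use point made above.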

There are three popular Python frameworks: SciKit-Learn, TensorFlow, and PyTorch. Today, TensorFlow and PyTorch are widely used for Deep Learning, while SciKit-Learn is mostly used for classical Machine Learning. In March 2019, Google released a TensorFlow Lite version (TFLite). The lite version is intended for smart…
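A minimal sketch of converting a trained Keras model to the TFLite flat-buffer format with the TF 2.x converter API; the toy model below is just a placeholder for whatever you have trained:

```python
import tensorflow as tf

# Placeholder: any trained Keras model can stand in here.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert to the compact TFLite format for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional quantization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```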

CuongQuay

(MSc) Cloud Security | Simplify Framework Creator | FinTech CTO | Technical Startup Booster (Consultant).
