My Transformative Journey through a 3-Month #devops Course


Embarking on a three-month DevOps course has been an enlightening and transformative experience for me. While I have successfully completed the course and actively participated in a few projects, I firmly believe there is still a vast amount to learn and explore in the world of #devops. At its core, DevOps is built upon the principles of continuous integration and continuous delivery, with a crucial emphasis on continuous learning. With a multitude of tools and technologies available, it becomes essential to understand the “why” and “what” behind their implementation. In this blog post, I will share my journey and the key tools and concepts I encountered during the course.

Exploring the Tools and Technologies:

Throughout my DevOps course, I delved into a variety of tools and technologies that form the backbone of this field. Here are some notable ones:

RHEL / Bash Scripting (Operating System):

Understanding the fundamentals of operating systems, particularly Red Hat Enterprise Linux (RHEL), and acquiring proficiency in Bash scripting provided a strong foundation for my DevOps knowledge. These skills are indispensable for effectively managing and automating tasks within the DevOps workflow.
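To give a flavor of the kind of task automation Bash enables, here is a small hypothetical sketch (the threshold and report filename are illustrative, not from the course): a script that flags filesystems above a usage limit.

```shell
#!/usr/bin/env bash
# Hypothetical example: report any mounted filesystem above a usage threshold.
set -euo pipefail

THRESHOLD=80                 # illustrative limit, in percent
REPORT_FILE="disk_report.txt"

# df -P gives stable POSIX output; skip the header row,
# strip the "%" sign, and compare each filesystem's usage to the limit.
df -P | awk -v limit="$THRESHOLD" 'NR > 1 {
    gsub("%", "", $5)
    if ($5 + 0 >= limit) print $6 " is at " $5 "%"
}' > "$REPORT_FILE"

cat "$REPORT_FILE"
```

Scripts like this are the building blocks that later get wrapped into cron jobs or configuration-management tasks.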

Ansible (Configuration Management Tool):

As a powerful configuration management tool, Ansible enabled me to automate the provisioning, configuration, and deployment of infrastructure. Its simplicity and agentless architecture made it an ideal choice for managing large-scale environments efficiently.

AWS (Cloud Computing):

Cloud computing lies at the heart of modern IT infrastructure, and Amazon Web Services (AWS) is a leading provider in this domain. Through hands-on experience with AWS, I gained insights into deploying, scaling, and managing applications in the cloud.

Terraform (Infrastructure as Code):

Infrastructure as code revolutionizes infrastructure management, and Terraform emerged as a widely adopted tool in this space. I learned how to define and deploy infrastructure resources programmatically, ensuring consistency and scalability.

Jenkins (Continuous Integration Tool):

Continuous integration is a crucial practice in DevOps, and Jenkins played a vital role in automating build, test, and deployment processes. By utilizing Jenkins, I experienced the benefits of collaborative development and reduced manual efforts.

Prometheus & Grafana (Monitoring Tools):

Effective monitoring is essential for maintaining the health and performance of applications and infrastructure. Tools like Prometheus and Grafana provided me with valuable insights into monitoring and visualization techniques, enabling me to identify bottlenecks and optimize performance.

ELK (Log Aggregation):

Log aggregation plays a significant role in troubleshooting and analyzing system behavior. Through the ELK stack (Elasticsearch, Logstash, and Kibana), I learned how to centralize, index, and visualize logs effectively, improving observability within the DevOps workflow.

Azure DevOps (SaaS Platform):

Azure DevOps, a comprehensive software-as-a-service (SaaS) platform by Microsoft, offers an end-to-end toolchain for developing and deploying software. Exploring Azure DevOps broadened my understanding of the DevOps lifecycle and its seamless integration with various tools and services.

Deployment Strategies:

Understanding different deployment strategies, such as blue-green deployments, became crucial for ensuring reliable and resilient applications. During the course, I learned about these strategies and their implementation to minimize downtime and ensure smooth releases.
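The blue-green idea can be sketched locally with nothing more than a symlink flip. This is a toy illustration under assumed directory names, not a production setup:

```shell
# Toy blue-green deployment: two release directories and a "live" symlink.
# Whatever serves traffic always follows "live"; switching releases is a
# single atomic symlink replacement, and rollback is the same flip in reverse.
mkdir -p blue green
echo "v1" > blue/index.html
echo "v2" > green/index.html

ln -sfn blue live            # blue is live
cat live/index.html          # prints v1

ln -sfn green live           # flip to green: instant cut-over
cat live/index.html          # prints v2

ln -sfn blue live            # rollback
```

Real blue-green setups do the same switch at the load-balancer or DNS level, which is what keeps downtime near zero.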

Docker & Kubernetes (Containerization & Orchestration):

Containers and orchestration are essential aspects of modern application development and deployment. Through Docker and Kubernetes, I gained hands-on experience in building, packaging, and deploying applications within a containerized environment, while effectively managing their orchestration.
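As a rough sketch of that workflow, assuming a hypothetical image name and a deployment manifest you have written separately:

```shell
# Build an image from the Dockerfile in the current directory (hypothetical tag).
docker build -t myapp:1.0 .

# Run it locally, mapping container port 8080 to the host.
docker run -d -p 8080:8080 --name myapp myapp:1.0

# Hand the same image to Kubernetes: apply a deployment manifest,
# then watch the pods come up.
kubectl apply -f deployment.yaml
kubectl get pods -w
```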

Embracing Agile Principles:

In addition to the vast array of tools and technologies, my DevOps course also emphasized the importance of Agile principles. Concepts such as sprint planning, daily stand-ups, and retrospectives showed me how iterative, collaborative development complements the DevOps workflow.

What is Azure Key Vault and how to configure it with a .NET Core application

What is Azure Key Vault?

Azure Key Vault is a cloud-based service offered by Microsoft Azure that provides secure storage for keys, secrets, and certificates. It allows users to create and manage cryptographic keys and secrets used by cloud applications and services. Azure Key Vault enables users to store sensitive information such as passwords, connection strings, API keys, and certificates in a secure manner.

Azure Key Vault uses hardware security modules (HSMs) to provide enhanced security for cryptographic keys and secrets. HSMs are physical devices that are designed to securely store and manage cryptographic keys. Azure Key Vault also supports multi-factor authentication and access control policies to ensure that only authorized users and applications can access sensitive information stored in the Key Vault.

Azure Key Vault can be used to store a wide range of keys and secrets, including SSL/TLS certificates, cryptographic keys, passwords, and API keys. It can be accessed through a REST API or using SDKs for various programming languages. Azure Key Vault can also be integrated with other Azure services such as Azure Active Directory, Azure Functions, and Azure DevOps.
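For example, reading a secret over the REST API looks roughly like this; the vault and secret names are placeholders, and $ACCESS_TOKEN is assumed to hold a valid Azure AD bearer token:

```shell
# Fetch the current version of a secret from Key Vault via the REST API.
# <your-key-vault-name> and <secret-name> are placeholders.
curl -s \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  "https://<your-key-vault-name>.vault.azure.net/secrets/<secret-name>?api-version=7.4"
```

The response is a JSON document whose `value` field contains the secret; in practice the SDKs shown later handle this call and the token acquisition for you.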

Configure Azure Key Vault with a .NET Core 6 Application:

Step 1: Create an Azure Key Vault

First, you need to create an Azure Key Vault in your Azure subscription. Here are the steps:

  1. Log in to the Azure Portal and go to the Azure Key Vault page.
  2. Click the “+ Add” button to create a new Key Vault.
  3. Fill in the required information, such as the name, subscription, resource group, and region.
  4. Set the access policies for your Key Vault. This is where you define who has access to your Key Vault and what they can do with it. For example, you might want to allow a specific Azure AD user or group to access your Key Vault.
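If you prefer the command line, the same steps can be done with the Azure CLI; the names and region below are placeholders:

```shell
# Create a resource group and a Key Vault in it (placeholder names/region).
az group create --name my-rg --location eastus
az keyvault create --name my-keyvault --resource-group my-rg --location eastus

# Store a test secret so the application has something to read.
az keyvault secret set --vault-name my-keyvault --name "DbPassword" --value "s3cret"
```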

Step 2: Create a .NET Core 6 application

Next, you need to create a .NET Core 6 application. Here are the steps:

  1. Open Visual Studio 2022 or higher and create a new .NET Core 6 Console Application project.
  2. Install the Azure.Extensions.AspNetCore.Configuration.Secrets package by running the following command in the Package Manager Console:

Install-Package Azure.Extensions.AspNetCore.Configuration.Secrets
  3. Modify the Program.cs file to load configuration settings from the Azure Key Vault:

using Azure.Extensions.AspNetCore.Configuration.Secrets;
using Azure.Identity;
using Microsoft.Extensions.Configuration;

var builder = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .AddUserSecrets<Program>()
    .AddEnvironmentVariables();

builder.AddAzureKeyVault(
    new Uri("https://<your-key-vault-name>.vault.azure.net/"),
    new DefaultAzureCredential());

var configuration = builder.Build();

// Use configuration values here

Note: Replace <your-key-vault-name> with the name of your Azure Key Vault.

Step 3: Grant access to the application

Finally, you need to grant access to the .NET Core 6 application to access the Azure Key Vault. Here are the steps:

  1. Go to the Access policies page of your Azure Key Vault.
  2. Click the “+ Add Access Policy” button to add a new access policy.
  3. Select the principal type. This is the identity that you want to grant access to.
  4. Select the permissions that you want to grant. For example, you might want to grant “Get” and “List” permissions to allow the application to read secrets from the Key Vault.
  5. Click the “Add” button to add the access policy.
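The same access policy can be granted from the Azure CLI; the vault name and service principal ID below are placeholders:

```shell
# Grant an application (identified by its service principal's appId)
# permission to read and list secrets. <app-id> is a placeholder.
az keyvault set-policy \
  --name my-keyvault \
  --spn <app-id> \
  --secret-permissions get list
```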

That’s it! Your .NET Core 6 application should now be able to read configuration settings from the Azure Key Vault.

Azure Key Vault Troubleshooting:

There could be a few different reasons why your Azure Key Vault connection is randomly failing:

  1. Ensure that the Azure Key Vault access policies are correctly configured to allow access from your application. Make sure that the application’s identity (e.g. managed identity or service principal) is included in the access policies with the appropriate permissions.
  2. Check your application’s code to ensure that it is handling authentication and authorization correctly. Make sure that it is using the appropriate credentials to authenticate with Azure AD and obtain a token to access the Key Vault.
  3. Check your network connection to ensure that there are no intermittent issues that could be causing the connection to fail. For example, if you are using a VPN or a firewall, make sure that the appropriate ports are open and that there are no network connectivity issues.
  4. Check the Azure Key Vault logs to see if there are any errors or warnings that could provide more information about the issue. You can use the Azure Portal or the Azure CLI to view the logs.
  5. If the issue persists, consider contacting Microsoft support for further assistance. They can help you troubleshoot the issue and identify the root cause.

Special thanks to my friend.


How to set up nginx on AWS and pull static code from GitHub

This article describes the quick steps to configure a static website on the nginx stack running on an AWS EC2 instance.

To learn more about nginx, please visit the official nginx documentation.

Select the Amazon Linux AMI for the image.

I am using t2.micro for testing purposes.

To save money, I have selected a spot instance.

You can create the EC2 instance without a key pair and continue with the default user ID and password.

You need to wait a while until your instance is fully initialized.

Once the EC2 instance is ready, connect using the browser-based SSH client; you can also use a tool like PuTTY.

Run the following command to install nginx on the server:

sudo amazon-linux-extras install nginx1

To check the status of nginx:

sudo systemctl status nginx.service

To enable the nginx service:

sudo systemctl enable nginx.service

To start the service:

sudo systemctl start nginx.service

Make sure the status is running.

Browse to the public IP address and make sure the nginx welcome screen is visible.

Now it's time to pull your code.

Usually the default webpage is served from the following location, and in this test I will use the same location, replacing its contents with my sample source code (default location: /usr/share/nginx/html).

First, I will remove the default webpage:

cd /usr/share/nginx/html
sudo rm -rf ./*

To confirm, try to access the site again at the same public IP address; you will see that the default page is gone.

For testing purposes, I am using a Microsoft demo HTML page.

Install the package needed to run the git command on Linux:

sudo yum install git

Navigate to /usr/share/nginx/html and clone the code from Git (replace <repository-url> with your repository's URL):

sudo git clone <repository-url>

After the clone, you can run ls to make sure the code was pulled correctly.

sudo mv project-html-website/* .

Run the above command to move the files to the correct location.

Once the move is done, you can access the new website from the public IP address.

Now you can point your DNS A record to the public IP address.
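The steps above can be collected into a single script, for example to use as EC2 user data. The repository URL is a placeholder, since this walkthrough uses a demo HTML project:

```shell
#!/usr/bin/env bash
# Consolidated setup: install nginx on Amazon Linux 2, enable and start it,
# then replace the default site with code pulled from GitHub.
# <repository-url> is a placeholder for your Git repository.
set -euo pipefail

sudo amazon-linux-extras install -y nginx1
sudo systemctl enable nginx.service
sudo systemctl start nginx.service

sudo yum install -y git

cd /usr/share/nginx/html
sudo rm -rf ./*
sudo git clone <repository-url> project-html-website
sudo mv project-html-website/* .
```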


What is JOUD Cloud & Its Services

What is JOUD Cloud :-

Cloud computing is the delivery of computing services – compute power, storage, applications, and other IT resources (“Cloud Services”) – on demand over the internet (“the Cloud”). Companies offering cloud services are called cloud service providers, and they charge for these services based on usage, similar to how you are billed for electricity and water at home.

Cloud computing continues to disrupt the traditional work of IT and transform the way companies of all verticals and sizes do business. Moving to the cloud provides companies with more agility, elasticity, and cost savings, along with the ability to expand their business footprint and develop a global reach towards their customers.

What Services JOUD Cloud Offers :-

JOUD Cloud Services is a secure cloud services platform with a broad set of services such as computing power, storage, business continuity and information security to help businesses scale and grow, all delivered as a utility: on-demand, available in minutes, with a pay-as-you-go pricing model.

With JOUD Cloud Services, you don’t need to make large upfront investments in hardware and software licensing to run your business applications, nor go through the hassle of spending a lot of time managing them. Instead, you can provision and access exactly the right type and size of computing resources you need, almost instantly, and pay only for what you use.

Information Management :- 

  • Compute and Storage
  • Backup and Recovery
  • Disaster Recovery
  • Enterprise File Sharing

Information Security :- 

  • Firewall
  • End Point Protection
  • Email Security
  • WAF
  • Advanced Threat Protection
  • SIEM

Partnered with :-

  • Fortinet
  • Nutanix
  • Veeam
  • Rubrik
  • Carbonite





Azure Slack Channels



Always a great way for open discussions and sharing information!

Ask an Azure Architect Slack –

“We created this slack channel in an effort to foster a community, old and new, of users, customers, partners and employees who are using Azure. Here we share best practices, help work through designs and questions, share relevant content and help each other out.”

Azured Slack –

“For Azure flavoured chat, help, support and occasional banter”

Azure Stack Slack –

“Azurestackblog is a slack team for the Azure Stack community”

Build Azure Slack –

“A global channel to discuss all things Azure related”

Azure DevOps Club –

“A community of friendly professionals talking about all things Azure Devops (formerly known as VSTS and TFS)”

Azure Developers Slack –

“Slack channel for Azure Developers”

PowerShell Slack –

“Slack for all things PowerShell”

Have I missed an active Azure related Slack channel? Give me a shout on Twitter and I will add the channel to this list!