
Posts

Showing posts from 2017

Why you should never put a junior in a team that is stressed and bad-mannered

It is nothing abnormal for a team to be stressed. During such times they might not have time for anything else; the focus of the team is to deliver the business requirements in the IT solution. Unfortunately, these periods can last longer than expected, even if we know that this is not good.
I have seen teams spend more than 18 months in a phase like this. After a while in this state, they don't even realize that they are in it, or what the impact is at project and people level.
In this post, I will try to focus on the impact that such a phase can have on juniors and mid-levels in unhealthy teams.

Why?
When you are a junior you are at a moment of your career when you want to learn. You know that you have things to learn; you are usually a fresh graduate with good theoretical knowledge and you want to put it into practice.
I like to compare smart juniors with birds that have big and powerful wings but don't yet know how to fly very well. They can reach the sky and accomplish many thi…

Azure Blob Storage - More storage and throughput

One of the core services of Microsoft Azure is Azure Storage, which is used to store binary content, key-value pairs (Azure Tables) or message queues (Azure Queues). In today's post, we will discover how a small change in Azure Storage capabilities changes our life and simplifies our IT solutions.

Current solutions
The maximum capacity of an Azure Blob Storage account used to be 500TB. Even if this might sound like a lot, there are multiple cases when you have to work around this limit. If you have a system where devices and users are uploading content, then you can easily reach 2-5TB per day, which would force you to start a new Azure Storage account every 3 months.
To overcome this limitation, your solution needs to be able to manage Azure Storage accounts automatically. Besides being able to clean and archive content automatically, you will need a system that can create a storage account on the fly and redirect traffic to it. When you use multiple Storage Accounts, you are force…
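As a rough sketch of the "create a storage account on the fly" part, this is how it could look with the Azure Management Libraries for .NET (Fluent packages). The credentials, account name and resource group below are placeholders, and the naming/rotation logic and error handling are left out.

using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
using Microsoft.Azure.Management.Storage.Fluent;

class StorageAccountFactory
{
    static void Main()
    {
        // Authenticate with a service principal (placeholder values).
        var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(
            "<clientId>", "<clientSecret>", "<tenantId>", AzureEnvironment.AzureGlobalCloud);
        IAzure azure = Azure.Authenticate(credentials).WithDefaultSubscription();

        // Create a new storage account on the fly once the current one gets close to its limits.
        IStorageAccount account = azure.StorageAccounts
            .Define("telemetrystore042")               // placeholder account name
            .WithRegion(Region.EuropeWest)
            .WithExistingResourceGroup("telemetry-rg") // placeholder resource group
            .Create();

        // The keys of the new account are then handed to the component that
        // redirects upload traffic to it.
        var keys = account.GetKeys();
    }
}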

Less than 1 week until Microsoft Ignite 2017

The first time I took part in Microsoft TechEd was in 2012 in Amsterdam. It was one of my first conferences with more than 5k attendees. It was a wow, from all perspectives. Since then I have participated in each TechEd and Microsoft Ignite.
At Microsoft Ignite, attendees have the opportunity not only to learn and discover new stuff, but also to meet people from all around the globe. It is that week in the year when you can meet face to face Program Managers from Microsoft, together with people from Japan, Australia, the UK and the USA that you usually only talk to over Twitter, all in one place.

This year things will be a little different. It will be the first time I participate at Microsoft Ignite not only as an attendee, but also as a speaker. It is a joy to be invited to speak at a conference with more than 23,000 attendees. If this is not enough, I will have 3 sessions where I will share my knowledge and experience related to IoT, security and NoSQL. If you want to find out more about these subjects feel free to join my sessions …

Is security and data privacy important on tracker devices like Fitbit?

A few days ago, I read about how insecure Fitbit devices are. There was a lot of noise created around it, explaining different ways you can hack a Fitbit device to gain access to personal data. My first reaction when I saw the title of the article was “So what!?”, and let me explain why I don’t see this as life-threatening or as something that will stop me from using my Fitbit.

Personal data
It is true that a tracker contains personal data, but let us be realistic and look at what data it has. Most trackers contain information related to your past activity, heart rate, number of steps and in some cases GPS information.

Except for the GPS information, the rest of the data is not so sensitive. What do you think a hacker can do if he knows that you did 10k steps this morning? Yes, he might learn your habits and break into your house while you are jogging or walking the dog. This scenario can be real, but the truth is that there are so many other ways to find out what your habits are that you would be impress…

The scope of a PoC

Let us talk about what the scope of a PoC should be and what you should or should not have in a PoC.

Purpose of PoC
First, we need to define what the purpose of a PoC is. The main purpose is to demonstrate the principles that are covered in the technical documents (that it is not just theory and diagrams).

Reusability
It is already a déjà vu for me to hear people say that they want to reuse the PoC output in the main project. This happens because many times the PoC scope is too big and does not cover only the ideas that need to be demonstrated.
When you have a PoC that covers more than 15% of the implementation effort, you might have a problem. That is not a PoC anymore, it is a PILOT, which represents a system with limited functionality that goes into production. The pilot might have many restrictions, from NFRs to the business use cases that are covered, but it has some part that works.
You never want to invest in a PoC more than is necessary and you should always push the ou…

Containerization without a microservices approach

The current trends are clear: we should develop software applications using only a microservices approach. This sounds good for new applications, where the system requirements guide us towards microservices.
But what happens with other types of systems? We might need to develop a normal web application, with some backend processing behind it. No crazy NFRs, no need to scale to 100,000 RPS or similar stuff.

Monolithic application
As an example, let us imagine that we need to develop a web application that resizes our pictures to Instagram size (1x1). There are no special requirements related to availability or scalability, and the load on the system is low. The system is used just by our employees (fewer than 5,000) for company images that need to be published on commercial web sites.
Of course, we can imagine a state-of-the-art microservices implementation, with different services that scale by themselves. What if we do not need something like this, but it is very appealing for us…

List of IPs used by each Azure Resource (service)

It is not uncommon to configure the firewall and other security and control mechanisms like User Defined Routes (UDR) and NSGs (Network Security Groups) to restrict access to your Azure Resources. The moment we want to do such a thing, we need to know the IPs that are used by the Azure infrastructure.

Let’s take as an example a web application that is hosted inside App Service (using VNETs, Traffic Manager, Azure Storage, Azure SQL and many more). To be able to properly configure the access rules, we need to know the IPs used by Azure Storage and Azure SQL in that region, the Traffic Manager IPs used for probing and so on.

Azure Region IP Range
Most of this information can be found in an XML file provided by Microsoft (https://www.microsoft.com/en-us/download/details.aspx?id=41653), but I expect that this will not be enough. You’ll find inside the document the IP ranges that are used by each Azure Region, but without a tag that specifies what IP ranges are used by each Azure Resource it is to…
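To give an idea of how this file can be consumed, here is a minimal sketch that extracts the IP ranges of a single region with LINQ to XML. The element and attribute names (Region, IpRange, Subnet) reflect the format of the downloaded file at the time of writing and should be double-checked against the version you download; the file name and region name are placeholders.

using System;
using System.Linq;
using System.Xml.Linq;

class AzureIpRanges
{
    static void Main()
    {
        XDocument doc = XDocument.Load("PublicIPs.xml");

        var westEuropeRanges = doc.Descendants("Region")
            .Where(r => (string)r.Attribute("Name") == "europewest")
            .Descendants("IpRange")
            .Select(ip => (string)ip.Attribute("Subnet"))
            .ToList();

        // These subnets can feed your firewall/NSG rules, but remember that the
        // file does not say which Azure service (Storage, SQL, ...) uses which range.
        westEuropeRanges.ForEach(Console.WriteLine);
    }
}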

Is RDP connection open by default for VMs inside Azure?

I saw on Twitter a discussion related to Azure VMs and RDP connections that are open by default. The main purpose of this post is to present the different use cases in which the RDP connection is (or is not) available by default.
Use Case 1: Single VM (VM with Public IP inside a default VNET) – RDP active by default for public access
In this context, we have a VM that is created from the Azure Portal (or a script) as a single entity. It is not part of any scale set or other type of custom configuration. It is just a simple Windows Server 2016 Datacenter machine, which is part of a default VNET with a Public IP allocated to it. In this case, RDP will be configured by default. The default Network Security Group (NSG) that is created together with our VM will allow RDP connections to the machine. The default VNET allows RDP connections to our VM because there are no custom NSG rules to restrict it and we have a Public IP attached to our VM.
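If you do not want RDP to be reachable from the internet in this use case, a deny rule can also be put in an NSG from code. Below is a minimal sketch using the Azure Management Libraries for .NET (Fluent); the credentials, resource group and NSG name are placeholders, and the resulting NSG still has to be associated with the VM's network interface or subnet.

using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.Network.Fluent;
using Microsoft.Azure.Management.Network.Fluent.Models;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;

class BlockRdpSample
{
    static void Main()
    {
        var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(
            "<clientId>", "<clientSecret>", "<tenantId>", AzureEnvironment.AzureGlobalCloud);
        IAzure azure = Azure.Authenticate(credentials).WithDefaultSubscription();

        // NSG with a single rule that denies inbound RDP (TCP 3389) from any address.
        INetworkSecurityGroup nsg = azure.NetworkSecurityGroups
            .Define("deny-rdp-nsg")                 // placeholder NSG name
            .WithRegion(Region.EuropeWest)
            .WithExistingResourceGroup("my-rg")     // placeholder resource group
            .DefineRule("Deny-RDP-Internet")
                .DenyInbound()
                .FromAnyAddress()
                .FromAnyPort()
                .ToAnyAddress()
                .ToPort(3389)
                .WithProtocol(SecurityRuleProtocol.Tcp)
                .WithPriority(100)
                .Attach()
            .Create();
    }
}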


Use Case 2: Single VM (VM without Public IP inside a default VNET…

Configure Traffic Manager and Web Apps over SSL (HTTPS) using custom domain

In this post we will cover what we should do when we want to:
Configure Azure Traffic Manager on top of Web Applications hosted inside App Services
Over HTTPS
With custom domain
Client certificates

Context
When we are using HTTPS in combination with App Services, everything goes smoothly. You just need to activate HTTPS and upload the certificate, if you want to use a custom one.
Things are a little different when you want to configure HTTPS on top of Traffic Manager. In theory, the steps are clear and it should work as expected, but combine this with a custom domain and client certificates and things can end up with a 404 error code.

Initial Setup
Prerequisites: the Web Apps are configured over SSL using a custom domain and work as expected.
Let’s take a look at the base steps that need to be done when you want such a configuration:

Create an instance of Traffic Manager inside the Azure Portal and add your Web Apps that you already configured for HTTPS
Add your custom domain to your DNS Record – thi…

Azure Storage - Archive Tier | Perfect solution for store audit data

There are two new features of Azure Storage that will make the end of 2017 very interesting.

New Archiving Tier
A new archive tier is now available for blob storage. In addition to the Cool and Hot access tiers we now have the Archive tier. In contrast to the existing ones, it was designed for situations when you need to archive data for long periods.
An interesting fact, in comparison with Cool storage, is related to the SLA. The availability SLA is the same as for Cool storage – 99% – in the context of a tier that is as secure and durable as Cool storage, but much more cost-efficient. From what I see now, it is more than 5 times cheaper than Cool storage.

The new tier goes hand-in-hand with the current trend of moving existing infrastructures to Azure. In many situations, because of regulatory aspects, you need an archiving solution for audit data.
Audit data needs to be stored for at least 5 years. With the current price of the Hot and Cool tiers, it was hard to do something lik…
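As a minimal sketch (the storage account, container and blob names are placeholders), moving an audit blob to the Archive tier with the Azure Storage .NET SDK could look like this:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ArchiveAuditBlob
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("audit-logs");
        CloudBlockBlob blob = container.GetBlockBlobReference("2017/07/audit.json");

        // Move the blob from Hot/Cool to the Archive tier. Keep in mind that the
        // blob has to be rehydrated back to Hot or Cool before it can be read again.
        blob.SetStandardBlobTier(StandardBlobTier.Archive);
    }
}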

.NET Core or .NET Framework together with Docker and Containers on Azure

In this post, I will try to find answers and guidance for technical people that need to decide whether they should use .NET Core or .NET Framework inside Docker containers on Azure. The same answer is applicable on-premises, because Docker runs in the same way on Azure and on-premises.

The directions from Microsoft related to this topic are clear: the default option should be .NET Core. .NET Core is designed in such a way that it aligns with container concepts. For example, the footprint was reduced drastically in comparison with .NET Framework.

One interesting fact that people do not know is related to the type of Windows image that you need to use with .NET Core or .NET Framework in containers. When you use .NET Framework you need to use a Windows Server Core image. This image is heavier than Windows Nano Server. This can have a direct impact on infrastructure and resource requirements. There are many things to say about why Windows Nano Server is better, a few th…

Microsoft Azure MVP 2017-2018

Another year has passed as a Microsoft Azure MVP and I realize that this is my 6th year as a Microsoft MVP. It is a pleasure to be part of the MVP community, which has extraordinary people that are ready to help and offer support to any community all around the world.

I’m honored and excited to be part of this great community for one more year!

IoT offer comparison: AWS vs Azure

Nowadays IoT is appealing to everyone. These opportunities made the two biggest cloud providers on the market (Amazon and Microsoft) come up with IoT platforms and solutions. The main purpose of this article is to compare the current solutions from a features and capabilities perspective.
The interesting thing that happened in the last few years is the way IoT solutions evolved. At the beginning the solutions were oriented around transport and communication, but now the IoT platforms have evolved and are integrated with systems that run on the edge and in the cloud, supporting the business needs.

X-Rays
Let’s take a look at the available solutions that Amazon and Microsoft offer. Both providers offer a central hub that is used to establish and facilitate communication between devices and backend systems.
At device level, each provider offers a set of libraries that allow clients to integrate their devices with the communication platform faster. At ba…
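On the Azure side, for example, sending telemetry from a device takes only a few lines with the device SDK (Microsoft.Azure.Devices.Client). This is only a sketch; the connection string and payload are placeholders.

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class DeviceToCloudSample
{
    static async Task Main()
    {
        // Device-specific connection string taken from the IoT Hub device registry.
        DeviceClient device = DeviceClient.CreateFromConnectionString(
            "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>",
            TransportType.Mqtt);

        string telemetry = "{ \"temperature\": 21.5, \"humidity\": 60 }";
        await device.SendEventAsync(new Message(Encoding.UTF8.GetBytes(telemetry)));

        Console.WriteLine("Telemetry sent to IoT Hub.");
    }
}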

Dynamic update of Azure Web Job time schedule

The topic of this post is simple:
How can I specify the schedule of an Azure Web Job using the application configuration or another location?
Does the web job restart automatically the moment I change the time interval?

I decided to write about this because, even if you can find the solution on the internet, you need to invest some time searching until you find the right documentation related to it (INameResolver).

Short version
A custom INameResolver is defined that, based on a key name, can read the configuration from any location. In the example below, the value is read from the configuration file. Don’t forget to register the name resolver on JobHostConfiguration.
using Microsoft.Azure.WebJobs;

namespace WebJob.Schedule
{
    class Program
    {
        static void Main(string[] args)
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.NameResolver = new MagicResolver();
            config.UseTimers();

            JobHost host = new JobHost(config);
            host.RunAndBlock();
        }

        private class …
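The resolver class itself is cut off above, so here is a purely illustrative sketch of what it could look like: a resolver that reads the value from appSettings, plus a timer-triggered function that uses it through the %schedule% token (the key name is an assumption; MagicResolver comes from the snippet above).

using System.Configuration;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Timers;

class MagicResolver : INameResolver
{
    public string Resolve(string name)
    {
        // For "schedule" this could return a CRON expression such as "0 */5 * * * *".
        return ConfigurationManager.AppSettings[name];
    }
}

class Functions
{
    // %schedule% is resolved through the INameResolver registered on JobHostConfiguration.
    public static void TimerJob([TimerTrigger("%schedule%")] TimerInfo timer)
    {
        // work executed on every tick
    }
}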

Azure Data Lake and Replication mechanism cross Azure Regions

Context
Let’s imagine that we are working at an automotive company that collects telemetry data from its cars. Part of this data needs to be processed, stored and managed.
To be able to store data now and use it later in time, you decided to go with Azure Data Lake, which does not limit how much data you can store and allows you to plug in any kind of processing system.
Requirements
After the architecture audit, because of legal constraints you are required to have a resiliency policy for disaster recovery. Even though Azure Data Lake keeps 3 copies of the data within the same Azure Region, the legal constraints still require such a policy.
Problem
Azure Data Lake makes 3 copies of the data in the same Azure Region, but there is no support to replicate or back up content to a different Azure Region. You will need to define your own mechanism for this.

Available Solutions
We can find many ways of doing this. There are 2-3 mechanisms to do replication of Azure …

[Post-Event] Event Sourcing and CQRS | ITCamp Community Summer Lunch Event in Cluj

Today we had the first ITCamp Community event during the lunch break. We decided to hold the event at this time of day because it was the only available slot for our special guest Andrea Saltarello.
The talk was about CQRS and Event Sourcing and, even if it was only one hour long, the session contained a lot of takeaways, not only from a technical perspective, but also from a cost and architecture point of view. A great comparison between different NoSQL and ESB systems was presented from an Event Sourcing point of view.

There were almost 30 people who decided to transform their lunch into a geek lunch together with the ITCamp Community. This event was made possible with the support of our local sponsors.


Below you can find pictures from the event. See you next time!




Azure Audit Logs and Retention Policies

Scope
In today's post we will talk about Azure Audit Logs and retention policies. Because retention policies might differ from one industry to another, different approaches are required.
Audit Logs
From my past experience, I know that each company and department might understand something different when you say Audit Logs. I was involved in projects where, when you tag a log as audit, you are required by law to keep it for 20-25 years. In this context, I think that the first step for us is to define what an Audit Log is in Azure. In Azure, most of the audit logs are either activity logs or deployment operations. The first one is closely related to any write operation that happened on your Azure Resource (POST, PUT, DELETE). Read operations are not considered activity logs – but don't be disappointed, there are many Azure Services that provide monitoring mechanisms for read operations as well (for example Azure Storage). The second type of audit is the one generated during a dep…

[Community Event] Event Sourcing and CQRS | ITCamp Community Summer Lunch Event in Cluj

At the end of this month (July 24) we will have a special guest in Cluj-Napoca: Andrea Saltarello. The format of the event will be different from the previous ones. The event will take place during the lunch break at The Office and is free.
If you want to find out more about the event you can check the following registration links. See you at the event.

Meetup: https://www.meetup.com/ITCamp-Community/events/241394189/
Eventbrite: https://www.eventbrite.com/e/event-sourcing-and-cqrs-itcamp-community-summer-lunch-event-in-cluj-tickets-35994003032
ITCamp Community blog: https://community.itcamp.ro/2017/07/itcamp-community-summer-lunch-event-cluj-event-sourcing-cqrs/

Official announcement:
Let's try a different kind of event this summer. I propose to all of you that we meet during the lunch break and have a talk about Event Sourcing and CQRS. There will be a special guest (Andrea Saltarello - Solution Architect at Managed Design) who will talk about his own experience on how we should manage …

Near real-time analytics for IoT Technician from the field - Azure Time Series Insights

Take a look around you and tell me if you see at least one smart device capable of sending data. There is a big chance that you have more than one around you. At this moment I have around me a laptop, a Surface, my Fitbit, a SmartTV and a Raspberry Pi fully equipped with weather sensors.
You might say: who cares about the data that is collected from them? Maybe nobody, or just ad companies. If you were on a production line things would be different; you would like to be able to visualize this data from different perspectives, analyze it and find out why production fluctuated on a specific day.

Timebound
Data that is collected from different sensors and devices can contain a lot of parameters like temperature, humidity, light and noise level. But in the end, when we want to visualize this data, the time information will be the one used on the chart to look at the data.
Try to imagine a chart where you put only temperature and humidity information, excluding the ti…