
Posts

Showing posts from 2017

[Post Event] Endava Tech Flow, November 2017, Cluj-Napoca

This week I had the opportunity to participate at Endava Tech Flow. The main topic of this event was AI and Machine Learning. More than 100 people participated in this free event. Besides the local speakers, we had the opportunity to meet J.M. Bishop (Director of the Tungsten Centre for Intelligent Data Analytics and Professor of Cognitive Computing at Goldsmiths, University of London), who talked about how dangerous artificial intelligence can be for us.

I also gave a short talk about Machine Learning, where I tried to explain what is behind an ML system and how a neural network works. More about my session can be found below.

Title: The science behind Machine Learning
Abstract: Did you ever ask yourself how Machine Learning works as a system? In this session we will take a look at the science behind Machine Learning systems. We will decrypt the basic mathematical concepts that make systems like these work. Once you attend this session you will discover how polynomial expression…
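
As a taste of the kind of math the session covered, here is a minimal sketch of a single artificial neuron (a weighted sum of inputs pushed through a sigmoid activation). All values and names here are illustrative, not material from the talk:

```csharp
// A single artificial neuron: weighted sum of inputs, then a sigmoid
// activation that squashes the result into the (0, 1) range.
using System;
using System.Linq;

class NeuronDemo
{
    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    static double Neuron(double[] inputs, double[] weights, double bias)
    {
        // Weighted sum: w1*x1 + w2*x2 + ... + bias
        double sum = inputs.Zip(weights, (x, w) => x * w).Sum() + bias;
        return Sigmoid(sum);
    }

    static void Main()
    {
        double[] inputs = { 0.5, 0.8 };
        double[] weights = { 0.4, -0.2 };
        Console.WriteLine(Neuron(inputs, weights, 0.1)); // ~0.53
    }
}
```

A whole network is just many of these neurons stacked in layers, with training adjusting the weights and biases.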

IoT Home Automation | Backend infrastructure

Even though in the previous post I said I would write about the integration points with the garage gates, I decided to go forward with the development of the web interface and the gateway software. I went down this path because once I have this part implemented I can play with the ESP8266 as much as I want and integrate it with all the devices that I have around the house.



Web Interface (UI)
I decided to go with a simple design. I'm pretty sure that it will become more complex in the future, but for now I will keep things as simple as possible. The web interface is done using Angular 5 together with a REST API exposed using ASP.NET Core. For now the web interface exposes two buttons that allow me to trigger actions (open/close the gates). The web interface is hosted as a web application inside an App Service. The plan is to secure it using Azure AD, but that is another story for another post; for now there is no need to do this because there is no real device in the backend (yet).
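
As a hypothetical sketch of what sits behind those two buttons, the REST API can be as small as a single controller. The names here (GatesController, IGatewayClient) are illustrative, not the actual code of the project:

```csharp
// Minimal ASP.NET Core controller exposing the two gate actions the UI
// buttons call. IGatewayClient stands in for whatever component forwards
// commands to the gateway that talks to the ESP8266 devices.
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public interface IGatewayClient
{
    Task SendCommandAsync(string device, string command);
}

[Route("api/[controller]")]
public class GatesController : Controller
{
    private readonly IGatewayClient _gateway;

    public GatesController(IGatewayClient gateway) => _gateway = gateway;

    [HttpPost("open")]
    public async Task<IActionResult> Open()
    {
        await _gateway.SendCommandAsync("garage-gate", "open");
        return Ok();
    }

    [HttpPost("close")]
    public async Task<IActionResult> Close()
    {
        await _gateway.SendCommandAsync("garage-gate", "close");
        return Ok();
    }
}
```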

Communication
The communicatio…

[Post Event] ISTAConf 2017, Sofia

I just finished my session at ISTAConf 2017 and I realized that we are not only talking about NoSQL and migration strategies, we are actually doing them. I just met some great guys from Sofia who have started to migrate their relational database to MongoDB. What was awesome is that they plan to migrate their system from on-premises to Azure in the next 3-6 months, so Azure Cosmos DB is the perfect fit for them.

This is the 7th edition of the ISTA Conference, which started in 2011. The last time I was here was 2 years ago, and compared with those times they have grown a lot. Not only during the keynotes, but also during the sessions, the rooms were full of people curious to find out more about the current trends and what the future is preparing for us. What I like at ISTA is the format of the conference. Even though there are more than 750 participants, they don't have more than 3 tracks, allowing them to keep the quality of the sessions at a high level.

At this conference I talked about Azure…

[Post Event] Meetup in Sofia together with Bulgarian BI & .Net User Group, November 2017, Sofia

Last week I had the great opportunity to be invited by the local user group from Sofia (Bulgarian BI & .Net (BI & .Net Geeks) UG) to talk about Azure and security at one of their meetups. It was such a great experience, meeting smart people who are full of energy and open to new technologies and ideas.

You can find below the slides and content related to my session.

Title: "Azure Enterprise Security in Practice"
Abstract: "How does an enterprise look when you talk about security and cloud? Complicated, rigid and challenging to accept cloud-based architectures. After working closely with security teams from different companies I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure."
Slides:
Azure Enterprise Security in Practice Radu Vunvulea Codecamp Cluj Napoca Nov 2017 from Radu Vunvulea

[Post Event] ITDays 2017, Cluj-Napoca

This week was the 5th edition of ITDays in Cluj-Napoca. This is a conference that started small and became one of the most important conferences in Cluj-Napoca that take place in the second part of the year.
During the 2 days of the event there were more than 40 sessions delivered by around 40 speakers. In addition, there were a lot of product and startup presentations that make us believe that there is something more than just code (smile). I had the opportunity to be invited as a speaker at this conference, where I talked about near-real-time solutions that can be developed using Microsoft and Azure tools. More about my session can be found below:
Title: Near-real time reporting in Azure
Abstract: One of the most common requirements on projects nowadays is real-time monitoring and reporting. Easy to say, expensive to implement and complex to maintain. In this session we'll take a look at the Azure services that enable us to fulfil these requirements with minimal effort and with maxi…

[Post-Event] Codecamp 2017, Cluj-Napoca

This was the last Codecamp in Cluj-Napoca for 2017. With more than 1000 registered attendees and 50 speakers, it was a great way to close the year. There were 10 tracks in parallel, which made it possible to have 70 sessions of 45 minutes in only one day.
I had the opportunity to deliver two sessions at this event, where I talked about how you need to mitigate security risks when you migrate from on-premises to Azure, and about Azure Cosmos DB. Below you can find my session details and slides.

Title: "Azure Enterprise Security in Practice"
Abstract: "How does an enterprise looks like when you talk about security and cloud? Complicated, rigid and challenging to accept cloud based architectures. After working closely with security teams from different companies I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure."
Slides:

Azure Enterprise Sec…

IoT Home Automation | (1) Initial Overview

After playing for a while with different IoT devices, I decided that it is the perfect moment to start doing some automation inside the house. Nothing special or fancy, just small things that improve the quality of life and force me to also work on some hardware devices.

What do I want to achieve?
There are three different systems that I want to connect and be able to remotely control and automate.
Garage doors and gates – I want to be able to open or close the gates from my mobile phone. At this moment, I have a remote control that controls them. However, in the future I do not want to use it anymore; I want to replace it with only one device – my phone.
Alarm control – Usually when you close or open the gates you also need to arm or disarm the house alarm. Why not integrate this into one system that would be able to open the gates and disarm the alarm automatically?
Automatic watering system – Even if we love plants and flowers, it is a nightmare in the morning to spend 20 minutes watering all the …

Lift and Shift - cloud migration strategy

During discussions on different cloud projects I observed that people use the “Lift and Shift” terminology with multiple meanings. This can create confusion between the different parties, especially when the technical team on each side understands a different thing.

What is Lift and Shift?
Lift and Shift is a migration strategy based on the concept of replicating 1-to-1, inside the cloud (Microsoft Azure), the environment that exists on-premises. This involves the migration of all computation, storage and other services without replacing them with specific Azure services.

What is not Lift and Shift?
Imagine you have a File Server system in your current infrastructure. Lift and Shift in this case shall not include replacing it with Azure Files. It involves just taking the File Server instances from on-premises and putting them inside Azure VMs.

Another good example is when you migrate a web farm. If you decide to do just a Lift and Shift, then you should just spin up Azure VMs where you would use A…

Isolate Web Application from public internet (App Service)

In this post, we will talk about web endpoint security. Let us start from a basic requirement and see what we need to fulfil it.
The web application hosted inside App Services shall not be publicly available from the internet.

Context
The requirement is simple and clear, but it can give us headaches if the team does not cover it from the beginning. Microsoft Azure offers three options to fulfil it:

IP Restrictions
App Service integration with VNET
VNET with a dedicated App Service Environment

IP Restrictions
App Services allows us to specify a list of IPs that can access our web application. The feature is similar to the IP restriction functionality offered by IIS, which can be configured inside web.config.
The difference between the two is the location where the check is done. Using IP Restrictions, the check is done a layer before IIS. Additionally, the configuration can be done from the Azure Portal or using ARM templates. There is no need to modify the configuration file of your appl…
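
For contrast, this is roughly what the same rule looks like when enforced inside the application itself, written here as hypothetical ASP.NET Core middleware (the extension name and the sample IP are illustrative). App Service IP Restrictions give you the same effect one layer earlier, with no code change:

```csharp
// Application-level IP filtering in ASP.NET Core middleware.
// App Service IP Restrictions perform the same check before the request
// ever reaches the application.
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

public static class IpFilterExtensions
{
    public static IApplicationBuilder UseIpAllowList(
        this IApplicationBuilder app, params string[] allowedIps)
    {
        return app.Use(async (context, next) =>
        {
            IPAddress remote = context.Connection.RemoteIpAddress;
            bool allowed = false;
            foreach (string ip in allowedIps)
            {
                if (IPAddress.Parse(ip).Equals(remote)) { allowed = true; break; }
            }

            if (!allowed)
            {
                // Reject everything that is not on the allow list.
                context.Response.StatusCode = StatusCodes.Status403Forbidden;
                return;
            }

            await next();
        });
    }
}

// Usage in Startup.Configure (the address is illustrative):
// app.UseIpAllowList("40.112.0.1");
```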

Just deleted an Azure SQL Database by mistake! What's next?

There are times when you make mistakes, big mistakes like…
…deleting an Azure SQL Database…
In this post, we will take a look at the steps that need to be done to recover and restore a deleted database. We are in the context where you have a Standard database, without any special features related to backups.

Trigger
I just realized that I deleted the wrong database and I do not have any custom backup mechanism configured. What should I do?

Time is crucial 
Time is one of the most important factors. The backup of a deleted database is stored only for a limited time. The time window depends on the instance type. Within this time window, you can restore a deleted database without any kind of problem. The retention policy for backups is 7 days for Basic and 35 days for Standard and Premium.
Azure SQL Server automatically creates backups of your databases. These backups are used to restore a deleted database. Don’t forget that, as for on-premises backups, things can go wrong during …

How to get your Azure Subscription Quotas and current Usage

In this post, we will talk about a simple problem inside Microsoft Azure:
How can I see what the quotas are for the Azure Subscription that I am using?

Context
When you start to use your Azure Subscription for more than just playing, you realize after a few weeks that you do not know what your current quota limits are. Additionally, it is not easy to count the number of instances of each resource.
For example, if you use multiple Azure Storage accounts in different Resource Groups, how easily can you count the number of Storage accounts that you are using?

To make the problem a little more complex, you should know that a part of these quotas is per Azure Region. For example, you have a default limit of 50 VNETs per Azure Region. It means that it would be pretty hard to calculate the total number of VNETs that you are using in each Azure Region. It is not impossible, but you would need to do some additional things, as shown in the sketch below.
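
One way to count resources across resource groups and regions programmatically is the Azure Management Fluent SDK for .NET. This is a sketch under that assumption; the credentials file name is illustrative and has to be created beforehand:

```csharp
// Count storage accounts subscription-wide and group VNETs per region,
// assuming the Microsoft.Azure.Management.Fluent package.
using System;
using System.Linq;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;

class QuotaUsageDemo
{
    static void Main()
    {
        var credentials = SdkContext.AzureCredentialsFactory
            .FromFile("azureauth.properties");
        IAzure azure = Azure.Authenticate(credentials).WithDefaultSubscription();

        // Storage accounts across the whole subscription, regardless of
        // which Resource Group they live in.
        Console.WriteLine($"Storage accounts: {azure.StorageAccounts.List().Count()}");

        // VNETs grouped per region, because the 50-VNET quota is per region.
        foreach (var group in azure.Networks.List().GroupBy(n => n.RegionName))
        {
            Console.WriteLine($"{group.Key}: {group.Count()} VNETs");
        }
    }
}
```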

Usage + Quotas
Inside the Azure Portal, we have a dedicated tile that provides this informat…

[Post Event] Codecamp Timisoara, October 14, 2017

On October 14 I attended Codecamp Timisoara. It was a great experience, where I met great people from Timisoara and other cities around Romania.
At this conference I delivered a session about enterprise security and how you mitigate different security aspects when you want to migrate to Microsoft Azure. Below you can find more information related to it.


Title: Enterprise security in Practice
Abstract: How does an enterprise look when you talk about security and cloud? Complicated, rigid and challenging to accept cloud-based architectures. After working closely with security teams from different companies I identified different approaches and requirements that are standard for enterprises. In this session I want to discuss and share with you lessons learned on how we can map security requirements to Azure.
Slides:

Enterprise security in Practice from Radu Vunvulea

Migrating File Server (Share) to Azure

Let’s take a look at the current solutions on Azure when we want to migrate our local file storage from on-premises to the cloud.

Context
In the current system, let us imagine that we have a Windows File Server that is used to share files inside the company. The File Server is fully integrated with our Active Directory server, and based on roles we allow/restrict access to different folders/files.


What do we want to do?
We want to migrate this solution to Azure in such a way that we don’t need to manage the File Server machines, while still being able to control file sharing permissions using user roles (Active Directory integration).
In addition to this, we want to be able to attach the shared content as a shared folder or partition on the client machine.

Azure Files
An extremely powerful solution provided by Microsoft, which allows us to store our files in Azure and share them with others. The SMB protocol is fully supported, meaning that we can attach the share on our client machin…
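
Besides mounting the share over SMB, the same share can be reached from code. A minimal sketch, assuming the WindowsAzure.Storage package; the share name, file name and connection string placeholder are illustrative:

```csharp
// Create a file share, upload a file and read it back from Azure Files.
using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class AzureFilesDemo
{
    static async Task Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "<storage-account-connection-string>");
        CloudFileClient client = account.CreateCloudFileClient();

        // The same share that clients can mount over SMB.
        CloudFileShare share = client.GetShareReference("company-files");
        await share.CreateIfNotExistsAsync();

        CloudFileDirectory root = share.GetRootDirectoryReference();
        CloudFile file = root.GetFileReference("welcome.txt");
        await file.UploadTextAsync("Hello from Azure Files!");

        Console.WriteLine(await file.DownloadTextAsync());
    }
}
```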

The real difference between an Azure VM with or without SSD

I want to talk about the real difference between an Azure VM with or without SSD. This is not a post with charts and artificial benchmarks; it is just a real story from the field.
Context
One of my colleagues from work came to me complaining about a performance issue related to SQL Server. On an Azure VM with Linux they had an SQL Server instance. The DB structure was not too complex and the DB size was acceptable.
Problem
Every few hours a job has to be executed on the database. There is a lot of data processing inside it and it usually takes around 1 hour. From a duration perspective this is not acceptable; there is a clear NFR that requires the task to be executed in under 30 minutes.
An audit was done on the VM and the database, and it was pretty clear that there was a problem with read and write operations. Many actions were happening at that level, causing memory and storage load to be at high levels.
Actions
The DB specialists reviewed the database structure and the job. Unfortunately, the…

Using SQL security features to isolate sensitive data inside a PoC

When writing a PoC you need to keep it as simple as possible and prove that, from a technology perspective, the project vision is feasible and is the right one. One of the fundamental rules of a PoC is that it needs to cover things that are not a general truth (e.g. you don’t want to prove that ASP.NET MVC can render HTML or expose a REST API).
Keeping a PoC as simple as possible can become a problem when you want to use customer data, not only mock data. When you have customer-sensitive information, which should not be visible even to the development team, you might end up in a strange situation.
The problem is not related to how you can do this. The biggest problem is the effort that you need to invest to create the deployment scripts or the automation mechanism that would allow the customer to deploy the solution in an isolated environment, where the development team would not have access. This might require extra development effort that you don’t want to include in a PoC.
It is cl…

Microsoft Ignite at a glance (Orlando, FL, 2017)

What a week! The last week of September was crazy for all the people who are working with the Microsoft stack. The biggest Microsoft conference took place in Orlando, Florida. More than 25,000 people attended Microsoft Ignite this year and, as usual, it was an event with all the tickets sold out.


There were many announcements that make Microsoft a strong player for today's needs, and there is also a clear vision of where they want to go. Not only that, but it seems that the road is already defined and clear.
The current needs of the market are covered by Microsoft with Azure Stack, offering good ground for hybrid solutions. Now we can use the core services of Microsoft Azure not only in Azure, but also on our on-premises infrastructure using Azure Stack. What is more interesting from a DevOps and IT perspective is that you have the same experience, the same dashboard, and you use the same scripts (no change is required).
Mixed reality and AI are now much closer to the field. Many companies…

Why you should never put a junior in a team that is stressed and bad-mannered

It is nothing abnormal for a team to be stressed. During those times they might not have time for anything else. The focus of the team is to deliver the business requirements in the IT solution. Unfortunately, these time periods can be longer than expected, even if we know that this is not good.
I saw teams that were in a phase like this for more than 18 months. After a while in this state, they don't even realize that they are in it, and what the impact is at project and people level.
In this post, I will try to focus on the impact that such a phase can have on juniors and mid-levels in unhealthy teams.

Why?
When you are a junior you are at the moment of your career when you want to learn. You know that you have things to learn; you are usually a fresh graduate with good theoretical knowledge and you want to put it into practice.
I like to compare smart juniors with birds that have big and powerful wings, but don't know how to fly very well yet. They can reach the sky and accomplish many thi…

Azure Blob Storage - More storage and throughput

One of the core services of Microsoft Azure is Azure Storage, which is used to store binary content, key-value pairs (Azure Tables) or message queues (Azure Queues). In today's post, we will discover how a small change in Azure Storage capabilities changes our life and simplifies our IT solutions.

Current solutions
The maximum capacity of Azure Blob Storage used to be 500TB. Even if this might sound like a lot, there are multiple cases when you have to overcome this limit. If you have a system where devices and users are uploading content to your system, then you can easily reach 2-5TB per day, which would force you to use a new Azure Storage account every 3 months.
To overcome this limitation, your solution needs to be able to manage Azure Storage accounts automatically. Besides being able to clean and archive content automatically, you need a system that can create a storage account on the fly and redirect traffic to it. When you use multiple Storage Accounts, you are force…
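
To make the account-management overhead concrete, here is a sketch of spreading uploads across several pre-provisioned storage accounts, assuming the WindowsAzure.Storage package. The round-robin routing rule and all names are illustrative; a real system would also track per-account capacity:

```csharp
// Route blob uploads across multiple storage accounts in round-robin order.
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobUploadRouter
{
    private readonly CloudBlobClient[] _clients;
    private int _next;

    public BlobUploadRouter(params string[] connectionStrings)
    {
        _clients = new CloudBlobClient[connectionStrings.Length];
        for (int i = 0; i < connectionStrings.Length; i++)
        {
            _clients[i] = CloudStorageAccount
                .Parse(connectionStrings[i])
                .CreateCloudBlobClient();
        }
    }

    public async Task UploadAsync(string containerName, string blobName, Stream content)
    {
        // Pick the next account; the unsigned cast keeps the index
        // non-negative even if the counter overflows.
        uint ticket = (uint)Interlocked.Increment(ref _next);
        int index = (int)(ticket % (uint)_clients.Length);

        CloudBlobContainer container = _clients[index].GetContainerReference(containerName);
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        await blob.UploadFromStreamAsync(content);
    }
}
```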

Less than 1 week until Microsoft Ignite 2017

The first time I took part in Microsoft TechEd was in 2012 in Amsterdam. It was one of my first conferences with more than 5k attendees. It was a wow from every perspective. Since then I have participated in every TechEd and Microsoft Ignite.
At Microsoft Ignite, attendees have the opportunity not only to learn and discover new stuff, but also to meet people from all around the globe. It is that week of the year when you can meet face to face with Program Managers from Microsoft, together with people that you talk to over Twitter from Japan, Australia, the UK and the USA, all in one place.

This year things will be a little different. It will be the first time I participate at Microsoft Ignite not just as an attendee, but also as a speaker. It is a joy to be invited to speak at a conference with more than 23,000 attendees. If this is not enough, I will have 3 sessions where I will share my knowledge and experience related to IoT, security and NoSQL. If you want to find out more about these subjects feel free to join my sessions …

Is security and data privacy important on tracker devices like Fitbit?

A few days ago, I read about how insecure Fitbit devices are. There was a lot of noise created around it, explaining different ways you can hack a Fitbit device to gain access to personal data. My first reaction when I saw the title of the article was “So what!?”, and let me explain why I don’t see this as life-threatening or as something that will stop me from using my Fitbit.

Personal data
It is true that a tracker contains personal data, but let us be realistic and look at what data it has. Most trackers contain information related to your past activity, heartbeat, number of steps and, in some cases, GPS information.

Except for the GPS information, the rest of the data is not so sensitive. What do you think a hacker can do if he knows that you did 10k steps this morning? Yes, he might learn your habits and break into your house while you are jogging or walking the dog. This scenario can be real, but the truth is that there are so many other ways to find out what your habits are that you would be impress…

The scope of a PoC

Let us talk about what the scope of a PoC should be and what you should or should not have in a PoC.

Purpose of PoC
First, we need to define what the purpose of a PoC is. The main purpose is to demonstrate the principles that are covered in the technical documents (that it is not just theory and diagrams).

Reusability
It is already a déjà vu for me to hear people say that they want to reuse the PoC output in the main project. This happens because many times the PoC scope is too big and does not cover only the ideas that need to be demonstrated.
When you have a PoC that covers more than 15% of the implementation effort, you might have a problem. That is not a PoC anymore; it is a PILOT, which represents a system with limited functionality that goes into production. The pilot might have many restrictions, from NFRs to the business use cases that are covered, but it has some part that works.
You will never want to invest in a PoC more than is necessary and you shall always push the ou…

Containerization without a microservices approach

The current trends are clear: we should develop software applications using only a microservices approach. This sounds good for new applications, where system requirements guide us to go with a microservices approach.
But what happens with the other types of systems? We might need to develop a normal web application, with some backend processing behind it. No crazy NFRs, no need to scale to 100,000 RPS or similar stuff.

Monolithic application
As an example, let us imagine that we need to develop a web application that resizes our pictures to Instagram size (1:1). There are no special requirements related to availability or scalability, and the load on the system is low. The system is used just by our employees (fewer than 5,000) for company images that need to be published on commercial websites.
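
The entire "business logic" of such an application fits in a few lines. A minimal sketch using System.Drawing; the 1080x1080 output size and the command-line arguments are illustrative:

```csharp
// Crop an image to its largest centered square and scale it to 1:1.
using System;
using System.Drawing;

class SquareResizer
{
    static void Main(string[] args)
    {
        using (Image source = Image.FromFile(args[0]))
        using (Bitmap target = new Bitmap(1080, 1080))
        using (Graphics graphics = Graphics.FromImage(target))
        {
            // Take the largest centered square out of the source image...
            int side = Math.Min(source.Width, source.Height);
            var crop = new Rectangle(
                (source.Width - side) / 2, (source.Height - side) / 2, side, side);

            // ...and scale it onto the square canvas.
            graphics.DrawImage(source, new Rectangle(0, 0, 1080, 1080),
                crop, GraphicsUnit.Pixel);
            target.Save(args[1]);
        }
    }
}
```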
Of course, we can imagine a state-of-the-art microservices implementation, with different services that scale by themselves. What if we do not need something like this, but it is very appealing to us…