

Windows and SQL Server 2008 End of Life: Extension Options

As is already known, Microsoft's product lifecycle for 2019 includes the end of support for Windows Server 2008 and SQL Server 2008, both RTM and R2.




Why is it important?

Controlling the support lifecycle of the products that are deployed is a typical challenge in large enterprises.

This is no small matter, since the end of support implies:

  • Security updates, even for newly discovered vulnerabilities, are no longer released for expired products
  • New features in newer products are not guaranteed to work with expired products

The first item is critical. Imagine that your company becomes vulnerable to an attack like the many we have seen, just because a single server in your environment is out of support.

What do I do if I have products that expire?

Obviously the best option is to migrate ("TO-BE"), but we know that is not always possible. Tools such as the Log Analytics Service Map can help with the planning.

But for those who cannot upgrade, one option is to buy extended support via Premier for another three years; it is not cheap, but it can be negotiated through your Microsoft account team.

The cost to extend support is, PER YEAR, equivalent to 75% of the full license price of the most current version.
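As a back-of-the-envelope illustration of that rule (the 75%-per-year rate comes from the paragraph above; the license price is a made-up figure):

```python
# Sketch: extended support costs 75% of the current full license
# price PER YEAR. Illustrative numbers only.

def extended_support_cost(current_license_price: float, years: int = 3,
                          rate: float = 0.75) -> float:
    """Total cost of extending support for `years` years."""
    return current_license_price * rate * years

# Hypothetical example: a license that costs $10,000 today
total = extended_support_cost(10_000)
print(total)  # 22500.0 -> often more than buying the new version outright
```

In other words, three years of extension can easily exceed the price of simply buying the current version, which is exactly why it is a last resort.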

However, Microsoft has offered a very interesting alternative: migrating to Azure "AS-IS".

That's right: anyone migrating Windows Server 2008 and SQL Server 2008 workloads to Azure will not have to worry, as they will receive free extended support for an additional three years.

There is no need to point out that this is a strategy to increase Azure adoption, but it is financially very attractive for whatever workloads you have.



Assisted Office 365 and Azure Adoption with FastTrack

When customers move from on-premises products to online products, there is always the initial impact of migration.

If the customer purchased in the CSP (Cloud Solution Provider) model, the initial configuration is performed entirely by the partner, and data migration is generally also included as a service. After all, it is important to remember that in the CSP model the partner owns the account, because it is a managed model.

In the licensing-partner model, whether MPSA or Enterprise Agreement (EA), the customer owns the account and the tenant. This means it is up to the customer to create the tenant, enable the services, configure everything and migrate the data.

How do you kick off Office 365 without pain and with the best structure?

The obvious answer would be to hire a Microsoft services partner specializing in Office 365 to handle the whole process, but that is often not what happens.

In these cases, FastTrack can be engaged.

What is Microsoft FastTrack?

In basic terms, FastTrack is a website containing an entire repertoire of tools for those who already have, or have acquired, Office 365 under a direct contract (MPSA or EA).

When entering the site, you can start by seeing a dashboard of your current state, as below:


Note that in the first section we see the name of my test tenant, the data including some company information, and the FastTrack Manager, Engineer and Architect. Who are these people?

Some customers, especially during adoption, have the benefit of engaging a Microsoft team to assist with migration planning and execution.

This does not mean that they will execute the work for you; rather, they will guide and support you through the process of creating the tenant, AD integration (AADSync), service configuration and the migration itself.

To find out if you are eligible, see "Offers" and "Services":



The first item "Bids" are not migrations but documentation generated for compliance and filing.

The "Services" item is where you can request that Microsoft engage the team to perform the desired functions.

Note that this covers not only Office 365 but also Windows deployment planning (in this case you need a Planning Services voucher) and a partner to help with Windows 10, if you have not yet migrated.

There is also an Azure option, but it is only available in some countries and the customer needs to consume at least US$ 5,000 a month.

In either case, Microsoft sends you an email with more information and initiates the process according to the type of request.

And if I already have a tenant and am using it, what value does FastTrack offer me?

It is still interesting. Go to the link:

This site has a list of resources where you can download presentations, guides, e-mail templates and educational videos.

The only restriction is that all content is in English.

In any case, tools like the "Network Planner", used to validate bandwidth requirements, are very important at the start.

We can also highlight the videos and documents where we can learn more about the features and the step-by-step of a successful rollout!

Scenario Design (Success Plans)

A very interesting option is the creation of Success Plans, which can be seen on the first screen of this post.

When you create a plan and choose the product, you are guided through a complete checklist where you choose what you will do, and the site helps you walk the right path.

Very helpful when you are doing the implementation and do not want to let anything slip!



Another interesting feature is that you can access videos to help end users adopt the desired product.



Whether you are just deploying, already running a few products, or evolving the environment, FastTrack will be a huge help on the road to success!

Azure Log Analytics: Service Map

Many are familiar with Log Analytics, which was once called the Operations Management Suite (OMS).

In this post I will highlight one of the many Log Analytics plug-ins (called Solutions in the portal): Service Map.


Migrating a datacenter is not just about moving servers from one place to another; it is often necessary to migrate environments by application profile.

The purpose in these cases is to know which servers should be migrated together, so as to avoid communication problems both within the same application and between the service and its clients.

The problem is usually mapping this, because few companies keep an application map listing the servers and services each application uses, especially for web applications and databases.
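The grouping problem described above can be sketched as finding connected components in the observed connection graph; the server names and connections below are hypothetical stand-ins for what an agent would collect:

```python
from collections import defaultdict

# Observed connections between servers (hypothetical data, in the
# shape an agent-based mapper would collect).
connections = [
    ("web01", "sql01"),
    ("web02", "sql01"),
    ("app01", "sql02"),
]

def migration_groups(edges):
    """Group servers that communicate, so each group migrates together."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            cur = stack.pop()
            if cur in group:
                continue
            group.add(cur)
            stack.extend(graph[cur] - group)
        seen |= group
        groups.append(group)
    return groups

print(migration_groups(connections))
# Two groups: {web01, web02, sql01} and {app01, sql02}
```

Each resulting group is a set of servers that must be moved in the same migration wave.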


The Log Analytics Service Map solution solves this problem!

It maps all communications performed by servers with the agent installed and builds a complete usage map, detailing ports, names and services, and allowing drill-down to view the connections, with a detail pane for each selected item.

Here are some screenshots I use to demonstrate the feature:


View of the services on one of the servers and details of the selected server. Note that on the left side you can see the server details bar, mapped from the other active Solutions in your Log Analytics workspace.


Details of one of the servers that communicates with the host, with details of the communication and the server.


By opening the server selected on the previous screen, I can see its details, now including the desktops and other servers that also use the selected target.


Viewing the communication details between the target server and the server with SQL Server where we can see the SQL communications for authentication, since the target is my Domain Controller.


Here we can see the concept of groups, where the servers a group includes are mapped; groups can be used to create the maps of a given application.

Based on the graph above, I can see that the T110 host has two main VMs that communicate with all clients and with each other constantly.

If you were creating a migration plan for my environment, you would already know that these are the two major VMs that need to be brought up together in the migration.


To use Service Map you obviously must have a Log Analytics workspace already enabled, and then add the Solution.

Data collection is not performed by the normal Log Analytics agent; you need to download a specific agent, which can be found at the link below:

Shortly after installing the Service Map agent you will already be able to view the maps and use groups.

Important: Service Map only retains data for a maximum of 1 hour, so it is a portal for immediate viewing; it has no history or analytical reports.

Full reference:

Controlling Costs in Azure with Cloudyn

Much has been said about Microsoft's purchase of Cloudyn and how it would be integrated into Azure cost management.

The truth is that before Cloudyn, Azure had few good tools for managing costs, covering needs such as:

  • Detail of costs and pre-defined periods (day, week, month, year, etc.)
  • Comparison between costs and planned budget
  • Higher costs
  • "Orphan" or expired objects
  • Others…

It was possible to use Power BI, but that required very thorough knowledge of the data layer Azure exported, leaving most customers without good options.

With that in mind, when Microsoft bought Cloudyn it made the tool available for free (some additional features are paid); it fulfills these tasks and adds several practical reports.

Installing and Configuring Cloudyn

The installation is nothing more than an application from the Azure Marketplace, named Cost Management; if you search for Cloudyn it will also appear:



Enter the notification data and the business model you use, usually one of the first two (EA or CSP). The "Individual" option is for those who use Open licensing, credit card, or MSDN subscriptions, as in my case:


The next screen asks for data to locate the subscriptions; in my case, the MSDN offer and my Azure tenant, which can be found in the portal under Subscriptions:


From there, Cloudyn finds all the subscriptions associated with your user and links them:



Using Cloudyn’s Budget Reports

Important: Data may take 3 to 4 days to be populated.

Reports are the high point of the tool; the analytical cost reports based on budget are excellent.



For these reports to work, it is important to create the budget in the "Projection and Budget" option:


From there it is already possible to extract the Projected vs. Actual reports, which address the biggest pain of Azure customers today.
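The calculation behind such a report is simple; the sketch below uses made-up monthly figures to show the projected-versus-actual comparison:

```python
# Hypothetical budget vs. actual spend per month (US$).
budget = {"Jan": 5000, "Feb": 5000, "Mar": 6000}
actual = {"Jan": 4800, "Feb": 5600, "Mar": 5900}

def deviation_pct(planned: float, spent: float) -> float:
    """Percentage deviation of actual spend relative to budget."""
    return (spent - planned) / planned * 100

for month in budget:
    dev = deviation_pct(budget[month], actual[month])
    flag = "OVER BUDGET" if dev > 0 else "ok"
    print(f"{month}: planned {budget[month]}, spent {actual[month]}, "
          f"{dev:+.1f}% {flag}")
```

February, for example, comes out 12% over budget, which is exactly the kind of line item these reports surface.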

Detailing Consumption and Optimizations

Cloudyn's initial dashboard is instructive and informative in itself:


In Asset Controller it is possible to see a summary of our resources and how they have evolved:


One of the most important features is the Optimizer, where we can see orphaned resources and over-allocations: the cost-saving hints Cloudyn provides.

Note that in my case there are 2 disks not linked to any VM, i.e. I was paying for storage without using it:



Browsing the menus and running the reports, a very interesting one is the Cost Navigator, where we can select various periods and detail the costs within them:


And above all, as mentioned in the previous topic, compare my budget with actuals:


Some other reports that I did not cover here are also interesting:


It is worth installing and using this tool; its cost in your environment is minimal compared to the quality of the data presented.

It is important to remember that in many cases you should use TAGs to separate resources into groups, if necessary.

However, even without TAGs it is possible to use filters in the reports to get more specific data.
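The value of tagging is easy to see in code; this sketch groups hypothetical resource costs by a "department" TAG (names and costs are made up):

```python
from collections import defaultdict

# Hypothetical resources with TAGs and monthly costs (US$).
resources = [
    {"name": "vm-web01", "tags": {"department": "sales"}, "cost": 130.0},
    {"name": "vm-sql01", "tags": {"department": "sales"}, "cost": 290.0},
    {"name": "vm-build", "tags": {"department": "dev"},   "cost": 75.0},
    {"name": "disk-old", "tags": {},                      "cost": 12.0},
]

def cost_by_tag(items, tag):
    """Sum costs grouped by the value of one TAG key."""
    totals = defaultdict(float)
    for r in items:
        totals[r["tags"].get(tag, "(untagged)")] += r["cost"]
    return dict(totals)

print(cost_by_tag(resources, "department"))
# {'sales': 420.0, 'dev': 75.0, '(untagged)': 12.0}
```

Untagged resources fall into a catch-all bucket, which is precisely why consistent tagging makes the reports so much more useful.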

Microsoft Advanced Threat Analytics (ATA)

Many customers I visit have no idea what ATA is, even though they have EMS (Enterprise Mobility + Security) licensing.

Understanding the ATA

To better understand what ATA is, we need to remember what behavioral security products are.

This type of product is not based on signatures of malicious code downloaded as a DAT file (virus signatures).

Behavioral security services analyze trends, common usage patterns and suspicious activities; for example, a user who has never logged in to a server suddenly becomes an administrator and accesses various machines.
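The behavioral idea can be reduced to a toy rule: flag a logon as suspicious when a user appears with admin rights on a machine absent from their history. The baseline data below is hypothetical, and real products like ATA use far richer models than this:

```python
# Minimal sketch of behavioral detection: machines each user
# normally logs on to (hypothetical baseline data).
baseline = {
    "alice": {"WS-ALICE", "FILESRV"},
    "bob":   {"WS-BOB"},
}

def suspicious(user: str, machine: str, is_admin: bool) -> bool:
    """Flag an admin logon on a machine the user has never used."""
    new_machine = machine not in baseline.get(user, set())
    return new_machine and is_admin

print(suspicious("bob", "DC01", is_admin=True))        # True: never seen there
print(suspicious("alice", "FILESRV", is_admin=False))  # False: normal use
```

The point is that no signature is involved: the same logon event is benign or suspicious depending only on the learned baseline.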

Installing the ATA

The installation is very simple, since communication is performed directly with an Azure URL that receives the security log data and processes it with machine learning.

To install, just run the installer, which is very simple and intuitive. After installing the server, we can install the Gateway on the Domain Controller to be analyzed, collecting its security logs.

Once installed, administration is very simple, and you can refine the settings by informing, for example, the SID of a user to serve as an intrusion indicator, an IP range of vulnerable machines (in a DMZ, for example), and other resources.

Once installed, maintenance is automatic for both the server and the monitored gateways.

Checking AD Security Issues

After a few days it is already possible to see some alerts in the panel; for example, below is a warning that some computers are using a vulnerable encryption level:





This other example is a case of remote execution of commands and scripts by a remote server. Of course, in this case I will dismiss the warning, since it is expected behavior: I have Project Honolulu on the same machine, and it runs WMI commands:



Note that in both cases I can know what happened, which user was involved, and on which server/desktop the suspicious activity occurred.

In addition, the detection history helps us understand whether this is a real incident or just a one-off activity.

Receiving Alerts and Reports

ATA allows you to configure the receipt of alerts and reports with the data.

I can run standalone reports:


Or schedule to receive by email every day, as well as alerts:


How to get the ATA

That is the question many ask, but it is important to remember that, as an online product, it can be purchased by anyone who has Microsoft 365 with Security (the new EMS), the old EMS, or else purchased individually.

Remember that, as it is a product linked to Office 365, the acquisition is per user, even when standalone.

Let’s Talk about the Microsoft Honolulu Project?

The Honolulu project was heavily discussed some time ago, linked to a new Windows graphical management interface.

On December 1st a new Preview and documentation of Honolulu were released; it is already quite mature, with the final architecture defined.

What is the Honolulu Project?

It is a new MANAGEMENT interface for Windows Server.

It is not a replacement for the Windows Server 2012/2016 Server Manager, but rather an interface based on new protocols, built for ease of access and use, in addition to broader management reach.

What are the advantages of Honolulu over Server Manager?

Server Manager is a very good tool, but it relies on local protocols (RPC, WinRM and others) and on a GUI that needs to be installed.

Honolulu is 100% web-based for data access and uses WinRM, WMI and PowerShell for server administration.

With Honolulu it is possible to do things that Server Manager does not do, such as running scripts, Windows Update, administering and monitoring VMs, etc.

On the other hand, Honolulu does not manage as many services as Server Manager, such as File Server, DHCP, DNS, etc. that continue to be managed by the MMC tools.

How to install Honolulu?

The installation is very simple, but you first have to define the architecture.

Basically, you can install it on a single server and add the others as managed nodes, or install one server as a Gateway to access the others, which eases traffic when you have many servers in a farm:


In general, for this type of tool, the ideal is to create a dedicated server with modest memory and processing power (the second model in the figure) so as not to burden servers that hold other roles, since Honolulu creates its own service:


To download Honolulu, because it is still a Preview, you need to use the Windows Server product evaluation page at

How to manage a server with Honolulu?

Let's go through the basic screens. First we add a server to the list; from there, any browser can be used to see usage graphs, configure items, make remote connections, execute PowerShell commands, etc.

First, let's add new servers, clusters, or even Windows 10 clients:


Next, simply enter the user credentials and choose the server/cluster you want to view:


The level of detail ranges from hardware items to detailed graphs for each of the monitored server/client components:


Even items such as physical disks, volumes and Storage Spaces can now be administered in Honolulu:


An interesting feature is that you can manage Windows Update remotely:


Managing VMs in Hyper-V is also one of the highlights, given the level of detail and the intuitive interface:



Finally, follow the Honolulu technical documentation link:

Azure Stack 1: Understanding the Solution

Now available in most countries where Microsoft has datacenters, Azure Stack has become a constant topic.

But first you need to understand the focus and composition of the solution.

How is it composed?

Azure Stack is a rack of servers with pre-determined sizes and configurations, available today from Dell, HP, Lenovo and Cisco.


The hardware is certified and standardized with each manufacturer, which allows both software and hardware updates to come directly through Azure Stack.

Does that mean I cannot use my own configuration? Exactly: to guarantee that the system stays updated and the hyper-converged stack works, the drivers have to be certified and tested.

It is important to understand that all of Azure Stack is based on hyper-convergence, i.e. it uses SDN (Software-Defined Networking) and SDS (Software-Defined Storage) technologies, or SDx in general, as they are called.

That is, there is no dedicated storage. Each server holds a set of 15k SAS disks and SSD disks, with Storage Spaces Direct (S2D) enabled. This lets the servers pool their storage into volumes shared among them.

Data resiliency with S2D is guaranteed by distributing the data across servers, much as VMware vSAN or Nutanix do.

For whom?

Unlike what many people think, Azure Stack does not target the customer who finds Microsoft Azure expensive, but rather the one who has limitations regarding public clouds.

For example, some of the cases presented at Ignite were Swisscom and KPMG Sweden.

In KPMG's case, the scenario involved legislation and the requirement of some customers who did not want their audit data in a public cloud, no matter how well its security could be justified. The solution was Azure Stack, where KPMG would have the same services used by its other branches around the world, but on-premises.

Swisscom's case, on the other hand, was about acting as a local datacenter, since Azure does not have one in the country. Customers who want public cloud services can thus use Swisscom's Azure Stack private cloud to host their services locally.

That is, the main customers are, among others:

  • Countries where there are legal restrictions on storing data abroad
  • Datacenter providers interested in offering their users the same interface as Azure, but locally; for example, Brazil has only one Microsoft Azure DC, and a traditional provider could use Azure Stack as an additional availability point
  • Companies with heavy use of IaaS-based compute resources that have their own datacenter
  • Companies with an on-premises tradition that do not want their data outside their environment but wish to use the public cloud model "in place", with easy maintenance and high-level support

And for the customer who finds Azure expensive, is the Stack worthwhile? When you sharpen the pencil, no, because we need to remember that it is a rack and needs cooling, power, a raised floor, and all the other costs involved in a physical DC.

How much does Azure Stack cost?

First you must look at the cost of the hardware, which can be sold differently by each of the 4 current manufacturers.

For example, in Dell's case, configurations start at 4 servers with 20 cores and 4.1 TB each, and can reach 12 servers per rack, with a maximum capacity of 4 racks of 12 servers each.

In addition, there are Low, Mid and High Profile servers; a rack with 12 High Profile servers has a capacity of 336 cores, 6.1 TB of RAM, 138 TB of cache and 1.2 PB of disk!
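Those figures can be sanity-checked with a little per-server arithmetic (numbers taken from the paragraph above):

```python
# High Profile rack figures quoted above.
servers_per_rack = 12
rack_cores = 336
rack_ram_tb = 6.1
rack_disk_pb = 1.2

cores_per_server = rack_cores / servers_per_rack               # 28.0 cores
ram_gb_per_server = rack_ram_tb * 1024 / servers_per_rack      # about 520 GB
disk_tb_per_server = rack_disk_pb * 1024 / servers_per_rack    # about 102 TB

print(cores_per_server, round(ram_gb_per_server), round(disk_tb_per_server))
```

That works out to roughly 28 cores, 520 GB of RAM and 102 TB of disk per node, which is consistent with a dense hyper-converged server.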

Now let's talk about the software cost. It is important to remember that Azure Stack has no upfront software cost; rather, it is billed as a service, which includes:

  • Updates to the software stack
  • Updates to drivers and logical components
  • Pre-configuration of provisioning, components and templates
  • Support from the same Microsoft team that answers for Azure

That is, Azure Stack is charged by consumption, not by licensing, in a "Pay-As-You-Use" model, based on the table below:



Based on that, an A2 VM, for example, which costs about US$130/month in Microsoft Azure, goes for about US$40/month in Azure Stack.
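A rough comparison can be sketched from those figures; the per-VM datacenter overhead below is an assumption purely for illustration, not a number from the text:

```python
# A2 monthly prices from the text (US$).
azure_monthly = 130.0
stack_monthly = 40.0

# Hypothetical share of power/cooling/support/admin per VM (assumed).
dc_overhead_per_vm = 60.0

stack_total = stack_monthly + dc_overhead_per_vm
print(stack_total)                  # 100.0
print(stack_total < azure_monthly)  # True: still cheaper in this sketch
```

Whether the Stack price actually wins depends entirely on how much of that datacenter overhead your environment already absorbs, which is the point of the next paragraphs.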

Of course, you must add the TCO of the datacenter infrastructure, hardware warranty and support, power and administration, costs that do not exist in Microsoft Azure.

Even so, for large environments that already have a datacenter it becomes an advantageous option, since many of these costs are already absorbed.

And if the customer does not want to pay for consumption?

It is also possible to pay per core, but personally I see no advantage, because the cost increases for the following reasons:

  • In the variable "Pay-As-You-Use" model, scalability is also reflected in a lower price when the load decreases
  • In the disconnected model it is necessary to pay separately for Windows and SQL Server licensing, which is built into the "Pay-As-You-Use" model
  • In the disconnected model the annual payment is upfront


All Azure services are available in the Azure Stack?

Not yet, but as you can see in the price table, the most important ones are.

For example, some VM types, such as the G series, cannot run on the Stack, and the same goes for some high-capacity services such as Machine Learning and Cognitive Services.

It is possible to create plans and join different solutions to create complex workloads, as documented in


Azure has become Microsoft's flagship product, and Azure Stack's integration between public and private clouds truly makes it a unique experience!

Visit the link and learn product details: