
This site is an automatic translation of http://www.marcelosincic.com.br; the original is in Portuguese.

System Center 2019 and Windows Server 2019 – Upgrade in place

As you may know, System Center has been released in its new version and now follows the same Current Branch concept as Windows. From now on, versions will be identified by a number that indicates the release:

[Image: Roadmap]

The 2019 version of the suite brings no major changes to layout or core functionality, but it adds several new features.

Today the current version is 1801; System Center 2019 corresponds to version 1901, and the expected launch date is March.

These new features can be viewed at the following link: https://thesystemcenterblog.com/2018/09/25/whats-new-in-system-center-2019/

System Center Configuration Manager Upgrade

Since the 2016 version, SCCM upgrades are a native, in-console feature. The process has always been very stable and easy to perform, and it is available under Administration -> Updates and Servicing:

[Screenshot: Upgrade SC (10)]

Once the upgrade has started, you can use the top bar menu to follow the whole installation step by step:

[Screenshot: Upgrade SC (1)]

Remember that it is not possible to interact with the upgrade after it has started; however, if you chose to leave the new features disabled, you can later use the Features option in the menu shown in the first image to enable any of them.

Personally, I always prefer to install upgrades without selecting any features and then enable the ones I want afterwards, so I can study the impact and the real need for additional components running on the server.
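
As a reference, a minimal sketch of how the same in-console update list can be checked from the ConfigurationManager PowerShell module (assuming the console is installed locally and that the Get-CMSiteUpdate cmdlet of recent module versions is available):

# Sketch only: list the updates shown under Administration -> Updates and Servicing
Import-Module "$($ENV:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location "$((Get-PSDrive -PSProvider CMSite).Name):\"   # switch to the site's PowerShell drive
# State is a numeric status code (e.g. ready to install, installed)
Get-CMSiteUpdate -Fast | Select-Object Name, State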

System Center Service Manager Upgrade

Also simple to complete: insert the SCSM media and the installer enters upgrade mode, where you select which of the local servers is being updated. Remember that it is important to know your structure in order to choose the correct server role being updated, in my case the Management Server:

[Screenshot: Upgrade SC (2)]

[Screenshot: Upgrade SC (6)]

The update is very smooth, and at the end it is already running. The new self-service portal now offers the HTML5 experience without the need for additional components:

[Screenshot: Upgrade SC (9)]

System Center Operations Manager Upgrade

Microsoft has really learned how to make system upgrades with System Center transparent, fast, and efficient. The same goes for SCOM.

Similar to SCSM, just insert the media and run the installer in upgrade mode:

[Screenshot: Upgrade SC (3)]

[Screenshot: Upgrade SC (8)]

The warning message on the screen above also existed in previous versions. Because System Center installers do not ask for a product key, in some of the products it is necessary to enter the key later.

To enter the key, open the SCOM PowerShell (Operations Manager Shell) and use the command below. Remember that since the 2012 version the System Center installation key is the same for the entire suite:

Set-SCOMLicense -ProductId 'xxxxx'
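
A minimal sketch of the full flow, assuming the Operations Manager Shell is run elevated on a management server; restarting the Data Access service (OMSDK) and checking SkuForLicense afterwards are commonly recommended steps, not something the installer does for you:

# Run from an elevated Operations Manager Shell (key value elided as in the line above)
Set-SCOMLicense -ProductId 'xxxxx'
# Restart the System Center Data Access Service so the change takes effect
Restart-Service OMSDK
# Should return "Retail" once the key has been applied
(Get-SCOMManagementGroup).SkuForLicense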

System Center Orchestrator and Virtual Machine Manager Upgrade

To upgrade SCO I had to uninstall the server first. The reason, in my case, was that I had installed a beta update in the middle of the year, which makes the automatic upgrade impossible.

In these cases, uninstall the server with the Retain Database option enabled; although the example here is from Orchestrator, SCVMM is similar:

After uninstalling the previous version, or even for a refresh, redo the installation with the option to use an existing database:

[Screenshot: Upgrade SC (7)]

[Screenshot: Upgrade SC (5)]

[Screenshot: Upgrade SC (12)]

This way, the installation of both System Center Orchestrator and Virtual Machine Manager finishes with the existing data preserved.

In many cases, Orchestrator and Virtual Machine Manager stop in the middle of the installation with a generic database error and the message: "DBSetup.exe fails with unknown error 0x800A0E7A"

If this happens in your case, download and install SQL Server 2012 Native Client – QFE available at https://www.microsoft.com/en-us/download/details.aspx?id=50402
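
Before reinstalling, you may want to confirm whether the client is already present. A small sketch, assuming the usual registry location of the SQL Server Native Client 11.0 (the path and value name are assumptions to adjust if needed):

# Quick check: does SQL Server 2012 Native Client (v11.0) appear in the registry?
$snac = Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server Native Client 11.0\CurrentVersion' -ErrorAction SilentlyContinue
if ($snac) { "Native Client $($snac.Version) installed" } else { "Native Client 11.0 not found - install the QFE" }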

Upgrading Windows Server 2019 with System Center Services

On some of the servers, before upgrading Windows, I upgraded System Center.

That's because System Center 2019 is compatible with Windows Server 2012 R2, but not the other way around. This means it is safer to first upgrade the services and then the operating system, which is also compatible.

[Screenshot: Upgrade SC (11)]

Conclusion

Upgrading your System Center servers is a stable process, but be sure to always have a backup of the databases in case a problem occurs during any of these phases.

It is also important to remember the order rules: in general, the Management Servers are upgraded before the other roles.


Operations Management Suite (OMS) is now Azure Monitoring

For some time, OMS has been a tool that I always bring up with clients and at events.

It is a very good product, with rich analytics, and it has evolved a lot in the last year, becoming the product that many believe will replace System Center in the future.

What has changed in the interface?

The previous interface was simpler and lived in a separate portal, as shown in the post below:

https://msincic.wordpress.com/2017/10/15/acquiring-and-licensing-the-azure-who-operations-management-suite/

Now the interface is integrated into the Azure portal and allows you to create new dashboards easily. In addition, it is possible to access each of the monitors individually.

[Screenshot]

[Screenshot]

With this integration into the Azure interface it has become much easier and more functional.

And what about the licensing?

In the earlier post about OMS, we discussed how the purchasing model was complex, since each module was part of a bundle and each bundle of solutions was paid for separately. There was the option to buy per node or per log upload, but the upload model had limitations on which solutions and modules were available.

Now it is much easier: there is only one charging model, based on data upload.

That is, you can now pay for the size of the logs you send, which is much more practical and simple!
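
Since billing is now driven by ingestion volume, it helps to know what you are actually sending. A minimal sketch, assuming the Az.OperationalInsights module and an authenticated session (Connect-AzAccount); "<workspace-id>" is a placeholder for your workspace GUID:

# Billable ingestion per data type over the last 30 days (sketch)
$query = 'Usage
| where TimeGenerated > ago(30d) and IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024 by DataType
| order by IngestedGB desc'
Invoke-AzOperationalInsightsQuery -WorkspaceId '<workspace-id>' -Query $query |
    Select-Object -ExpandProperty Results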

https://azure.microsoft.com/en-us/blog/introducing-a-new-way-to-purchase-azure-monitoring-services/

[Screenshot]

If you have not been using Log Insights because you did not understand how to pay for it, it has now become simple and much cheaper!

Microsoft ATA – Recovery and Migration

We have already talked about Microsoft ATA (Advanced Threat Analytics) at https://msincic.wordpress.com/2018/02/26/microsoft-advanced-thread-analytics-ata/

Now there has been a major upgrade with version 9, which made ATA lighter in its resource requirements and in the display of reports.

However, during the migration it is possible that the connection to MongoDB is lost, making it necessary to perform a backup and restore.

The same process may be required when switching ATA servers.

Important: the Windows Security Log data is processed by machine learning to generate the incidents and alerts, but it is hosted locally. So if you lose the server, you will no longer have the reports and incidents already recorded.

Performing ATA Backup

To back up the ATA configuration, use a copy of the SystemProfile_yyyymmddhhmm.json file located in a Backup subdirectory of the ATA installation folder, where the last 300 copies of the data are kept.

This SystemProfile file is the MongoDB database exported in JSON format, which eliminates the need to back it up with Atlas or another MongoDB-specific administration tool. This is very good, since MongoDB administration knowledge is not common.

For this to work, you must also have a copy of the certificate used to encrypt the JSON file, which is generated during installation (self-signed).

The certificate copy only needs to be done once: open the MMC console with the Certificates snap-in and find the ATA Center certificate in the Personal certificate store on Local Machine.
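
A small sketch of both steps in PowerShell. The ATA installation path, the certificate subject filter and the backup share are assumptions to adjust for your environment:

# Copy the newest SystemProfile backup (path assumed as the default ATA Center install location)
$backupDir = 'C:\Program Files\Microsoft Advanced Threat Analytics\Center\Backup'
Get-ChildItem "$backupDir\SystemProfile_*.json" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 |
    Copy-Item -Destination '\\backupserver\ata\'          # hypothetical backup share

# Export the ATA Center certificate with its private key (subject filter is an assumption)
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like '*ATA Center*' }
$pwd  = Read-Host -AsSecureString -Prompt 'PFX password'
Export-PfxCertificate -Cert $cert -FilePath '\\backupserver\ata\ATACenter.pfx' -Password $pwd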

With these steps we have a backup of the server configuration, which consists of the JSON file and the certificate. But what about the ATA data?

To back up the ATA data it is necessary, as already mentioned, to know the MongoDB tools, and you may want to consider whether you really need the old alerts and incidents once they have already been resolved.

If you need to keep alerts and incidents, follow the document at https://docs.mongodb.com/manual/core/backups/ on how to back up the database.

Performing ATA Restore

Restoring ATA to a new server, or setting up a new version, is a bit more complicated than the backup, which is quite simple.

You must first import the certificate exported earlier into the same certificate store from which it was exported.

You then need to reinstall the new ATA server with the same name and the previous IP address, and when the installer requests the certificate, disable the Create self-signed certificate option and choose the original certificate instead.

Next, we need to stop the ATA Center service so that we can open MongoDB and import the JSON file with the following commands:

  • mongo.exe ATA
  • db.SystemProfile.remove({})
  • mongoimport.exe --db ATA --collection SystemProfile --file "<JSON File>" --upsert

Note: the first command opens the ATA database, the second removes the default (empty) settings, and the third imports the backed-up configuration.

It is not necessary to re-create the Gateways because they are mapped automatically when you restore the settings.

If you also backed up the MongoDB database, follow the database restore procedure before restarting the ATA service.
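
For convenience, a sketch of the same restore sequence run from PowerShell. The service name "ATACenter", the JSON path and the assumption that the MongoDB binaries are on the path (they normally sit under the ATA installation folder) are all placeholders to adjust:

# Stop the ATA Center service, import the backed-up profile, then start the service again
Stop-Service ATACenter
mongo.exe ATA --eval 'db.SystemProfile.remove({})'
mongoimport.exe --db ATA --collection SystemProfile --file "C:\Restore\SystemProfile.json" --upsert
Start-Service ATACenter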

Reference: https://docs.microsoft.com/en-us/advanced-threat-analytics/disaster-recovery

Windows EOL and SQL 2008 – Extension Options

As is already known, Microsoft's product lifecycle for 2019 includes the end of support for Windows Server and SQL Server 2008 RTM and R2.

[Screenshot]

[Screenshot]

Source: https://support.microsoft.com/en-us/lifecycle/search

Why is it important?

Controlling the support lifecycle of the products that are deployed is a typical problem in large enterprises.

This matter is not of minor importance, since having support end implies:

  • Fixes for new security threats, even those involving software vulnerabilities, are no longer made available for expired systems
  • New features in new products have no guarantee of operation on expired products

The first item is very important. Imagine your company being vulnerable to an attack like the many we have seen, just because ONE SERVER in your environment is out of support!

What do I do if I have products that expire?

Obviously the best option is to migrate ("TO-BE"), but we know that it is not always possible. What can help is to use products such as the Log Insights Service Map ( http://www.marcelosincic.com.br/post/Azure-Log-Insigths-Service-Map.aspx ).

But for those who cannot upgrade, one option is to buy support via Premier for another 3 years, which is not cheap but can be negotiated through your Microsoft account team.

The cost to extend support PER YEAR is equivalent to 75% of the full license price of the most current version.

However, Microsoft has offered a very interesting option, which is to migrate to Azure "AS-IS"!

That's right: anyone migrating Windows Server 2008 and SQL Server 2008 to Azure will not have to worry, as they will have free support for an additional 3 years.

https://azure.microsoft.com/en-us/blog/announcing-new-options-for-sql-server-2008-and-windows-server-2008-end-of-support/

We hardly need to point out that this is a strategy to increase the use of Azure, but it is financially very good for whatever workloads you have.

[Screenshot]

Assisted Office 365 and Azure Adoption with FastTrack

When converting new customers from on-premises products to online products, we always face the initial impact of migration.

If the customer purchased in CSP (Cloud Solution Provider) mode, the initial configuration is performed entirely by the partner, and the data migration in general is also included as a service. After all, it is important to remember that in the CSP model the partner holds the account, because it is a managed model.

In the Licensing Partners model, either with MPSA or Enterprise Agreement (EA), the owner of the account and the tenant is the client. This means that it is up to the customer to create the tenant, enable the services, configure and migrate the data.

How do you kick off Office 365 without pain and with the best structure?

The obvious answer would be to hire a Microsoft services partner specializing in Office 365 to handle the whole process, but often that is not what happens.

In these cases, FastTrack can be engaged.

What is Microsoft FastTrack?

In basic terms, FastTrack is a website containing an entire repertoire of tools for those who already have, or have acquired, Office 365 through a direct contract (MPSA or EA).

https://fasttrack.microsoft.com

When you enter the site, you can start by viewing a dashboard of your current state, as below:

[Screenshot]

Note that right in the first part we see the name of my test tenant, the data including some company information, and the FastTrack Manager, Engineer and Architect. Who are these roles?

Some clients, especially at adoption, have the benefit of engaging an MS team to assist with migration planning and execution.

This does not mean that they will execute the work; rather, they will guide and support you in the process of creating the tenant, AD integration (AADSYNC), service configuration, and the migration process itself.

To find out if you are eligible, see "Offers" and "Services":

[Screenshot]

[Screenshot]

The first item "Bids" are not migrations but documentation generated for compliance and filing.

The "Services" item is where you can request that Microsoft engage the team to perform the desired functions.

Note that this covers not only Office 365, but also Windows deployment planning (in this case you need to have a Planning Services voucher) and a partner to help with Windows 10 if you have not yet migrated.

We also have the option for Azure, but it is only available in some countries and the customer needs to consume at least US$ 5,000 a month.

In either case, Microsoft sends you an email with more information and initiates the process according to the type of request.

And if I already have a tenant and am using it, what value does FastTrack offer?

Still, it’s interesting. Go to the https://myadvisor.fasttrack.microsoft.com link

This site has a list of resources where you can download presentations, guides, e-mail templates and educational videos.

The only restriction is that all of the content is in English.

Anyway, tools like the "Network Planner" for validating bandwidth needs are very important at the start.

We can also highlight the videos and documents where we can learn more about resources and the step-by-step of a success story!

Scenario Design (Success Plans)

A very interesting option is the creation of Success Plans that can be seen in the first screen of this post.

When you create a plan and choose the product, you will be guided to a complete checklist where you can choose what you will do and the site will help you walk the right path.

A very welcome aid when we are doing the implementation and do not want to let anything slip through!

[Screenshot]

[Screenshot]

And an interesting feature is that you can access videos to help in the adoption of the desired product by end users.

[Screenshot]

Conclusion

Whether you are just deploying, already running a few of the products, or evolving the environment, FastTrack will be a huge help on the road to success!

Azure Log Insights – Service Map

Many are familiar with Log Insights, which was once called Operations Management Suite.

In this post I will highlight one of the many Log Insights plug-ins (called Solutions in the portal): the Service Map.

NEED

Migrating a datacenter is not just about moving servers from one side to the other; it is often necessary to migrate environments by application profile.

The goal in these cases is to know which servers should be migrated together so as not to have communication problems, both within the same application and between the service and its clients.

The problem is often being able to map this, because few companies have an application map listing the servers and services used by each application, mainly web applications and databases.

SOLUTION

The Log Insights Service Map solution solves this problem!

It maps all communications performed by servers with the agent installed and builds a complete usage map detailing ports, names and services, allowing drill-down to view the connections and a detail pane for each selected item.

Here are some screenshots I use to demonstrate the feature:

[Screenshot]

View of the services on one of the servers and details of the selected server. Note that on the left side you can see the server details bar, mapped from the other active solutions in your Log Insights.

[Screenshot]

Details of one of the servers that communicates with the host, with details of the communication and the server.

[Screenshot]

By opening the selected server on the previous screen I can see the details of it, including now the desktops and other servers that also use the selected target.

[Screenshot]

Viewing the communication details between the target server and the server with SQL Server where we can see the SQL communications for authentication, since the target is my Domain Controller.

[Screenshot: Group]

Here we can see the concept of groups, where the servers included in a group are mapped; this can be used to create the map of a given application.

Based on the above graph, I can see that the T110 host has two main VMs that communicate constantly with all clients and with each other.

If you were going to create a migration plan for my environment, you would already know that these are the two main VMs that need to be brought up together during the migration.

USING THE SERVICE MAP

To use the Service Map you obviously must already have a Log Analytics workspace enabled and add the Solution to it.

The data collection is not performed by the normal Log Insights agent; you need to download a specific agent, which can be found at the link below:

https://docs.microsoft.com/en-us/azure/monitoring/monitoring-service-map-configure

Shortly after installing the Service Map agent, you will already be able to view the maps and use groups.
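
For Azure VMs, the same setup can be scripted. A sketch with the Az modules; the resource group, workspace, VM names and the extension version below are placeholders for your own environment (on-premises servers still need the manual agent download from the link above):

# Enable the Service Map solution on the Log Analytics workspace
Set-AzOperationalInsightsIntelligencePack -ResourceGroupName 'rg-monitor' `
    -WorkspaceName 'my-workspace' -IntelligencePackName 'ServiceMap' -Enabled $true

# Push the Dependency Agent (the Service Map agent) to an Azure VM as an extension
Set-AzVMExtension -ResourceGroupName 'rg-servers' -VMName 'srv01' -Location 'eastus' `
    -Name 'DependencyAgentWindows' -Publisher 'Microsoft.Azure.Monitoring.DependencyAgent' `
    -ExtensionType 'DependencyAgentWindows' -TypeHandlerVersion '9.5'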

Important: the Service Map only displays data for a maximum window of 1 hour, so it is a portal for immediate viewing; it has no history or analytical reports.

Full reference: https://docs.microsoft.com/en-us/azure/monitoring/monitoring-service-map

Controlling Costs in Azure with Cloudyn

Much has been said about Microsoft's purchase of Cloudyn and how it would be integrated into Azure's cost management.

The truth is that before Cloudyn, Azure had few good tools for managing costs, covering things such as:

  • Detail of costs and pre-defined periods (day, week, month, year, etc.)
  • Comparison between costs and planned budget
  • Higher costs
  • "Orphan" or expired objects
  • Others…

It was possible to use Power BI, but it required very thorough knowledge of the data layer that Azure exported, leaving most customers without a good option.

With that in mind, when it bought Cloudyn, Microsoft made the tool available for free (some additional features are paid); it fulfills these tasks and adds several practical reports.

Installing and Configuring Cloudyn

The installation is nothing more than adding an application that exists in the Azure Marketplace, named Cost Management; if you search for Cloudyn it will also appear:

[Screenshot]

[Screenshot]

Enter the notification data and the business model that you use, usually one of the first two (EA or CSP). The Individual option is for those who use OPEN, credit card, or MSDN subscriptions, as in my case:

[Screenshot]

The following screen requests the data needed to find the subscriptions, in my case the MSDN offer and my Azure tenant, which can be found in the portal under Subscriptions:

[Screenshot]

From there, Cloudyn finds all the subscriptions associated with your user and links them:

[Screenshot]

[Screenshot]

Using Cloudyn’s Budget Reports

Important: Data may take 3 to 4 days to be populated.

Reports are the high point of the tool; the analytical cost reports based on budget are excellent.

[Screenshot]

[Screenshot]

For these reports to work, it is important to create the budget in the "Projection and Budget" option:

[Screenshot]

From there it is already possible to extract the Projected vs. Used reports, which address the biggest pain point of Azure customers today.

Detailing Consumption and Optimizations

Cloudyn’s initial dashboard is instructional and informative in and of itself:

[Screenshot]

In the Asset Controller it is possible to see a summary of the resources we have and how they have evolved:

[Screenshot]

One of the most important features is the Optimizer, where we can see orphaned resources or over-allocations, along with the cost-saving hints Cloudyn provides.

Note that in my case there are 2 disks that are not attached to any VM, i.e. storage being paid for without being used:

[Screenshot]

[Screenshot: Disks]
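
For reference, a similar check can be done directly with Azure PowerShell by listing managed disks whose ManagedBy property is empty, i.e. not attached to any VM (a sketch; it covers managed disks only, not loose VHD blobs in storage accounts):

# List managed disks not attached to any VM - the same "orphan disk" situation Cloudyn flags
Get-AzDisk |
    Where-Object { -not $_.ManagedBy } |
    Select-Object Name, ResourceGroupName, DiskSizeGB, @{n='Sku';e={$_.Sku.Name}}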

Browsing the menus and running the reports, we find a very interesting one, the Cost Navigator, where we can look at several periods and break down the costs within each period:

[Screenshot]

And above all, as mentioned in the previous topic, compare my budget with what was actually spent:

[Screenshot]

Some other reports that I did not cover here are also interesting.

CONCLUSION

It is worth installing and using this tool; its cost in your environment is minimal compared to the quality of the data presented.

It is also worth remembering that in many cases it is useful to use tags to separate resources into groups, if necessary.

However, even without tags it is possible to use filters in the reports to obtain more specific data.