
This site is automatic translation of http://www.marcelosincic.com.br, original in portuguese

Microsoft ATA - Recovery and Migration

We have already talked about Microsoft ATA (Advanced Threat Analytics) at https://msincic.wordpress.com/2018/02/26/microsoft-advanced-thread-analytics-ata/

Now there has been a major upgrade with version 9, which made ATA lighter in its resource demands and improved the display of reports.

However, during the migration the connection to MongoDB may be lost, making a backup and restore necessary.

The same process may be required when switching ATA servers.

Important: The Windows Security Log data is sent to Machine Learning to generate the incidents and alerts, but the data is hosted locally. So if you lose the server, you will no longer have the reports and incidents already recorded.

Performing ATA Backup

To back up the ATA configuration, copy the SystemProfile_yyyymmddhhmm.json file located in the Backup subdirectory of the ATA installation folder; ATA keeps the last 300 copies of this file there.

This SystemProfile file is the MongoDB database in JSON format, eliminating the need to back up using Atlas or other MongoDB-specific administration tools. This is very good, since MongoDB administration skills are not common.
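
Since this is just a file copy, it can be scripted. A minimal sketch in Python, with hypothetical folder paths (the real installation path varies per environment), that copies the most recent SystemProfile backup to a safe location:

```python
import glob
import os
import shutil

def backup_latest_systemprofile(ata_backup_dir, dest_dir):
    """Copy the most recent SystemProfile_*.json backup to a safe location."""
    candidates = glob.glob(os.path.join(ata_backup_dir, "SystemProfile_*.json"))
    if not candidates:
        raise FileNotFoundError("no SystemProfile backups found")
    # File names embed a yyyymmddhhmm timestamp, so a lexical sort is chronological.
    latest = max(candidates)
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, os.path.basename(latest))
    shutil.copy2(latest, dest)
    return dest
```

Scheduling something like this keeps an off-server copy of the configuration, which is exactly what the restore procedure below will need.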

For the restore to work, you must also have a copy of the certificate used to encrypt the JSON file, which is generated during installation (self-signed).

The certificate export only needs to be done once: open the MMC console with the Certificates snap-in and find the ATA Center certificate in the Personal certificate store on Local Machine.

With these steps we have a backup of the server configuration, which consists of the JSON file and the certificate. But what about the ATA data?

To back up the ATA data it is necessary, as already mentioned, to know the MongoDB tools, and you should consider whether you really need the old alerts and incidents, since they have already been resolved.

If you need to keep alerts and incidents, follow the document at https://docs.mongodb.com/manual/core/backups/ on how to back up the database.

Performing ATA Restore

The restore part of ATA, onto a new server or a new version, is a bit more complicated than the backup, which is quite simple.

First, import the certificate exported earlier into the same certificate store from which it was exported (Personal, on Local Machine).

You then need to reinstall the ATA server with the same name and IP as before, and when the installer requests the certificate, disable the Create self-signed certificate option so you can choose the original certificate.

Next, we need to stop the ATA Center service so that we can open MongoDB and import the JSON file with the following commands:

  • mongo.exe ATA
  • db.SystemProfile.remove({})
  • mongoimport.exe --db ATA --collection SystemProfile --file "<JSON file>" --upsert

Note: The first command opens the ATA database, the second clears the (still empty) SystemProfile collection, and the third imports the configuration from the backup file.

It is not necessary to re-create the Gateways because they are mapped automatically when you restore the settings.

If you have backed up the MongoDB database, follow the base restore procedure before restarting the ATA service.

Reference: https://docs.microsoft.com/en-us/advanced-threat-analytics/disaster-recovery


Windows and SQL Server 2008 EOL - Extension Options

As is already known, Microsoft's product lifecycle for 2019 includes the end of support for Windows Server 2008 and SQL Server 2008, both RTM and R2.

image (1)

image

Source: https://support.microsoft.com/en-us/lifecycle/search

Why is it important?

This is a typical problem in large enterprises: controlling the support lifecycle of the products they have deployed.

This is no small matter, since the end of support implies:

  • Fixes for new security threats, even those involving software vulnerabilities, are no longer provided for expired systems
  • New features in new products have no guarantee of working with expired products

The first item is very important. Imagine that your company is vulnerable to an attack like the many we have seen, because just ONE SERVER in your environment is expired!

What do I do if I have products that expire?

Obviously the best option is to migrate ("TO-BE"), but we know that is not always possible. What can help is to use products such as the Log Analytics Service Map ( http://www.marcelosincic.com.br/post/Azure-Log-Insigths-Service-Map.aspx ).

But for those who cannot upgrade, one option is to buy extended support via Premier for another 3 years, which is not cheap but can be negotiated through your Microsoft account team.

The cost to extend support PER YEAR is equivalent to 75% of the full license price of the most current version.
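
As a quick illustration of that rule (with a made-up license price, since real quotes come from your account team), the total for the full extension works out like this:

```python
def extended_support_cost(current_license_price, years):
    """Extended support priced at 75% of the current full license, per year,
    as described above. Figures here are purely illustrative."""
    return 0.75 * current_license_price * years

# Example: a license whose current version costs $1,000 would cost
# $2,250 to keep on extended support for the full 3 years.
total = extended_support_cost(1000, 3)
```

In other words, three years of extended support costs more than two full licenses of the current version, which is what makes the Azure option below so attractive.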

However, Microsoft has offered a very interesting alternative: migrate to Azure "AS-IS"!

That's right: anyone migrating Windows Server 2008 and SQL Server 2008 to Azure will not have to worry, as they get free extended support for an additional 3 years.

https://azure.microsoft.com/en-us/blog/announcing-new-options-for-sql-server-2008-and-windows-server-2008-end-of-support/

We need not even argue that this is a strategy to increase the use of Azure; it is financially very good for whatever workload you have.

tela1

Assisted Office 365 and Azure Adoption with FastTrack

When converting new customers from on-premises products to online products, there is always the initial impact of migration.

If the customer purchased in the CSP (Cloud Solution Provider) model, the initial configuration is performed entirely by the partner, and the data migration is generally also included as a service. After all, it is important to remember that in the CSP model the partner holds the account, because it is a managed model.

In the direct licensing model, with MPSA or Enterprise Agreement (EA), the owner of the account and the tenant is the customer. This means that it is up to the customer to create the tenant, enable the services, configure them and migrate the data.

How do you kick off Office 365 without pain and with the best structure?

The obvious answer would be to hire a Microsoft services partner specializing in Office 365 to handle the whole process, but often that is not what happens.

In these cases, FastTrack can be triggered.

What is Microsoft FastTrack?

In basic terms, FastTrack is a website containing an entire repertoire of tools for those who already have or have acquired Office 365 under a direct contract (MPSA or EA).

https://fasttrack.microsoft.com

When entering the site you can start by viewing a dashboard of your current state, as below:

Tela1

Note that in the first part we see the name of my test tenant, the data including some company information, and the FastTrack Manager, Engineer and Architect. Who are these figures?

Some clients, especially during adoption, have the benefit of engaging a Microsoft team to assist with migration planning and execution.

This does not mean that they will execute the migration for you; rather, they will guide and support you in the process of creating the tenant, AD integration (AADSYNC), service configuration and the migration process itself.

To find out if you are eligible, see "Offers" and "Services":

Tela2

Tela3

The first item, "Offers", is not about migrations but rather documentation generated for compliance and filing.

The "Services" item is where you can request that Microsoft engage the team to perform the desired functions.

Note that this covers not only Office 365, but also Windows Deployment Planning (in this case you need a Planning Services voucher) and a partner to help with Windows 10 if you have not yet migrated.

We also have the Azure option, but it is only available in some countries and the customer needs to consume at least US$ 5,000 a month.

In either case, Microsoft sends you an email with more information and initiates the process according to the type of request.

And if I already have the tenant and I use it, what value do I have in FastTrack?

Still, it’s interesting. Go to the https://myadvisor.fasttrack.microsoft.com link

This site has a list of resources where you can download presentations, guides, e-mail templates and educational videos.

The only restriction is that all content is in English.

Anyway, tools like the "Network Planner" for validating link requirements are very important for the first phase.

We can also highlight the videos and documents where we can learn more about resources and the step-by-step of a success story!

Scenario Design (Success Plans)

A very interesting option is the creation of Success Plans that can be seen in the first screen of this post.

When you create a plan and choose the product, you will be guided to a complete checklist where you can choose what you will do and the site will help you walk the right path.

Very helpful when we are doing the implementation and do not want to let anything slip!

tela4

tela5

And an interesting feature is that you can access videos to help in the adoption of the desired product by end users.

tela6

Conclusion

Whether it’s deployed, already running on just a few products or evolving the environment, FastTrack will be a huge help to success!

Azure Log Insights-Service Map

Many are familiar with Log Analytics, which was once part of the Operations Management Suite (OMS).

In this post I will highlight one of the many Log Analytics plug-ins (called Solutions in the portal): Service Map.

NEED

Migrating a datacenter is not just about moving servers from one place to another; it is often necessary to migrate environments by application profile.

The purpose in these cases is to know which servers should be migrated together, so as not to have communication problems either within the same application or between the service and its clients.

The problem is often being able to map this, because few companies have an application map listing the servers and services used by each application, especially Web applications and databases.
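
The grouping problem described above amounts to finding connected components in the graph of observed communications: servers that talk to each other, directly or indirectly, belong in the same migration wave. A hypothetical sketch with invented server names, just to show the idea that Service Map automates:

```python
from collections import defaultdict

def migration_groups(connections):
    """Group servers that communicate (directly or indirectly) so each
    group can be migrated together. `connections` is a list of
    (source, target) pairs observed on the network."""
    graph = defaultdict(set)
    for a, b in connections:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        # Depth-first walk collects one connected component.
        group, stack = set(), [node]
        while stack:
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(graph[n] - group)
        seen |= group
        groups.append(sorted(group))
    return groups

# Invented observed traffic: WEB01 talks to SQL01 and APP02; APP01 to SQL02.
groups = migration_groups([("WEB01", "SQL01"), ("APP01", "SQL02"), ("WEB01", "APP02")])
```

Here WEB01, SQL01 and APP02 end up in one group and APP01 with SQL02 in another, so each set can be moved together without breaking communication.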

SOLUTION

The Service Map solution in Log Analytics solves this problem!

It maps all communications performed by servers with the agent installed and builds a complete usage map detailing ports, names and services, allowing drill-down to view the connections, with a detail pane for each selected item.

Here are some screenshots I use to demonstrate the feature:

capture20180405193706451

View of the services on one of the servers and details of the selected server. Note that on the left side you can see the server details bar, mapped from the other active Solutions in your Log Analytics workspace.

capture20180405193730890

Details of one of the servers that communicates with the host, with details of the communication and the server.

capture20180405193730890

By opening the selected server on the previous screen I can see the details of it, including now the desktops and other servers that also use the selected target.

capture20180405193906565

Viewing the communication details between the target server and the server with SQL Server where we can see the SQL communications for authentication, since the target is my Domain Controller.

Grupo

Here we can see the concept of groups, where the servers included in the group are mapped; groups can be used to create the map of a particular application.

Based on the graph above, I can see that the T110 host has two main VMs that communicate with all clients and with each other constantly.

If I were to create a migration plan for my environment, I would already know that these are the two major VMs that need to be activated together in the migration.

USING THE SERVICE MAP

To use the Service Map you obviously must already have a Log Analytics account enabled, and then include the Solution.

Data collection is not performed by the normal Log Analytics agent; you need to download a specific agent, which can be found at the link below:

https://docs.microsoft.com/en-us/azure/monitoring/monitoring-service-map-configure

Soon after installing the Service Map agent you will be able to view the maps and use groups.

Important: The Service Map only retains data for a maximum of 1 hour, so it is a portal for immediate viewing; it has no history or analytical reports.

Full reference: https://docs.microsoft.com/en-us/azure/monitoring/monitoring-service-map

Controlling Costs in Azure with Cloudyn

Much has been said about Microsoft's purchase of Cloudyn and how it would be integrated into Azure's cost management.

The truth is that before Cloudyn, Azure had few good tools to manage costs, covering needs such as:

  • Detail of costs and pre-defined periods (day, week, month, year, etc.)
  • Comparison between costs and planned budget
  • Higher costs
  • "Orphan" or expired objects
  • Others…

It was possible to use Power BI, but it required very thorough knowledge of the data layer that Azure exported, leaving most customers without good support.

With that in mind, after buying Cloudyn, Microsoft made the tool available for free (some additional features are paid); it fulfills these tasks and adds several practical reports.

Installing and Configuring Cloudyn

The installation is nothing more than an application that exists in the Azure Marketplace, named Cost Management; if you search for Cloudyn it will also appear:

capture20180306180552915

capture20180306180627270

Enter the notification data and the business model you use, usually one of the first two (EA or CSP). The Individual option is for those who use OPEN, credit card or MSDN subscriptions, as in my case:

capture20180306180730866

On the following screen you will be asked for data to locate the subscriptions; in my case, the MSDN offer and my Azure tenant, which can be found in the portal under Subscriptions:

capture20180306180850256

From there, Cloudyn finds all the subscriptions associated with your user and links them:

capture20180306181608327

capture20180306181707726

Using Cloudyn’s Budget Reports

Important: Data may take 3 to 4 days to be populated.

Reports are the high point of the tool; the analytic cost reports based on budget are excellent.

capture20180306181748042

capture20180306181949093

For these reports to work, it is important to create the budget in the "Projection and Budget" option:

capture20180306182422406

From there it is already possible to extract the Projected vs. Used reports, which address the great pain of Azure customers today.
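
At its core, the Projected vs. Used report is a budget-versus-actual comparison per period. A minimal sketch with hypothetical figures, just to illustrate what the report computes:

```python
def budget_report(budget, actual):
    """Compare monthly budget vs. actual spend and flag overruns.
    Both arguments map month -> amount; all figures are hypothetical."""
    report = {}
    for month, planned in budget.items():
        spent = actual.get(month, 0.0)
        report[month] = {
            "budget": planned,
            "actual": spent,
            "variance": spent - planned,  # positive means over budget
            "over": spent > planned,
        }
    return report

# Invented numbers: March exceeded its budget by 110.
rep = budget_report({"2018-02": 500.0, "2018-03": 500.0},
                    {"2018-02": 420.0, "2018-03": 610.0})
```

Cloudyn does this across all linked subscriptions and renders it graphically, but the underlying comparison is this simple, which is why defining the budget first (previous step) is essential.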

Detailing Consumption and Optimizations

Cloudyn’s initial dashboard is instructional and informative in and of itself:

capture20180312104401184

In Asset Controller it is possible to see a summary of the resources we have and how they have evolved:

capture20180312104510959

One of the most important features is the Optimizer, where we can see orphan resources or over-allocations; these are the cost-saving hints that Cloudyn provides.

Note that in my case it found 2 disks that are not linked to any VM, i.e. I was paying for storage without using it:

capture20180312104525928

Discos
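
The orphan-disk check is conceptually simple: a disk with no owning VM is wasted spend. A sketch over invented resource data (Cloudyn's real detection of course works against the actual Azure inventory):

```python
def find_orphan_disks(disks):
    """Return the names of disks not attached to any VM. Each disk is a
    dict with a 'managed_by' field, None when unattached; the data and
    field name here are hypothetical."""
    return [d["name"] for d in disks if d.get("managed_by") is None]

disks = [
    {"name": "vm1-osdisk", "managed_by": "vm1"},
    {"name": "old-data-disk", "managed_by": None},
    {"name": "temp-disk", "managed_by": None},
]
orphans = find_orphan_disks(disks)
```

In this invented inventory the two unattached disks are flagged, mirroring the two orphan disks Cloudyn found in my subscription above.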

Browsing the menus and running the reports, we find a very interesting one, the Cost Navigator, where we can select various periods and detail the costs in each one:

capture20180312104621971

And mainly, as mentioned in the previous topic, compare my budget with what was actually spent:

capture20180312104736182

Several other reports that I did not cover here are also interesting.

CONCLUSION

It is worth installing and using this tool; its cost in your environment is minimal compared to the quality of the data presented.

It is important to remember that in many cases it is useful to apply TAGs to separate resources into groups, if necessary.

However, even without TAGs it is possible to use filters in the reports to obtain more specific data.

Microsoft Advanced Threat Analytics (ATA)

Many customers I visit have no idea what ATA is, even though they have EMS (Enterprise Mobility + Security) licensing. https://www.microsoft.com/en-us/cloud-platform/advanced-threat-analytics

Understanding the ATA

To better understand what ATA is, we need to remember what behavioral security products are (https://msincic.wordpress.com/2016/07/24/windows-defender-atp-the-new-security-product/).

This type of product is not based on malicious-code signatures downloaded in a DAT file with information about the code to be detected (virus signatures).

In behavioral security services, you analyze trends, common usage patterns and suspicious activities; for example, a user who has never logged on to a server is now an administrator and accesses various machines.
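
The logic of such a detection can be sketched very roughly: compare recent activity against a learned baseline and flag what was never seen before. The names and events below are invented, and a real product like ATA weighs many more signals than this:

```python
def suspicious_logons(baseline, recent):
    """Flag logons to machines a user has never accessed before.
    `baseline` maps user -> set of machines seen historically;
    `recent` is a list of (user, machine) logon events."""
    alerts = []
    for user, machine in recent:
        if machine not in baseline.get(user, set()):
            alerts.append((user, machine))
    return alerts

# Invented baseline and events: alice suddenly logs on to a Domain Controller.
baseline = {"alice": {"WS01"}, "bob": {"WS02", "SRV01"}}
events = [("alice", "WS01"), ("alice", "DC01"), ("bob", "SRV01")]
alerts = suspicious_logons(baseline, events)
```

The point is that no signature file is involved: the alert comes purely from the deviation between observed behavior and the historical pattern.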

Installing the ATA

The installation is very simple because online communication is performed directly with an Azure URL that receives the security log data and processes it with Machine Learning.

To install, just run the installer, which is very simple and intuitive. After installing the server, we can install the Gateway on the Domain Controller that will be analyzed, collecting the security logs.

Once installed, administration is very simple, and it is possible to go further in the settings, informing for example the SID of a user to serve as an intrusion lure, an IP range of vulnerable machines (in a DMZ, for example) and other resources.

Once installed, maintenance is automatic for both the server and the gateways, which are monitored.

Checking AD Security Issues

After a few days it is already possible to see some alerts in the panel; for example, below is a warning that some computers are using a vulnerable encryption level:

capture20170807171826449

capture20170807171926453

capture20170807171951836

capture20170807172020133

This other example is a case of remote execution of commands and scripts by a remote server. Of course, in this case I will dismiss the warning, since it is expected behavior: I have the Honolulu project on the same machine, and it runs WMI commands:

capture20180226144319686

capture20180226144405535

Note that in both cases I can see what happened, which user was involved, and on which server/desktop the suspicious activity occurred.

In addition, the detection history helps us understand whether this is a real incident or just a one-off activity.

Receiving Alerts and Reports

ATA allows you to configure the receipt of alerts and reports with the data.

I can run standalone reports:

capture20170807172144683

Or schedule to receive by email every day, as well as alerts:

capture20170807172207309

How to get the ATA

That is the question many ask. It is important to remember that, as an online product, it can be purchased by anyone who has Microsoft 365 with Security (the new EMS), the old EMS, or else purchased individually.

Remember that, as it is a product linked to O365, the acquisition is per user, even when standalone.

Let’s Talk about the Microsoft Honolulu Project?

The Honolulu project was heavily commented on some time ago, linked to a new Windows graphical interface and functionality.

On December 1 a new Preview and documentation version of Honolulu was released; it is already quite mature, with the final architecture defined.

What is the Honolulu Project?

It is a new MANAGEMENT interface for Windows Server.

This is not a replacement for the Windows Server 2012/2016 Server Manager, but rather an interface based on new protocols, for ease of access and use, in addition to broader management reach.

What are the advantages of Honolulu over Server Manager?

Server Manager is a very good tool, but it is based on local protocols (RPC, WinRM and others) and on a GUI that needs to be installed.

Honolulu is 100% web-based for data access and uses WinRM, WMI and PowerShell for server administration.

With Honolulu it is possible to do things that Server Manager does not do, such as running scripts, Windows Update, administering and monitoring VMs, etc.

On the other hand, Honolulu does not manage as many services as Server Manager, such as File Server, DHCP, DNS, etc. that continue to be managed by the MMC tools.

How to install Honolulu?

The installation is very simple, but you have to define the architecture.

Basically, we can install it on a single server and bind others to administer as nodes, or install a server as a Gateway to access the others and ease traffic when we have many servers in a farm:

deployment

In general, for these tools the ideal is to create a server with modest memory and processing power (the second model in the figure), so as not to burden servers with other roles, since Honolulu creates its own service:

capture20180108110941303

To download Honolulu, because it is still a Preview, you need to use the Windows Server product evaluation page at https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-honolulu

How to manage a server with Honolulu?

Let's go through the basic screens. First we add a server to the list; from there it is possible, from any browser, to see usage graphs, configure items, make remote connections, execute PowerShell commands, etc.

First, let’s add new servers, clusters or even Windows 10 Client:

capture20180108103235350

Next, simply enter the user credentials and choose the server/cluster you want to view:

capture20180108103532804

The level of detail ranges from hardware items to detailed graphs for each of the server/client items being monitored:

capture20180108104007877

Even some items such as physical disks, volumes and Storage Spaces can now be administered in Honolulu:

capture20180108104156585

An interesting feature is that you can manage Windows Update remotely:

capture20180108104311080

Managing VMs in Hyper-V is also one of the highlights, given the level of detail and the intuitive interface:

capture20180108104402669

capture20180108104503812

Finally, follow the Honolulu technical documentation link: https://docs.microsoft.com/en-us/windows-server/manage/honolulu/honolulu