Disable Dedup

Posted by robd on September 23, 2018
powershell, Server 2012 / No Comments

How to disable Dedup:

First, an important point about disabling dedup (via GUI or PowerShell): disabling it only stops further deduplication from occurring, i.e. data that has already been deduplicated will remain deduplicated.

If you want to “move” the data back to the original files and out of the deduplication store (Chunk Store), you need to run a PowerShell un-optimization job:
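For example (a hedged sketch, assuming the deduplicated volume is E:):

# Rehydrate deduplicated data back into the original files
Start-DedupJob -Volume "E:" -Type Unoptimization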

You can check on the progress of this by using:
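For example:

# Show running dedup jobs and their progress, plus the volume's dedup state
Get-DedupJob
Get-DedupStatus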

Here’s another gotcha: the Chunk Store (love that name) will not get smaller until you run two more jobs, GarbageCollection and Scrubbing. GarbageCollection will find and remove unreferenced chunks and Scrubbing will perform an integrity check, but this won’t work unless dedup is on… so enable dedup:
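For example (assuming the volume is E:):

# Re-enable dedup so the clean-up jobs can run
Enable-DedupVolume "E:"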

Then run garbage collection (and scrubbing):
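For example (again assuming E:):

# Remove unreferenced chunks, then check the integrity of what's left
Start-DedupJob -Volume "E:" -Type GarbageCollection
Start-DedupJob -Volume "E:" -Type Scrubbing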

Once your drive is small again then disable dedup:
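For example:

# Stop any further deduplication on the volume
Disable-DedupVolume "E:"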


Microsoft Dedup

Posted by robd on September 22, 2018
Server 2012 / No Comments

I posted about Microsoft Dedup recently and thought I should mention how to set up dedup:

Data deduplication is a feature that allows space reduction on a data volume by removing duplicate copies of data and replacing them with a reference that looks exactly the same to the end user.

Microsoft does not recommend dedup on databases such as .edb, .mdf and .ldf files. This feature helps IT admins reduce storage costs if it’s applied to the right data, such as file shares like home folders.

Below are the recommendations for the dedup feature based on data type:

  • Recommended for deduplication: file servers, VHD files, software repositories, backups and other static data.
  • Not recommended: virtualization hosts, WSUS, database servers or any data that changes very frequently.

Requirements

  • Windows Server 2012 Operating system
  • At least 4GB of RAM
  • 1 CPU core and 350MB of RAM for every 1.5TB worth of data
  • Must be a non-system volume (the system/boot volume cannot be deduplicated)
  • Mapped drives via net use are not supported; it must be a local volume
  • Must be using NTFS with MBR or GPT partition
  • Not supported on ReFS file system

To install the deduplication feature, use Server Manager: Server Roles > File and Storage Services > File Services > Data Deduplication.

Or PowerShell

Run the PowerShell commands below to install and configure the feature (a hedged sketch of each command follows this list):

  • Install the Data Deduplication feature.
  • Turn the deduplication feature on (where E: is the volume).
  • Set the minimum file age before deduplication.
  • Get a list of deduped volumes.
  • Get the dedup status.
  • Start a dedup job manually.
  • Get the current dedup schedule.
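Assuming E: is the data volume, the commands look roughly like this, one per step above (the 5-day minimum file age is just an example value):

# Install the Data Deduplication feature
Install-WindowsFeature -Name FS-Data-Deduplication
# Turn deduplication on for the volume
Enable-DedupVolume "E:"
# Set the minimum file age (in days) before files are deduplicated
Set-DedupVolume "E:" -MinimumFileAgeDays 5
# List deduplicated volumes
Get-DedupVolume
# Get dedup status and savings
Get-DedupStatus
# Start a dedup (optimization) job manually
Start-DedupJob -Volume "E:" -Type Optimization
# Get the current dedup schedule
Get-DedupSchedule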

How to calculate dedup rate

Installing the “Data Deduplication” feature will automatically install DDPEVAL.exe in c:\windows\system32. This tool will allow you to determine whether deduplication would be effective for your data type.

This tool can be copied from any server running Windows Server 2012 R2 or Windows Server 2012 to systems running Windows Server 2012, Windows Server 2008 R2, or Windows 7. You can use it to determine the expected savings that you would get if deduplication is enabled on a particular volume.

To use it:
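A hedged example, pointing it at a folder or volume you want to evaluate (E:\Shares is just an example path):

# Estimate the dedup savings for a path
DDPEval.exe E:\Shares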

More info:

https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/hh831700(v=ws.11)

 

Dedup and Chunk Store is Huge!

Posted by robd on September 21, 2018
powershell, Server 2012 / 1 Comment

Found a drive was running low on space today, and on closer inspection with TreeSize I found that the ChunkStore (brilliant name) was taking up the drive space:

Odd, as it looked like dedup wasn’t working:

To fix it I ran the following PowerShell:
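The jobs in question look like this (a hedged sketch, assuming the affected volume is D:):

# Remove unreferenced chunks from the chunk store
Start-DedupJob -Volume "D:" -Type GarbageCollection
# Validate the integrity of the remaining chunks
Start-DedupJob -Volume "D:" -Type Scrubbing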

What does this do, I hear you say? Garbage collection is the process that removes “data chunks” that are no longer referenced, i.e. it removes references to deleted files and folders. This process deletes content to free up additional space. Data scrubbing checks integrity and validates the checksum data.

To monitor it I ran:
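For example:

# Show running dedup jobs and their progress
Get-DedupJob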

This seems to have fixed it for me:


Wireless – Insulation

Posted by robd on September 06, 2018
Wireless / No Comments

Hello,

I’ve been meaning to post something on wireless for a while, actually since I gained my CCNA Wireless cert, but I’ve not really been sure what to post… until now.

I install quite a lot of Cisco wireless in factories and, although I’m new to Ekahau (any complimentary Ekahau training would be awesome), I recently had the opportunity to test the attenuation of some insulation.

The kit I used to test was:

I tested two types of standard foam insulation that I currently can’t name, but here are the results.

Here’s the free-space path loss from the AP to the Sidekick:

Here’s with some insulation in the way:

So, for the first piece of insulation:

free-space reading = -46 dBm

insulation = 2.4 m in length, 2 m in depth and 3 m in height

reading through the insulation = -45 dBm

= 1 dB of loss!

 

Second piece of insulation:

free-space reading = -58 dBm

insulation = 2.4 m in length, 2 m in depth and 3 m in height

reading through the insulation = -52 dBm

= 6 dB of loss.

 

So, all in all, I’d say that depending on the insulation there can be quite a lot of attenuation.

Citrix and vCentre

Posted by robd on September 05, 2018
Citrix, vmware / No Comments

Annoyingly, our vCenter broke recently, meaning our Citrix clients wouldn’t boot, which had the knock-on effect that users couldn’t log on.

To easily check the connection status between Citrix and vCenter, you can run the following PowerShell on a Citrix Delivery Controller (or wherever the Citrix PowerShell snap-ins are installed):
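A hedged sketch of the kind of check involved (Get-BrokerHypervisorConnection shows the hypervisor connection State):

# Load the Citrix snap-ins and check the hypervisor (vCenter) connection
Add-PSSnapin Citrix.*
Get-BrokerHypervisorConnection | Select-Object Name, State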

This is what it looks like when it’s broken; notice the State:

After fixing vCenter and rebooting the Citrix server, it then looks like this:

 


Office 365 to Exchange 2010 on prem calendar free/busy information

Posted by robd on August 22, 2018
exchange 2010 / No Comments

Hello,

I’ll preface this post by saying that a man from Exchange support said “This is the most complicated Exchange environment I’ve ever seen”.

That said, this issue is pretty common and hopefully this post will help someone else.

We have an Exchange 2010 to 365 hybrid environment that looks a bit like this:

We had an issue where users on our 365 tenancy couldn’t see the Exchange 2010 on-premises free/busy info for users in Group2.contoso.com.

Now I know what you’re thinking: just compare the settings on Group1 to Group2. Well, due to company rules and politics I can’t… I can only troubleshoot Group2 and the servers there.

So first things first: check user permissions, set up a test user and find the error in Outlook:

Thanks to Babunski and his post, I found this really good troubleshooting guide, and everything looked OK:

https://support.microsoft.com/en-us/help/10092/troubleshooting-free-busy-issues-in-exchange-hybrid-environment

  • Firewall is fine,
  • Network is ok,
  • DNS, surprisingly, is working,
  • Check Exchange online tool:

https://www.testexchangeconnectivity.com

  • 365 to on-prem relationship is OK:

  • IIS Logs look ok, %SystemDrive%\inetpub\logs\LogFiles

  • EWS logs look ok, %SystemDrive%\inetpub\logs\LogFiles

  • Checked the external URL – seems ok.

  • Checked the IIS permissions – these looked OK

  • Checked IIS EWS and Autodiscover:

  • Checked more relationship stuff – all OK (a hedged sketch of these checks follows this list)
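The kind of checks involved look roughly like this (a hedged sketch using standard Exchange cmdlets, not necessarily the exact commands we ran; the domain name is the example one used elsewhere in this post):

# On-premises Exchange 2010: federation info and the organization relationship used for free/busy
Get-FederationInformation -DomainName contoso.com
Get-OrganizationRelationship | Format-List Name, DomainNames, TargetApplicationUri, TargetAutodiscoverEpr, FreeBusyAccessEnabled, FreeBusyAccessLevel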

Next – contact support!  🙁

 

Before I contacted support I did find one more URL, which suggests checking the certs and importing the cert you used to set up the federation onto the CAS server; unfortunately this didn’t work for us:

https://support.microsoft.com/en-gb/help/3057905/exchange-online-users-cannot-access-free-busy-information-of-users-in

 

Soooo here I am, time to contact support.

 

The first thing they checked was the local URL on the Client Access server:

https://ClientAccessGroup2Server1.group.contoso.com/ews/exchange.asmx

So there’s an issue: basically we didn’t add the server to our wildcard cert. So we added the server names as Subject Alternative Names and imported the certificate using PowerShell onto both Client Access servers and then rebooted:
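The import looked something like this (a hedged sketch; the PFX path is a placeholder):

# Import the updated SAN/wildcard certificate and enable it for IIS on each CAS
Import-ExchangeCertificate -FileData ([System.IO.File]::ReadAllBytes('C:\certs\wildcard.pfx')) -Password (Read-Host -AsSecureString) | Enable-ExchangeCertificate -Services IIS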

Fixed:

Checked the URLs set in Exchange:
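For example (a hedged sketch; the server name comes from the EWS URL above):

# List the EWS URLs on the Group2 Client Access server
Get-WebServicesVirtualDirectory -Server ClientAccessGroup2Server1 | Format-List InternalUrl, ExternalUrl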

Our internal URL was actually set to the Client Access array for Contoso rather than group2.contoso.com, so we changed this:
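Something along these lines (a hedged sketch; the identity and URL are based on the hostnames above):

# Point the EWS internal URL at the Group2 server rather than the Contoso CAS array
Set-WebServicesVirtualDirectory -Identity 'ClientAccessGroup2Server1\EWS (Default Web Site)' -InternalUrl 'https://ClientAccessGroup2Server1.group.contoso.com/ews/exchange.asmx'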

And rebooted again.

 

Next we disabled and re-enabled IIS security (this broke OOF for a while; we had to run this twice):
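My understanding is this refers to WS-Security authentication on the EWS virtual directory (an assumption on my part); a hedged sketch of toggling it off and back on:

# Disable, then re-enable, WSSecurity on EWS (repeat for Autodiscover if needed), then restart IIS
Set-WebServicesVirtualDirectory -Identity 'ClientAccessGroup2Server1\EWS (Default Web Site)' -WSSecurityAuthentication $false
Set-WebServicesVirtualDirectory -Identity 'ClientAccessGroup2Server1\EWS (Default Web Site)' -WSSecurityAuthentication $true
iisreset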

So here we are, stuck…

 

MS ran some traces using ExTRA (the Exchange Troubleshooting Assistant):

 

And went away for a while and came back with:

The internet-facing site Contoso.com is able to look up the user and send a request to the Group2 servers.

The request failed with an empty response.

On the Group2 server we noticed the error below:

 

So the long and short of it is they think IIS is broken. The traffic is being passed to the Group2 services but these services are not passing the information back up the stream.

 

MS decided they wanted to swap out the EWS web.config with a new one from:

 

The reason being that in the config file it was referencing:

And it should be referencing (or wherever your install of Exchange is):

And another reboot.

 

Next we checked the logging from Outlook:

Which dumps files to: %Temp%\outlook logging

And they found this error:

 

This prompted MS to check the IIS bindings, which were wrong.

So we added some missing bindings using these commands:
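Along these lines, run from an elevated command prompt (a hedged sketch; the binding information shown is the usual net.tcp/net.pipe default, which is an assumption here):

%windir%\system32\inetsrv\appcmd.exe set site "Default Web Site" /+bindings.[protocol='net.tcp',bindingInformation='808:*']
%windir%\system32\inetsrv\appcmd.exe set site "Default Web Site" /+bindings.[protocol='net.pipe',bindingInformation='*']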

No change and still the same error: Response Code ErrorProxyRequestProcessingFailed

 

So we checked Windows services and, would you believe it, these .NET services were not installed:

Net.Tcp Listener Adapter

Net.Pipe Listener Adapter

 

So we installed the missing features:
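A hedged sketch (these feature names are for Windows Server 2012/2012 R2; on Server 2008 R2 the equivalent is the .NET Framework “Non-HTTP Activation” feature):

# Install WCF non-HTTP activation so the Net.Tcp/Net.Pipe Listener Adapter services exist
Install-WindowsFeature NET-WCF-TCP-Activation45, NET-WCF-Pipe-Activation45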

Rebooted both.

And Boom we are working!!!!!!!!!!!!!!!!!!!!!


Ratio of Physical CPUs to Virtual CPUs in VMware

Posted by robd on August 06, 2018
powershell, vmware / 1 Comment

My colleague Welsh Dai made this sweet bit of PowerShell to see the ratio of physical CPUs to Virtual CPUs:
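Here’s a minimal sketch of the idea (not Dai’s original script) using VMware PowerCLI, assuming you’re already connected with Connect-VIServer:

# For each host, compare physical cores against vCPUs assigned to powered-on VMs
Get-VMHost | ForEach-Object {
    $cores = $_.ExtensionData.Hardware.CpuInfo.NumCpuCores
    $vcpus = ($_ | Get-VM | Where-Object { $_.PowerState -eq 'PoweredOn' } |
              Measure-Object -Property NumCpu -Sum).Sum
    [pscustomobject]@{
        Host   = $_.Name
        pCores = $cores
        vCPUs  = $vcpus
        Ratio  = if ($cores) { [math]::Round($vcpus / $cores, 2) } else { $null }
    }
} | Format-Table -AutoSize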

 

Here’s a picture


Auditing Active Directory Password Quality

Posted by robd on April 24, 2018
Active Directory, powershell / No Comments

Hi All,

A chap called Michael Grafnetter has created a brilliant PowerShell script to check password hashes in Active Directory against a list of simple or common passwords.

This is great to encourage users not to use obvious passwords; for example, if a company is called Contoso then you’d want to encourage users not to use Contoso1, etc.

Here’s how:

Download the software:

https://github.com/MichaelGrafnetter/DSInternals/releases/tag/v2.22

Copy the DSInternals directory to your PowerShell modules directory, e.g.

Launch Windows PowerShell.
(Optional) If you copied the module to a different directory than advised above, you have to manually import it using the Import-Module .\DSInternals\DSInternals.psd1 command.

Next create a text file called passwords.txt and fill it with the passwords you’d like to scan for, for example:

Then here’s an example script:

First set the password txt file.

Then set the Domain Controller, in this case DC1.

Then set the distinguished name of the OU (and sub-OUs) you want to scan:


$dictionary = Get-Content passwords.txt | ConvertTo-NTHashDictionary
Get-ADReplAccount -All -Server DC1 -NamingContext 'dc=adatum,dc=com' | Test-PasswordQuality -WeakPasswordHashes $dictionary -ShowPlainTextPasswords -IncludeDisabledAccounts

Here’s an output:


365 – Shared Mailbox on a mobile device

Posted by robd on February 06, 2018
Server / 1 Comment

Some users need shared mailboxes on their mobile devices; this can be done via IMAP.

Add an IMAP account:

Add the shared mailbox email:

Choose IMAP

This is the most important section; add the user’s username and the name of the shared mailbox, for example: Rob@DOMAIN.LOCAL/SHARED.MAILBOX


Plex on Ubuntu

Posted by robd on December 11, 2017
Linux, Plex / No Comments

I’m no expert with Linux but I’m trying hard to improve my knowledge. I recently ran through some great CentOS videos on Pluralsight and after that tried to install Guacamole, which is a clientless remote desktop gateway. The long and the short of it is I didn’t get it fully working, but I really enjoyed the process.

Anyhow, as I mentioned in a previous post, I decided to install Plex on an Ubuntu server, as I think my problem with Linux is the lack of visual prompts, i.e. if I can see or draw something then I often understand the process better.

So after my initial VMware issues, I downloaded and installed Ubuntu and installed VMware Tools:

Open the VMware Tools CD mounted on the Ubuntu desktop.
Right-click the file name that is similar to VMwareTools.x.x.x-xxxx.tar.gz, click Extract to, and select Ubuntu Desktop to save the extracted contents.

The vmware-tools-distrib folder is extracted to the Ubuntu Desktop.
To install VMware Tools in Ubuntu:
Open a Terminal window.
In the Terminal, run a command to navigate to the vmware-tools-distrib folder (see the sketch after these steps).

Run the install command for VMware Tools (see the sketch after these steps):

Note: The -d switch assumes that you want to accept the defaults. If you do not use -d, press Return to accept the defaults or supply your own answers.

Enter your Ubuntu password.
Restart the Ubuntu virtual machine after the VMware Tools installation completes.
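The two commands referenced in the steps above look roughly like this (a hedged sketch; the Desktop path matches where the tarball was extracted above, and -d accepts the defaults as noted):

# Change into the extracted VMware Tools folder
cd ~/Desktop/vmware-tools-distrib
# Run the installer, accepting the defaults
sudo ./vmware-install.pl -d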

Then I updated Ubuntu:
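For example, with the usual apt commands:

# Refresh package lists and apply updates
sudo apt-get update
sudo apt-get upgrade -y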

Next, download the Plex Media Server package.

Then run this from a terminal:
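Something like this (the filename pattern is a placeholder for whichever .deb you downloaded):

# Install the downloaded Plex Media Server package
sudo dpkg -i plexmediaserver_*_amd64.deb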

(replace the filename with the name of the package you downloaded)

To set up Plex Media Server, on the same machine you installed the server on, open a browser window and go to http://127.0.0.1:32400/web.

Then I decided it was best to run Plex as a service, so that if the server rebooted I wouldn’t have to log on:
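On recent Ubuntu releases the Plex package registers a systemd unit, so enabling it is along these lines (a hedged sketch):

# Start Plex now and have it start automatically at boot
sudo systemctl enable plexmediaserver
sudo systemctl start plexmediaserver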

Finally I needed to map a drive so I could access the media (photos etc.) on my Windows server, again in a fashion where it would map automatically if the server rebooted.

To do this you need cifs-utils to connect to Windows shares:
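For example:

# Install the CIFS utilities
sudo apt-get install cifs-utils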

Then you need to create a directory to mount the share to, like so:

sudo mkdir /media/windowsshare

Then add the share, along with the Windows credentials that have permission to it, to the config file /etc/fstab by adding a line like this:
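A hedged example entry; the server name, share name, username and password are placeholders, so adjust them to your environment (a separate credentials file is tidier than putting the password in fstab):

//WINSERVER/media /media/windowsshare cifs username=WINUSER,password=WINPASS,iocharset=utf8 0 0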

Finally, test the fstab entry by issuing:
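For example:

# Mount everything listed in /etc/fstab
sudo mount -a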

If there are no errors, you should test how it works after a reboot. Your remote share should mount automatically.

See if you have access via Plex.

