Google failed me... When I was thinking about how to start off this blog post I ran the gamut of jokes, quips and amusing anecdotes, but in the end I felt it fitting to start with my disbelief. In my years of IT, Google has always had all the answers. Mind you, not always EXACTLY how I've wanted them: spending an afternoon modifying code, using some random error that was kind of like mine to fix an issue, or getting the answer "you can't do that, you shouldn't have taken on this project/problem in the first place." But this time, nothing. There was plenty of material about JDBC drivers, or Hive with ODBC drivers outside of Amazon Web Services Elastic MapReduce, but nothing that worked for my instance and certainly nothing that was going to help me solve my problem. So I will add to the almighty Google and help it fill the void that I tragically found.

First and foremost, let me explain my situation a little to give you a general overview of what we are doing here. I have an Amazon Web Services (AWS) account and am using it to spin up Elastic MapReduce (EMR) instances. EMR is (according to Amazon):
Amazon EMR is a web service that enables businesses, researchers, data analysts, and developers to easily and cost-effectively process vast amounts of data. It utilizes a hosted Hadoop framework running on the web-scale infrastructure of Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Simple Storage Service (Amazon S3).
Basically, what all this means is that EMR is cloud Hadoop. If that doesn't mean anything to you, then you probably aren't really in need of this post to begin with. My issue came when I was trying to use Microsoft Excel to attach to Hive (a component of the Hadoop installation) and view the tables, columns, etc. that were being spit out to me by my Hadoop processes. Mind you, this is an oversimplification of what is going on, but I'd rather get to the good stuff instead of spending all day explaining what very little I know about Hadoop and its processes. So without further ado, here is the process I used to get my ODBC driver set up with Hive on my AWS EMR instance. (That was a mouthful -ed.)

First, we will need to download the ODBC driver that is available through AWS. This can be found at: . Download the ODBC driver that is necessary for your environment. In my case, that is the 64-bit Windows driver.

Once this has been downloaded, we will install it on our machine. This is a pretty straightforward process. Make sure to choose the correct version of the product; Windows is my OS, and 64-bit is what my machine is running at home. From there it's the standard wizard: Next, accept the EULA, Next, confirm the installer location, Next, Install, Finish.

After the installation we will need to launch an EMR cluster; if one is already running, don't worry about this step. NOTE: ODBC connections to Hive will often call for a Thrift server. By default, if you have Hive installed on your EMR cluster, Thrift will be installed as well, and this point will be moot.

Once we have made sure our new cluster is provisioned and running, we will need to connect to it. We will start with the normal steps found on the AWS site for establishing SSH connections:

If using PuTTY, after selecting the Session option on the left, add the Hostname (replacing the hashes with the server's public IP) and the Port (22).

We will then select the plus next to SSH and expand it. From here we will go to Auth. Under "Private key file for authentication", add the key that is associated with the cluster we launched and added to our "hostname" in the step above.

Next, click on "Tunnels" on the left-hand side. This is where things will deviate from the original instructions as outlined by Amazon. We will still add our default 8157 tunnel for the connection to the web tools for our EMR instance, but we will add one additional tunnel. This tunnel will be for the connection we will be making with our ODBC driver to Hive on the machine. By default this tunnel needs to sit at port 10000, which is determined by the version of Hive installed on the EMR instance. For our purposes, we will select Local and Auto, which can be found BELOW "Destination". In the "Destination" field we will add the hostname of our cluster, sans the hadoop@ that precedes it, and then append :10000 to the end of the hostname, 10000 being the port we are forwarding. In "Source port" we will also add 10000. At the end of the day, prior to clicking "Add", it should all look something like this.

Once this is to your liking, click "Add" and it will be displayed in the box above "Add new forwarded port:". For the sake of speeding these instructions up, we will also add our 8157 port with Dynamic and Auto selected, as per Amazon's original guidelines.
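For anyone working from macOS or Linux rather than PuTTY, the same pair of tunnels can be sketched with a single OpenSSH command. The key file name and cluster hostname below are placeholders; substitute your own:

```shell
# -L forwards local port 10000 to the Hive port 10000 on the master node.
# -D opens the dynamic (SOCKS) tunnel on 8157 for the EMR web tools.
# Both the .pem file and the ec2-xx hostname are placeholders.
ssh -i my-emr-key.pem \
    -L 10000:ec2-xx-xx-xx-xx.compute-1.amazonaws.com:10000 \
    -D 8157 \
    hadoop@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
```

With that session open, everything below that talks to localhost:10000 works the same way it does with the PuTTY tunnels.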

Once these steps have been completed, select Open to pull up your Hadoop SSH session.

We will now want to verify that we can reach the port that we just opened up through SSH to the server. The easiest way to do this is with a simple telnet command. So open up Command Prompt, or PowerShell if you're into that kind of thing, and type in: telnet localhost 10000. If all goes well you should be presented with an empty prompt box; otherwise you will get an error and will need to check security settings on both your local computer's side and the AWS VPC side.

Without keen insight into any one given person's environment, getting this telnet session to establish will differ from environment to environment. The best advice I can give is to make sure your ports are allowed through Windows Firewall, your antivirus, the AWS VPC security settings (inbound and outbound) and, lastly, your company's physical firewall settings.
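If telnet isn't available (it's an optional feature on recent Windows versions), the same reachability check can be sketched from any bash shell (Git Bash, for example) using bash's built-in /dev/tcp. The port here assumes the 10000 tunnel set up earlier:

```shell
# probe_port HOST PORT -> prints "open" or "closed" depending on whether
# a TCP connection to HOST:PORT can be established (bash-only /dev/tcp trick).
probe_port() {
  local host=$1 port=$2
  if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

# With the SSH tunnel up, the forwarded Hive port should report open:
probe_port 127.0.0.1 10000
```

A "closed" result here points at the same suspects as a failed telnet: local firewall, antivirus, or the VPC security group.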

Now that the basics are taken care of, we can configure the ODBC driver. First, find your ODBC Data Sources (32-bit or 64-bit) program under Administrative Tools and open it. I will be using the 64-bit version of the tool, since I have 64-bit Microsoft Excel that I will be accessing the ODBC connection with.

Once this is pulled up, we will create a new ODBC data source by going to "System DSN" and clicking "Add".

In the window presented, we will select the "Amazon Hive ODBC Driver" option and click "Finish".

We will now start adding the information we know into the "Amazon Hive ODBC Driver DSN Setup" window. Our "Data Source Name" can be whatever we want, and the "Description" is likewise up to you. For the "Host" we will once again add our hostname, and we will change the port to the one we forwarded for Hive, in our case 10000. For the "Database" we will need to put in whatever database we have created or used (I'm just using default), and for "Hive Server Type" we will put in "Hive Server 1". NOTE: If this does not work, try using "Hive Server 2"; I have had problems where "Hive Server 1" does not allow me to put in my user name and password.

Moving on to Authentication, our Mechanism will be user name and password. By default these will be set to user name: emr, password: emr.

Once these steps are completed click Test. 

If everything is happy, we should be greeted with a "TESTS COMPLETED SUCCESSFULLY!" message. Select OK to close the test window, and OK again to accept and save the changes to our DSN setup.
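As a sanity check outside of Excel, a finished Hive DSN can also be exercised from a command line. On a Linux box with unixODBC and the Linux build of the driver, that might look like the following; "EMRHive" is a placeholder for whatever you typed into "Data Source Name":

```shell
# isql ships with unixODBC; -v prints verbose diagnostics if the connect fails.
# "EMRHive" is a placeholder DSN name; emr/emr are the defaults noted above.
isql -v EMRHive emr emr
# At the resulting SQL> prompt you can poke at your tables, e.g.
# (my_table is hypothetical):
#   SELECT * FROM default.my_table LIMIT 10;
```

If isql connects but Excel doesn't, the problem is on the Excel/bitness side rather than the tunnel or the cluster.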

With all these steps completed, it's a simple data source addition in Excel before you are enjoying your Hive data in your Microsoft Excel spreadsheet. You're welcome, Google.

Back in the day, which was a Wednesday for those of you who are interested, we had a less-than-wonderful browser called Internet Explorer 8. This browser, like the comedian I stole that last joke from (Dane Cook, if you are interested), was popular only because we didn't know any better and it was easy to get to by default. But like all terrible things, sometimes "tweaking" was necessary in order to get the software to perform like we needed it to. One such "tweak" was the enabling of cookies, which is required more often than not by certain websites. In order to enable cookies in IE 8, open Tools > Internet Options, go to the Privacy tab, and either drag the privacy slider down to Medium or click Advanced and choose to accept first- and third-party cookies.

Recently, I have been going through a number of my department's procedures. It seems that during our Windows patching cycles, updates continuously fail on servers due to insufficient C: drive space. While cleaning up the necessary files for patching to complete is far from difficult, it is relatively monotonous. With this in mind, I set out to write a simple script to clear many of the temporary file locations on Windows 2003, 2008 and 2012 servers, thus automating the C: drive cleanup process for my team and me:

NOTE: After copying and pasting this script into a .txt file, please make sure to rename/save it as filename.bat (filename can be changed to whatever title you would like the script to be named -ed.). This process might require you to show hidden files and file extensions to verify the file is named correctly. Also, make sure to run this script as administrator from the server you wish to clean up. This can be done simply by right-clicking the script ([filename].bat) and selecting "Run as administrator".

REM remove any dump files or temp files
DEL %SystemDrive%\*.TMP /S
DEL %SystemDrive%\*.DMP /S

REM delete past KB updates from Microsoft
forfiles /p "%WinDir%\$hf_mig$" /s /m *.* /c "cmd /c Del @path" /d -180

REM delete Performance Information and Tools logs and Reliability and Performance Monitor logs
forfiles /p "%SystemDrive%\perflogs" /s /m *.* /c "cmd /c Del @path" /d -180

REM remove old installation of TSM backup agent installation
rmdir /s /q %SystemDrive%\tsm_images

REM remove all files found in the windows temporary file location
del /f /q "%temp%\*.*"

REM remove file used for past Windows service pack installations
if exist "%WinDir%\servicepackfiles\i386" DEL "%WinDir%\servicepackfiles\i386\*.*" /s /q

REM if the user folder exists remove specific file types if they are found to clear up space. If not, jump to :skip.
if not exist "%SystemDrive%\Users\*.*" goto skip
if exist "%SystemDrive%\Users\*.zip" del "%SystemDrive%\Users\*.zip" /f /q
if exist "%SystemDrive%\Users\*.gif" del "%SystemDrive%\Users\*.gif" /f /q
if exist "%SystemDrive%\Users\*.jpg" del "%SystemDrive%\Users\*.jpg" /f /q
if exist "%SystemDrive%\Users\*.png" del "%SystemDrive%\Users\*.png" /f /q
if exist "%SystemDrive%\Users\*.bmp" del "%SystemDrive%\Users\*.bmp" /f /q
if exist "%SystemDrive%\Users\*.avi" del "%SystemDrive%\Users\*.avi" /f /q
if exist "%SystemDrive%\Users\*.mpg" del "%SystemDrive%\Users\*.mpg" /f /q
if exist "%SystemDrive%\Users\*.mpeg" del "%SystemDrive%\Users\*.mpeg" /f /q
if exist "%SystemDrive%\Users\*.ra" del "%SystemDrive%\Users\*.ra" /f /q
if exist "%SystemDrive%\Users\*.ram" del "%SystemDrive%\Users\*.ram" /f /q
if exist "%SystemDrive%\Users\*.mp3" del "%SystemDrive%\Users\*.mp3" /f /q
if exist "%SystemDrive%\Users\*.mov" del "%SystemDrive%\Users\*.mov" /f /q
if exist "%SystemDrive%\Users\*.qt" del "%SystemDrive%\Users\*.qt" /f /q
if exist "%SystemDrive%\Users\*.asf" del "%SystemDrive%\Users\*.asf" /f /q
:skip

REM if the secondary user temp folder exists remove specific file types if they are found to clear up space. If not, jump to :tempFiles.
if not exist "%SystemDrive%\Users\AppData\Temp\*.*" goto tempFiles
if exist "%SystemDrive%\Users\AppData\Temp\*.zip" del "%SystemDrive%\Users\AppData\Temp\*.zip" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.gif" del "%SystemDrive%\Users\AppData\Temp\*.gif" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.jpg" del "%SystemDrive%\Users\AppData\Temp\*.jpg" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.png" del "%SystemDrive%\Users\AppData\Temp\*.png" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.bmp" del "%SystemDrive%\Users\AppData\Temp\*.bmp" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.avi" del "%SystemDrive%\Users\AppData\Temp\*.avi" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.mpg" del "%SystemDrive%\Users\AppData\Temp\*.mpg" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.mpeg" del "%SystemDrive%\Users\AppData\Temp\*.mpeg" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.ra" del "%SystemDrive%\Users\AppData\Temp\*.ra" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.ram" del "%SystemDrive%\Users\AppData\Temp\*.ram" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.mp3" del "%SystemDrive%\Users\AppData\Temp\*.mp3" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.asf" del "%SystemDrive%\Users\AppData\Temp\*.asf" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.qt" del "%SystemDrive%\Users\AppData\Temp\*.qt" /f /q
if exist "%SystemDrive%\Users\AppData\Temp\*.mov" del "%SystemDrive%\Users\AppData\Temp\*.mov" /f /q
:tempFiles

REM remove temporary debug and cache files for IIS installations
del "%SystemDrive%\inetpub\wwwroot\cache\*.*" /f /q
del "%SystemDrive%\debugdiag\*.*" /f /q

REM remove any unneeded past Intel installation files from the computer.
if exist %SystemDrive%\Intel rmdir /S /Q %SystemDrive%\Intel
if exist %SystemDrive%\i386 rmdir /S /Q %SystemDrive%\i386

REM removes the patch cache folder location found in Windows Server 2008 and above
rmdir /s /q "%WinDir%\Installer\$PatchCache$\"

REM If the location exists recurse through all user profile folders and clean up temporary file locations (Win 7, 8)
IF EXIST "%SystemDrive%\Users\" (
    for /D %%x in ("%SystemDrive%\Users\*") do (
        rmdir /s /q "%%x\AppData\Local\Temp"
        mkdir "%%x\AppData\Local\Temp"
        rmdir /s /q "%%x\AppData\Local\Microsoft\Windows\Temporary Internet Files"
        mkdir "%%x\AppData\Local\Microsoft\Windows\Temporary Internet Files"
    )
)

REM If the location exists recurse through all Documents and Settings folders and clean up temporary file locations (Win XP)
IF EXIST "%SystemDrive%\Documents and Settings\" (
    for /D %%x in ("%SystemDrive%\Documents and Settings\*") do (
        rmdir /s /q "%%x\Local Settings\Temporary Internet Files"
        mkdir "%%x\Local Settings\Temporary Internet Files"
        rmdir /s /q "%%x\Local Settings\Temp"
        mkdir "%%x\Local Settings\Temp"
        forfiles /p "%%x\Application Data\vulscan" /s /m *.log* /c "cmd /c del @path" /d -90
        forfiles /p "%%x\Application Data\vulscan" /s /m *mergedgetvulnerabilities* /c "cmd /c del @path" /d -10
    )
)

REM run Windows Disk Cleanup on the C: drive using the options previously saved with cleanmgr /sageset:1
cleanmgr /sagerun:1

This script is now being used by the majority of my team members and has reduced a job that used to take up to 10 minutes down to under one. Our server monitoring team is even looking into pushing this script every time it determines that a server's main drive has fallen under 800MB of free space, which could save our team even more time.
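The monitoring trigger described above boils down to a simple free-space threshold check. As a rough sketch (in bash rather than batch, with the 800MB figure taken from this paragraph and the path being whatever drive you care about):

```shell
# free_mb PATH -> free megabytes on the filesystem that holds PATH.
# df -P gives POSIX single-line output; -m reports sizes in megabytes.
free_mb() {
  df -Pm "$1" | awk 'NR==2 {print $4}'
}

# Fire the cleanup job when free space dips below the 800MB threshold:
threshold=800
free=$(free_mb /)
if [ "$free" -lt "$threshold" ]; then
  echo "low space: ${free}MB free - run cleanup"
else
  echo "ok: ${free}MB free"
fi
```

A monitoring agent would run a check like this on a schedule and kick off the cleanup script instead of just echoing a message.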

My office recently ran across an issue which prevented users from being able to search Lync by username or ID after upgrading to Skype for Business 2015 (Lync 2013). The only way around this was to run searches with users' email addresses.

When working on some of our older systems a week back, I came across an IDRAC that refused to cooperate. Now, it could have been that everyone in the office had forgotten the password, but I think it was that the IDRAC itself was not remembering its intended function. With this in mind, I was not surprised when the IDRAC threw us an error saying it had "reached the maximum user sessions" and no other users could log on. I knew as well as everyone else in the office that the maximum user sessions had not been reached, but that didn't mean we hadn't hung the device with our incessant login attempts. So, thanks to our persistent incorrect logons, we got the chance to log in to the server through RDP in order to clear out the offending IDRAC "sessions":


Prerequisites:

  • IDRAC 7
  • Access to the server OS attached to the IDRAC

First, log onto the corresponding server.

Open a command prompt as administrator by right-clicking cmd.exe and then selecting "Run as administrator".

Go to the IDRAC directory (c:\program files (x86)\dell\sysmgt\idrac) through the command prompt window by typing in the following command: cd c:\program files (x86)\dell\sysmgt\idrac.

Next, use the following command: racadm -r [servername] -u [username] -p [password] racreset

This will cause the IDRAC to reset and allow the user to log back in.

racadm can also be used for several other operations on a Dell IDRAC. For a complete list of these commands, it's best to go to Dell technical support for the most up-to-date list of commands for your specific IDRAC version.
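If a full reset feels heavy-handed, racadm can also list and close individual sessions, which is often enough to free up a login slot. A sketch using the same placeholder credentials as above (the session ID 3 is made up; use one from the getssninfo output):

```shell
# List the active IDRAC sessions, including their session IDs:
racadm -r [servername] -u [username] -p [password] getssninfo

# Close one stuck session by its ID instead of resetting the whole controller:
racadm -r [servername] -u [username] -p [password] closessn -i 3
```

This keeps everyone else's legitimate sessions alive while the offending one is cleared.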