Category: Pro Tips

My todo list for work in 2016

It's 2016 and I like to make a list of things I want to look into and put in place at work, so here is what I am aiming to do in my job in 2016:-

  • A blog post each week. Last year I only managed 11 blog posts, which is poor, so more blog posts will come in 2016.
  • Lightning Talks. We have started doing these at work, and this year I plan to do a few of them if given the chance. I'm doing one on the 6th of January on a Developer's Top 10 Best Practices, which I will share once I have given the talk.
  • Test 3rd-party endpoints. Have a dashboard page which tells us what is up and what is down; I'm going to try the Chrome add-on Postman and use collections within it to do this.
  • Database Deployments. Currently we script everything manually and then get the DBA to run the scripts in manually. As a starting point we need to script the database and put it into source control each release. I'm looking forward to this as it will help with our deployments and speed them up; I'm hoping to get the Redgate SQL Toolbelt into the company so we can use it to help us achieve this.
  • Red/Green deployments. We currently deploy at weekends, and this can and should change so that we aren't spending time at weekends doing releases which take quite a lot of time. We can automate them more, and this year I plan to fix that.
  • More Testing. We are closing in on unit testing our PowerShell scripts; this will be another nice addition to the number of different areas which we currently test, which is great.
  • Code Reviews. I need to figure out an approach that keeps everyone from being bored, brings benefit to the team and keeps us developers on our toes going forward.
  • Continuous Improvement. Test more, do more in-depth code reviews, automate more, release finished work, tackle the backlog each sprint.
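As a first cut at the endpoint dashboard idea, before reaching for Postman collections, something as small as the following works. This is a minimal sketch, assuming made-up endpoint URLs and a 5-second timeout; it simply hits each endpoint and classifies the response:

```python
from urllib import request, error

# Hypothetical 3rd-party endpoints we'd want on a dashboard page.
ENDPOINTS = {
    "payments": "https://example.com/api/payments/ping",
    "postcode-lookup": "https://example.com/api/postcode/ping",
}

def classify(status_code: int) -> str:
    """Treat any 2xx/3xx response as up, anything else as down."""
    return "up" if 200 <= status_code < 400 else "down"

def check(url: str) -> str:
    """Fetch the URL and classify it; network failures count as down."""
    try:
        with request.urlopen(url, timeout=5) as resp:
            return classify(resp.status)
    except error.URLError:
        return "down"

if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        print(f"{name}: {check(url)}")
```

A Postman collection gives you the same thing with a nicer UI and shareable requests, but a script like this is trivial to wire into a dashboard page or a scheduled job.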

That's it for now. I will add to this list throughout the year as we go, keep an eye on this post and how we get on, and blog about each one individually. Hopefully I'll get the chance to work on some if not all of these this year.

Feel free to follow me on Twitter at @gsuttie



Choosing what Technologies/Frameworks to use

I have seen a few of my good friends on Twitter discussing which technologies/frameworks they use and why they chose them.

I have also seen people arguing why you should use this technology/framework over that one, all with really good arguments, as well as people asking which technologies/frameworks/libraries they should look at.

Note:- This is from the perspective of a company and not an individual developer's standpoint.

It comes down to a number of things, and Twitter isn't really the best place to discuss this, as it's too constrained a medium to get your point across, hence this blog post.

I have worked in a few different places over time and there really is no single answer. At one financial company I worked in, you were told which technology you were going to be using and it wasn't even up for debate; software architects whom you would never even see had already made the decisions, whether that was a good idea or not.

I would say choose the technology/framework that fits your current criteria and that matches your team's skill set.

There have been a few posts about MVC versus Nancy; in the end, who cares what you choose, as long as it gets the job done?

Perhaps if you're in a small team it's easier to choose, say, Nancy over MVC. But if you walked into a team of 50-100 developers in a company, I'm betting more people know MVC than know Nancy. That's not to say MVC is better than Nancy; I haven't even looked at Nancy. Why not? For a couple of reasons. Why would I want to look at Nancy when MVC is used throughout our solutions? Do I want to spend time learning another tool that does the same job? No, I don't have time. Don't get me wrong, I love to look at new ways of doing things, but I would rather spend that time learning something else like KnockoutJS or RavenDB and expand my skill set. I'd rather do that than port perfectly fine code bases, which use a number of tools that run on MVC, over to Nancy and then have to check everything still works as expected, including tests, build scripts, and all sorts of other tools we use like Chirpy and T4MVC.

People wonder why companies choose Microsoft products over other similar products, and from experience it's usually for one of two reasons: the option of MSDN licence support if required, or that more developers out there know and understand the product, so there is more chance of getting help or finding a solution to a nasty bug – strength in numbers. It's also easier for a manager to say to their boss "we will use SQL Server instead of CouchDB because of the product's history", and they can sleep safe in the knowledge it's proven. I am not saying that other products are brand new and not to be trusted, far from it, but I hope you get the point.

Recently we started using CoffeeScript for our latest project and, to be honest, I have never liked it. It was good for organising code in a better-structured manner, and I will leave it at that. We are looking to move to TypeScript, and this isn't because it's a Microsoft product – it's open source, and I just think it's so much nicer due to the tooling. It's a better tool for the job in my opinion. In summary, there is no magic answer to what the best technology is; weigh up the variables in play and go from there is what I would suggest.

Not everyone will agree so feel free to add a comment.



Pro Tip: SQL Tips

Calling a UDF from SQL Query Analyzer
SELECT dbo.udf_calculate_working_days('01/01/2000', '01/01/2001')

Calling a table-valued function from SQL Query Analyzer
SELECT * FROM dbo.udf_calculate_working_days('01/01/2000', '01/01/2001')

Get the number of days between 2 dates
SELECT DATEDIFF(dd, '01/01/2009', '01/01/2010') AS 'total_days'

Stripping the time portion when checking a date against today's date
SELECT DATEADD(dd, DATEDIFF(dd, 0, @order_date), 0)
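That last DATEADD/DATEDIFF pattern counts whole days since SQL Server's zero date (1900-01-01) and then adds them back onto the zero date, which strips the time portion. A quick sketch of the same arithmetic in Python, purely to illustrate what the expression computes:

```python
from datetime import datetime, timedelta

ZERO_DATE = datetime(1900, 1, 1)  # SQL Server's date 0

def strip_time(order_date: datetime) -> datetime:
    """Equivalent of DATEADD(dd, DATEDIFF(dd, 0, @order_date), 0)."""
    whole_days = (order_date - ZERO_DATE).days   # DATEDIFF(dd, 0, @order_date)
    return ZERO_DATE + timedelta(days=whole_days)  # DATEADD(dd, ..., 0)

print(strip_time(datetime(2009, 6, 15, 13, 45, 30)))  # 2009-06-15 00:00:00
```

Midnight of the same day comes back, so comparing against GETDATE() run through the same expression gives a safe date-only comparison.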



Pro Tip: Being more productive with ReSharper Live Templates

Whilst watching a @tekpub video of Roy Osherove's TDD Masterclass, I saw a tip where he showed how to create a ReSharper Live Template to save time and be more productive. Although this was related to TDD, you can create your own user template for anything you like; this tip is for creating NUnit test methods, but the main thing is the idea. Here is what he showed:-

1 – Install ReSharper.
2 – Then from within Visual Studio, select ReSharper–> Live Templates.
3 – Check the box next to User Templates, then click the Add icon or select New Template.
4 – Give the template a name and then enter the content as in the screenshot below:-

Test Template
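In case the screenshot doesn't come through, a typical template body for this looks something like the following. This is my reconstruction rather than the exact content from the talk; $TestName$ is a template parameter ReSharper prompts you for, and $END$ marks where the caret lands:

```
[Test]
public void $TestName$()
{
    $END$
}
```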

Once this is saved, if you go back to your class, type test and then hit the Tab key, you'll see the empty [Test] method generated, and it will ask you to supply the name.

Try this out and you will soon see the potential time savers you can come up with.



Using the Exchange Web Services API from C#

This past week I have been looking at the Exchange Web Services (EWS) API and how we can inspect emails within Exchange. If you need to read emails from an inbox, you can do so very easily and quickly using the EWS Managed API, which is available as a download from Microsoft.

I was looking for a way to check a folder for emails, look at the XML file attachments and do some work on the contents of the attached files; once done with the contents, mark the email as read and move it to an archived folder.

In the following example, let's assume emails will be directed into a folder we specify using a rule. The name of the folder is stored in the web.config file so we can make it configurable at any point; for argument's sake, let's call this folder ExchangeAPIDropFolder.

The following code demonstrates how to go about this:-
[sourcecode language=”csharp”]

private static void CheckEmailFolderForContents()
{
    string exchangeUsername = ConfigurationManager.AppSettings["ExchangeUsername"];
    string exchangePassword = ConfigurationManager.AppSettings["ExchangePassword"];
    string exchangeAutodiscoverUrl = ConfigurationManager.AppSettings["ExchangeAutodiscoverUrl"];
    string exchangeAPIDropFolderFolderName = ConfigurationManager.AppSettings["ExchangeAPIDropFolderFolderName"];
    string exchangeAPIDropFolderArchivedFolderName = ConfigurationManager.AppSettings["ExchangeAPIDropFolderArchivedFolderName"];

    ExchangeService ews = new ExchangeService(ExchangeVersion.Exchange2010_SP1)
    {
        Credentials = new WebCredentials(exchangeUsername, exchangePassword)
    };

    ews.AutodiscoverUrl(exchangeAutodiscoverUrl);

    FindFoldersResults folderSearchResults = ews.FindFolders(WellKnownFolderName.Inbox, new FolderView(int.MaxValue));

    // Find the archived folder up front so processed messages can be moved into it later.
    Folder archivedFolder = folderSearchResults.Folders.ToList().Find(
        f => f.DisplayName.Equals(exchangeAPIDropFolderArchivedFolderName, StringComparison.OrdinalIgnoreCase));

    // Set the number of items we can deal with at any one time.
    ItemView itemView = new ItemView(int.MaxValue);

    foreach (Folder folder in folderSearchResults.Folders)
    {
        if (folder.DisplayName.Equals(exchangeAPIDropFolderFolderName, StringComparison.OrdinalIgnoreCase))
        {
            Folder boundFolder = Folder.Bind(ews, folder.Id);

            SearchFilter unreadSearchFilter =
                new SearchFilter.SearchFilterCollection(
                    LogicalOperator.And,
                    new SearchFilter.IsEqualTo(EmailMessageSchema.IsRead, false));

            // Find the unread messages in the email folder.
            FindItemsResults<Item> unreadMessages = boundFolder.FindItems(unreadSearchFilter, itemView);

            foreach (EmailMessage message in unreadMessages)
            {
                message.Load();

                foreach (Attachment attachment in message.Attachments)
                {
                    if (attachment is FileAttachment)
                    {
                        FileAttachment fileAttachment = (FileAttachment)attachment;
                        fileAttachment.Load();

                        MemoryStream ms = new MemoryStream(fileAttachment.Content);
                        XmlDocument xmlDoc = new XmlDocument();
                        xmlDoc.Load(ms);

                        // TODO - Process file contents
                    }
                }

                // Mark the message as read and then move it to the archived folder.
                message.IsRead = true;
                message.Update(ConflictResolutionMode.AlwaysOverwrite);
                message.Move(archivedFolder.Id);
            }
        }
    }
}
[/sourcecode]

That's it – enjoy, and feel free to add comments or ask me about this code.



How to Source Control your SQL Database

In this blog post I am going to show you how to add a SQL Server database to Subversion using Redgate SQL Source Control. The tool claims you can add your database to a source control provider such as Subversion in about five minutes – let's see what's involved.

I have Subversion running on my own Windows 7 PC at home; I use it for testing out tools, integration with CI and so on. In future blog posts I will cover more on Continuous Integration.

I'm going to start by adding an existing database into Subversion using the SQL Source Control tool, with screenshots so you can follow along. I currently have Subversion installed and running at c:\svn\trunk\

Below is a screenshot of my local SQL Server

screenshot1


SportsStore is a local database used in one of the MVC books I have; let's add this database to Subversion using the tool now.

Create a new, empty folder within your local repository folder – in my case my Subversion folder is c:\svn\trunk\ – this is where your database scripts and data will live locally once they are in source control.

screenshot3


Highlight the database on the left and then select the blue text on the right which says “Link database to source control…”

The following window will pop up. Enter the URL of your repository; since mine is local, I just need to enter c:\svn\trunk\SportStoreDB (this is the folder your DB scripts will go into), then click Go.

screenshot2


You should see the following:-

screenshot4


At this point the tool has added all the scripts, including the data for all your tables, to Subversion. Now we just need to get this from Subversion into our local folder – to do this, right-click on the folder, choose SVN Checkout and then click OK as below.

screenshot5


This leaves us with the entire SportStore database scripted and placed within your Subversion repository, and you now have a local copy – perfect and easily accomplished.

Below are the screenshots of the output from the tool within your Subversion folder.

screenshot6

And a view of the stored procedures folder with the scripts created for you.

screenshot7

For more on this great tool check out Redgate SQL Source Control.



Find Out Your Project Build Times in Visual Studio

Recently I was trying to work out why a project I was working on was taking a very long time to build. I was pleased to find that Visual Studio can output individual project build times for a solution, which allowed me to see which projects within the solution were taking longer than I would have anticipated.

To view the build times of projects within a solution, go to Tools, Options and look for Projects and Solutions, then look for Build and Run (or similar, depending on the version of Visual Studio), and change the MSBuild project build output verbosity drop-down from Normal to Diagnostic.

Diagnostics Option

Now, as the solution builds, go to the View menu and select Output (to view the Output window), and you will see a whole raft of build details, including build times for each project.

Build Times

Hope that's useful to someone.



Quick SQL Server Tip – Creating your Insert Stored Procedure

When creating the insert stored procedure for an existing database table, I used to always start by copying an existing stored procedure (purely as a template) and emptying its contents; then I would right-click on the table and select Script Table as 'Insert' from the menu options, as shown below:-

Insert into a table

This template always slightly annoyed me, as I felt it never laid things out very well. Only last week did someone point out that the best thing to do is simply to type, inside your stored procedure:-

INSERT INTO and then press Enter

This will generate the insert statement and also tell you the column types and sizes for variables such as varchar(s), as shown below, including putting in default dates.

Insert syntax

That’s it for now.



Quick Tip: Tracing SQL from your application

Tip: When you're debugging some code and want to trace the SQL that's hitting your SQL Server database, this is how I normally cut through all the other traffic hitting the database.

Tip:- Add an Application Name part to your connection string in the app.config or web.config; here is an example.

Before:-
<add name="TestConn" connectionString="data source=ServerName;Integrated Security=SSPI;Initial Catalog=TESTDB;" providerName="System.Data.SqlClient" />

After:-
<add name="TestConn" connectionString="data source=ServerName;Integrated Security=SSPI;Initial Catalog=TESTDB;Application Name=TESTAPP_GREG;" providerName="System.Data.SqlClient" />

A couple of things to notice in the lines above: in the second connection string I added an application name, and I included my own name in it, so that when I run the application I can filter on that application name and only trace my code hitting the database – no other code being run will show up in my trace window.
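If you build connection strings in code rather than config, the same idea can be sketched as a tiny helper that stamps an application name onto a connection string (illustrative only; the server and app names are made up):

```python
def with_application_name(connection_string: str, app_name: str) -> str:
    """Append an Application Name key to a SQL Server connection string
    so SQL Server Profiler can filter traces down to this app only."""
    base = connection_string.rstrip(";")
    return f"{base};Application Name={app_name};"

conn = "data source=ServerName;Integrated Security=SSPI;Initial Catalog=TESTDB"
print(with_application_name(conn, "TESTAPP_GREG"))
# data source=ServerName;Integrated Security=SSPI;Initial Catalog=TESTDB;Application Name=TESTAPP_GREG;
```

However you set it, the value you choose here is exactly what you filter on in Profiler below.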

To do this I usually fire up SQL Server Management Studio, then from within there start SQL Server Profiler and do the following (make sure to click Show All Columns):-

SQL Profiler

Then add the application name into the filter box as below:-

SQL Profiler

Click OK, and the only trace information recorded will be the SQL that you are actually hitting the database with, and no one else's – enjoy.