
Meet and Code – Weekend in Review

I spent my weekend in my company’s office. Keep in mind that 1) I’m a consultant and I’m usually at a client and 2) I didn’t get paid. Why would I spend my weekend at work? Meet and Code.

What’s the point of Meet and Code? The point is for a bunch of people (developers, DBAs, UI people, PMs, BAs, etc.) to get together and work on a project to learn, build new skills, and do it right the first time. The real-world constraints of deadlines and budgets don’t always give us the time to do TDD and to do things properly, so we rush through things and have to test manually. Not at Meet and Code. Instead, we’re doing things using agile practices, TDD, and rapid development techniques so that we can practice the craft that we preach about.

See, a few months ago someone had the crazy idea that we should get together and work on a software project over the weekend. Now, the catch was that we had to recruit other people. So, we recruited other co-workers. Eventually we’re planning on recruiting the rest of you, but bear with me.

After we recruited a team of suckers volunteers, we did a few planning meetings before everything got started and decided on a product to work on (an event RSVP web application so we could manage the next time we do this). We also figured that everything would take about a weekend (Friday 5pm – Sunday 5pm).

Friday was pretty much a wash. There was a TDD and mocking demonstration that was incredibly helpful, especially since I haven’t done TDD in a long time and mocking is just outside my area of expertise. Most people didn’t have development environments set up on their computers, and there was some confusion about who was responsible for what, but things got ironed out by the end of the night. Towards the end of the night, I started working on the database and the UI people were able to get started. I left around 12:30 or so. Things were pretty disorganized, but it was a good start.

We finally got a build server set up early Saturday morning. I kept churning out database code while the business layer was being put together. There was still a bit of confusion, but by the end of the day things were going full steam. People came and went, but that’s what we expected.

The office was pretty humid and smelled distinctly of feet by the end of the night. It didn’t help that the air conditioning is turned off on the weekends.

By Sunday, I was starting to run out of steam. It’s tough to make it through a week of your day job and then run through a development sprint on the weekend, but the fun and learning made it more than worth it.

How did it turn out?

It went well. There were definitely some initial teething problems – development environments weren’t set up, use cases weren’t fully fleshed out – but we made it through them.

Going forward there are definitely some things that we could do differently to make things run smoother:

  • Install fest – We should have held an early install night focused on getting everyone’s development environment prepared. This would have headed off problems with installing service packs and getting everyone onto the same versions.
  • Training materials/demo – In addition to the install night, having a single, fully built feature to demonstrate the ‘proper’ way to use ASP.NET MVC would have been very helpful for everyone.
  • Dedicated user representation – We initially had some BAs present on Friday night, but they had other weekend plans. It was difficult at times to resolve ambiguity in the use cases or to determine how a feature should really function. If someone had been designated as the user representative, we would have been able to defer to them on all of these issues.

However, despite these teething problems, we had a lot of fun, and I definitely think we’ll be repeating this a few more times before we take it completely public, complete with some hints and tips on our methodology for running the event smoothly.

Links for the Week of 2009-01-06

SQL Server

Tomorrow’s Microsoft BI Platform Derek Comingore (SQL Server MVP) gives a great overview of the Kilimanjaro release of SQL Server. This is a solid overview of the pieces and parts that will make up the 2010 feature pack ahead of SQL 11 (due in 2011).

Top 10 SQL Server 2008 Features for the Database Administrator Mike Weiner and Burzin Patel put together a great list of features for DBAs in SQL Server 2008. A lot of enhancements were made in the BI space and in T-SQL itself and it’s always good to get a refresher on what’s out there to help us all with maintaining and administering our databases.

Back To Basics: Clustered vs NonClustered Indexes; what’s the difference? Denny Cherry put together a great refresher on the differences between the two types of index available in SQL Server. This is a solid overview that you should keep around for anyone who asks you that question.

Development

Arguments against using an ORM layer – an ammunition stockpile Corey Trager tackles a very touchy argument: to ORM or not to ORM. I’ve been on both sides of the fence during my career, and it’s a very difficult argument to make from either side. Luckily, Corey’s own argument is both bolstered and refuted by the comments on his blog. It’s a great read no matter what your opinion is.

General

Technical Presentations: Be Prepared for Absolute Chaos When you’re giving a presentation, you need to be prepared for whatever can go wrong. Scott Hanselman relates a recent experience of his presenting at TechReady 8 (an internal Microsoft conference), and talks about the problems he ran into giving the presentation.

Learning from failure is overrated Jason Fried at 37signals goes off a little bit about the value placed on failure these days. His thoughts are some great things to keep in mind when you’re reviewing past failures and successes.

Online Credibility, How to build it and how to lose it in an instant We all value our online presence. Well, I hope you do. Jonathan Kehayias provides some solid advice on how to monitor your own behavior in forums, both for the sake of your own credibility and so that we’re all able to contribute and help each other while providing solid, valuable advice.

In Re: A default architecture – without stored Procedures

I wrote this today in response to Patrik Löwendahl’s post: A default architecture – without stored Procedures. I figured that I might as well post it up here, too, so that people can comment on my rantings.

You’d be hard-pressed to get me to concede that a data access scheme based on stored procedures is a bad idea. If you’re not going to go that route, buy a good ORM tool.

I’m going to take a stab at responding to each of your points because I think it’s good to get a dialog going on these kinds of things.

Pain Point the First: Version Management
I’m not sure what you mean by this. It’s fairly trivial to version control SQL files, and I’m guessing that you aren’t having any difficulty adding plain text files to version control. So, I’m going to guess that this has something to do with quickly determining which version of development code is in the database. That can be somewhat tricky. It becomes necessary to use some kind of build/migration system similar to the one used in Ruby on Rails, where you have a schema_info table with a version column that is incremented or decremented by an automated tool. At this point, a high level of DBA discipline is required to prevent changes from being made in production that are not present in source control, and vice versa.
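
A minimal sketch of how that kind of version table might look in T-SQL – the table, column, and migration below are purely my own illustration, not anything prescribed by Rails or any particular tool:

    -- A single-row table recording which schema version the database is on.
    CREATE TABLE dbo.schema_info (version int NOT NULL);
    INSERT INTO dbo.schema_info (version) VALUES (0);
    GO

    -- Each migration script makes its change and then bumps the version,
    -- so an automated build tool can tell which scripts still need to run.
    ALTER TABLE dbo.Person ADD MiddleName varchar(50) NULL;
    UPDATE dbo.schema_info SET version = version + 1;
    GO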

Pain Point the Second: Two Code Bases
Again, I’m not sure what is meant by this. You’re absolutely right that maintaining stored procedures requires two skill sets. In an ideal world there would be a distinct separation of skills between database and application developers. While basic to intermediate SQL can be handled by most developers, venturing into high-performance/advanced SQL demands a distinct skill set built up over years of experience and conscious practice.

Pain Point the Third: Business Logic
I’ve said this many times over my career as both an application developer and a database developer: business logic does not belong in the database. By default, T-SQL will compile and store the first execution path it encounters in a stored proc. Let’s say you have a procedure with two branches: one performs a simple insert and the other performs a complex operation that can benefit from a compiled execution plan. If the simple insert branch is the first to be executed, it will be compiled, and the other, complex path will be performed as an ad hoc query unless the stored proc is ALTERed and the second execution path is executed. If you need complex logic and you have SQL 2005, you can use CLR stored procedures.
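
To make the scenario concrete, here’s a rough sketch of the kind of branching procedure I mean (the table, column, and parameter names are made up purely for illustration):

    CREATE PROCEDURE dbo.SaveOrReport
        @Mode  char(1),   -- 'I' = simple insert, 'R' = complex report
        @Value int
    AS
    BEGIN
        SET NOCOUNT ON;

        IF @Mode = 'I'
        BEGIN
            -- Cheap branch: a single-row insert.
            INSERT INTO dbo.Widget (Value) VALUES (@Value);
        END
        ELSE
        BEGIN
            -- Expensive branch: a join and aggregation that really
            -- benefits from a good cached execution plan.
            SELECT w.Value, COUNT(*) AS HistoryCount
            FROM dbo.Widget AS w
            JOIN dbo.WidgetHistory AS h ON h.WidgetId = w.WidgetId
            WHERE w.Value = @Value
            GROUP BY w.Value;
        END
    END
    GO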

Pain Point the Fourth: Testing Stored Procedures
While I’m definitely not an expert on testing stored procedures, I have read numerous resources on the subject. Adam Machanic devotes some time to it in Expert SQL Server Programming. I’ll leave this topic to the experts.

Pain Point the Fifth: Writing Trivial SQL
When you’re operating, or attempting to operate, a high-performance, high-access database, there are no trivial data access calls. Granted, for the majority of data access scenarios (SELECT a, b, c FROM xyz WHERE param = @param), a stored procedure could be described as development overhead. This is a point that we agree on. However, I see the point that many DBAs have – if they want to change the underlying schema for performance/storage considerations, they should be able to do so as long as they provide developers with the same set of outputs as before, given the same inputs, of course.
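
For that kind of call, the procedure is really just a thin contract between the application and the schema. A sketch, with purely illustrative names:

    -- The application codes against this signature and result shape only.
    CREATE PROCEDURE dbo.GetXyzByParam
        @param int
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT a, b, c
        FROM dbo.xyz
        WHERE param = @param;
    END
    GO

If xyz later gets split, renamed, or denormalized, the SELECT inside the procedure changes and the calling code never notices.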

Pain Point the Sixth: Stored Procedures aren’t Dynamic
Nor should they ever be. The point of a stored procedure is to access data as efficiently as possible. In the event that you have multiple potential join combinations, you will need multiple stored procedures. Not to sound insulting, but the demand for multiple potential join combinations sounds to me like a poorly planned data access scenario. There are ways to dynamically generate WHERE and ORDER BY clauses using T-SQL and parameterized procedures/queries.
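
Here’s roughly the pattern I have in mind for the WHERE clause side of that, using sp_executesql so the query stays parameterized even though the predicates are assembled at runtime (the table and column names are, again, just for illustration):

    CREATE PROCEDURE dbo.SearchProducts
        @Name  varchar(50) = NULL,
        @Color varchar(20) = NULL
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @sql nvarchar(max);
        SET @sql = N'SELECT ProductId, Name, Color FROM dbo.Product WHERE 1 = 1';

        -- Only append the predicates the caller actually supplied.
        IF @Name IS NOT NULL
            SET @sql = @sql + N' AND Name = @Name';
        IF @Color IS NOT NULL
            SET @sql = @sql + N' AND Color = @Color';

        -- The values themselves are still passed as parameters.
        EXEC sp_executesql @sql,
            N'@Name varchar(50), @Color varchar(20)',
            @Name = @Name, @Color = @Color;
    END
    GO

ORDER BY can be handled similarly by validating the requested sort column against a known list before appending it, since column names can’t be passed as parameters.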

Ultimately, of course, the purpose of a stored procedure is to increase performance, provide granular data access, and add an additional layer of security on top of the data. A data access solution based around stored procedures isn’t for everyone, but it’s always good to know the reasons why you should be using it.
