What’s your access level?

Too often, developers and administrators alike decide that the easy solution to providing data access is to give users more permissions than they truly require. In the development environments I’ve worked in, developers always had full local and server admin rights and full database rights. This usually resulted in deployment issues, met with the inevitable response of “well, it worked fine for me.”

Your key to data access

Whenever possible now, I work under the principle of least privilege, giving users the minimum permissions needed to complete their tasks. This extends right to the service account that runs SQL Server – it’s a Domain User account with no elevated rights on the server beyond what the SQL Server installation process itself grants.

My first task after completing a new SQL Server 2000 installation would always be to remove the BUILTIN\Administrators group in SQL that gave anyone with membership in the local Administrators group full access to the SQL Server. Why in the world was it assumed that server and server application administrators were the same people? I was so pleased to not see that group in SQL Server 2008.
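For the record, here is a minimal sketch of what that removal looked like on SQL Server 2000 (sp_revokelogin was the procedure for removing a Windows login or group in that version):

-- Remove the local Administrators group's access to SQL Server 2000
EXEC sp_revokelogin 'BUILTIN\Administrators'
GO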

I found application roles in SQL Server 2000 frustrating. I liked the concept of granting permissions to a single role and then having users access the application through that role, but I lost all ability to keep an audit trail. The next best option was groups through Windows Authentication, but that allowed anyone with the access needed for an application to also get to the database through other means.

The improvement that arrived with SQL Server 2005 is loginless users. Users authenticate to the server instance with their own credentials, but on accessing the database they impersonate the loginless user via EXECUTE AS, picking up its permissions. The server can still audit activity because each user authenticated with their own credentials, yet those credentials grant no direct access to the database.

To create a loginless user:

CREATE USER AppUser WITHOUT LOGIN
GO

To grant a group or user permission to impersonate the loginless user:

GRANT IMPERSONATE ON USER::AppUser TO [MyDomain\LeeAnne];
GO

To impersonate the loginless user and gain all permissions granted to it:

EXECUTE AS USER = 'AppUser'
GO
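When the impersonated work is finished, REVERT switches the session back to the caller’s own security context; USER_NAME() is a handy way to confirm which context is active:

EXECUTE AS USER = 'AppUser'
SELECT USER_NAME()  -- returns 'AppUser'
REVERT
SELECT USER_NAME()  -- back to the caller's own database user
GO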

This feature will be invaluable for new development. When I eventually migrate databases from SQL Server 2000 to 2008, I hope to find some way of implementing loginless users for the Microsoft Access-based applications that currently use linked tables for database access.

The psychology of free

Quest freebie

I attended a lovely lunch demo of Foglight by Quest Software earlier this week. As I was leaving and handing in my evaluation, I received a branded USB flash drive, the current hot freebie item. My first thought – “how big is it?” It was soon revealed to be 512 MB. Only 512 MB?? Really?? I had to kick myself at that point for even thinking that.

I recall being thrilled not that many years ago to get my hands on a 32 MB flash drive. Now that was amazing. I like getting free stuff. Do I use the free stuff I get? Rarely. I’m accumulating a fair collection of flash drives, and I’ve actually used one of them recently, but it was a 1 GB drive from VMware. Clocks, compasses, carabiners, stress balls: all collect dust as we trot from one vendor demo to another.

Getting something for free appeals to our sense of value. But does it influence our decision-making the way the vendors hope it does? Probably more than we’re conscious of.

What kind of gadgets do you like to get for free?

Where in the world is SQL Server 2008?

Alberta's NTS areas (courtesy Alberta Geological Survey)

We are only just beginning our foray into the world of SQL Server 2008. It has some fantastic new capabilities, like spatial data types and methods.

The problem:
We publish datasets and documents that are specific to certain geographic areas of Alberta. The public comes to our website and wants to find all information that pertains to a town, city or area. We currently use Oracle Spatial to calculate where our publications are geographically, but with the move off of Oracle, we need a new method.


Disclosure:
Prior to working on this problem, I had no experience with spatial data. It’s very likely that my methodology for resolving this issue is not the best one, but it works for me – for now!


Alberta is divided into areas based on the National Topographic System (NTS) [PDF]. Most of our publications have four geographic coordinates, in degrees, that define their study area. These coordinates can be used to calculate which publications intersect or overlap NTS areas.

I started by building a table of NTS areas.

-- NTS map sheets and their bounding polygons (WKT lists longitude before latitude)
CREATE TABLE NTS_Area (Sheet varchar(4), Sheet_Geo geography);
INSERT INTO NTS_Area (Sheet, Sheet_Geo) VALUES ('72e',
     geography::STGeomFromText('POLYGON((-110 49, -110 50, -112 50,
    -112 49, -110 49))', 4269));
INSERT INTO NTS_Area (Sheet, Sheet_Geo) VALUES ('72l',
     geography::STGeomFromText('POLYGON((-110 50, -110 51, -112 51,
    -112 50, -110 50))', 4269));

The STGeomFromText() method returns an instance of the geography type based on the first parameter, a well-known text (WKT) string that can describe a Point, MultiPoint, LineString, MultiLineString, Polygon, MultiPolygon, or GeometryCollection. A polygon requires a minimum of three distinct points; in this case, I am providing four distinct points to create a four-sided polygon, with a fifth point repeating the first in order to close the polygon.

The second parameter, 4269, is the spatial reference ID (SRID) and refers to the version of NAD83 that is our organizational standard. Other supported spatial reference systems can be found in the system catalog view sys.spatial_reference_systems.
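For example, to confirm what SRID 4269 represents:

SELECT spatial_reference_id, well_known_text
FROM sys.spatial_reference_systems
WHERE spatial_reference_id = 4269;
GO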

My source database is still on SQL Server 2000, so every night I pull the four coordinates, along with their key, into a table on my 2008 server for processing. Right now that is just over 3,500 records, after eliminating records that do not contain four coordinates, have invalid coordinates, or fall outside of Alberta.

CREATE TABLE [dbo].[BOUNDS](
     [ID] [decimal](13, 0) NOT NULL,
     [WEST_COORDINATE] [decimal](18, 12) NOT NULL,
     [EAST_COORDINATE] [decimal](18, 12) NOT NULL,
     [NORTH_COORDINATE] [decimal](18, 12) NOT NULL,
     [SOUTH_COORDINATE] [decimal](18, 12) NOT NULL,
     [ID_GEO] [geography] NULL,
PRIMARY KEY CLUSTERED ([ID])
);

I set up a trigger to create the geography object representing the polygon upon insert into the table.

-- Build the closed polygon from the four stored coordinates
UPDATE [dbo].[BOUNDS]
SET [ID_GEO] = geography::STGeomFromText('POLYGON ((' +
  CAST([EAST_COORDINATE] AS VARCHAR) + ' ' +
  CAST([SOUTH_COORDINATE] AS VARCHAR) + ', ' +
  CAST([EAST_COORDINATE] AS VARCHAR) + ' ' +
  CAST([NORTH_COORDINATE] AS VARCHAR) + ', ' +
  CAST([WEST_COORDINATE] AS VARCHAR) + ' ' +
  CAST([NORTH_COORDINATE] AS VARCHAR) + ', ' +
  CAST([WEST_COORDINATE] AS VARCHAR) + ' ' +
  CAST([SOUTH_COORDINATE] AS VARCHAR) + ', ' +
  CAST([EAST_COORDINATE] AS VARCHAR) + ' ' +
  CAST([SOUTH_COORDINATE] AS VARCHAR) + '))', 4269)
WHERE ID IN
  (SELECT ID FROM inserted);  -- limit the update to the rows just inserted

Publication polygon intersection with NTS area

The next task was to determine how the publication polygons intersect the NTS area polygons. To cut down on noise, we exclude any match where the intersection amounts to less than 10 percent of both areas.

You can see in the image to the right an example of how a publication polygon intersects multiple NTS areas. In this case, the study area is in the northeast corner of the province and includes part of Saskatchewan. We are interested only in NTS areas 74l and 74m, and the calculations will ignore the part of the polygon to the right of the two marked NTS areas.

I created a view to perform this calculation.

SELECT bc.[ID], nts.[SHEET]
FROM [dbo].[BOUNDS] AS bc
CROSS JOIN [dbo].[NTS_AREA] AS nts
WHERE (bc.[ID_GEO].STIntersection(nts.[SHEET_GEO]).STArea() /
      nts.[SHEET_GEO].STArea() > .1)
OR (bc.[ID_GEO].STIntersection(nts.[SHEET_GEO]).STArea() /
      bc.[ID_GEO].STArea() > .1)

So what does this mean? I will use the NTS area 74l in the image to the left (numbers have been rounded to simplify the math).

NTS area 74l and intersecting publication area

bc.[ID_GEO].STIntersection(nts.[SHEET_GEO]).STArea() / nts.[SHEET_GEO].STArea() > .1

Is the area of the intersection between the publication polygon and the NTS polygon divided by the area of the NTS polygon greater than .1 (10%)?

The area of intersection (green colour) is 8.5 and the total area of the NTS area is 12.9, resulting in an intersection percentage of 66%. This satisfies the criteria for inclusion.

bc.[ID_GEO].STIntersection(nts.[SHEET_GEO]).STArea() / bc.[ID_GEO].STArea() > .1

Is the area of the intersection between the publication polygon and the NTS polygon divided by the area of the publication polygon greater than .1 (10%)?

The area of intersection (green colour) is 8.5 and the total area of the publication study area is 129.4, resulting in an intersection of about 6.6%. Less than 10% of the study area intersects with this part of the province. This second calculation matters when a very small study area lies entirely within a single NTS area: it could fail the first test because it covers less than 10% of the NTS area, but the intersection would be 100% of the study area, so the publication would still be found by a search of that NTS area.

The results are then used to update a table in the originating SQL Server 2000 database with 34,000 new records.

This process works, but not as well as I would like it to. The whole job takes almost an hour to run, compared to 30 minutes on the Oracle server. I’m researching spatial indexes and the effective use of regular expressions in T-SQL, but I welcome any suggestions on how to improve this process.
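On the spatial index front, this is the kind of definition I’ve been experimenting with. A minimal sketch, where the index name and grid settings are illustrative rather than tuned:

-- Hypothetical spatial index on the publication polygons
-- (MEDIUM at every level and 16 cells per object are the defaults)
CREATE SPATIAL INDEX SIdx_Bounds_IdGeo
ON [dbo].[BOUNDS] ([ID_GEO])
USING GEOGRAPHY_GRID
WITH (GRIDS = (LEVEL_1 = MEDIUM, LEVEL_2 = MEDIUM, LEVEL_3 = MEDIUM, LEVEL_4 = MEDIUM),
      CELLS_PER_OBJECT = 16);
GO

One caveat I’ve read about: a spatial index supports predicates such as STIntersects(), so the view may need an STIntersects() = 1 prefilter ahead of the STIntersection().STArea() ratio tests before it can benefit.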

Discovering aliases and synonyms

Configuring a SQL alias

My Oracle to SQL Server 2008 migration is going well and is close to wrapping up. To make the transition as seamless as possible for the users, I wondered what options were available for reusing the known name of the Oracle server once it’s been taken offline. A DNS alias has already been set up to obscure the actual name of the SQL Server, but the instance name still has to be added. Our SQL Servers are all named instances; can I refer to a named instance as DBPROD instead of PRDDB\PRD1?

A search for ‘sql 2008 alias’ returns over a million results – no real surprise. The most popular results describe configuring an alias on the workstation or web server using SQL Server Configuration Manager. This looks promising, but it requires setup on individual machines. That seems like a lot more work than I want to do on a global scale, even in a small environment like ours, but it is definitely a possibility for some circumstances.

Then I started reading about synonyms, which were introduced in SQL Server 2005. They can’t be used to create an alias for a server, but they can alias database-level objects. Synonyms have some important advantages:

  • Reference objects by a simple name rather than a fully qualified name. This is handy when referencing objects in other databases or on other servers that require multi-part names.
  • Avoid hard-coding names in case objects move. Instead of having to update every reference to an object, the synonym can be updated or replaced.
  • Facilitate backward compatibility when objects are renamed.

Creating a synonym is painless:

CREATE SYNONYM synonym_name FOR object;

CREATE SYNONYM MyEmpTable FOR DBPROD.AdventureWorks.HumanResources.Employee;

To use the synonym:

SELECT JobTitle FROM MyEmpTable; 

While neither of these options provided me with a perfect solution to my original enquiry, it was definitely useful to investigate both. I already know of one place where I intend to use a synonym. I found it interesting that there is a request on Microsoft Connect to expand the use of synonyms to both databases and linked servers. I look forward to seeing if that feature makes it into the next version of SQL Server!

24 Hours of PASS

I signed up for numerous sessions of 24 Hours of PASS not long after receiving notification of the event. It seemed like an ideal opportunity to preview some of the excellent speakers I was hoping to see at PASS in Seattle in November. I had high hopes about being able to stay awake for some of the overnight presentations!

PowerShell in Management Studio

My 24 hours started with Allen White’s talk on using PowerShell for managing SQL Server. I have been keenly interested in learning more about PowerShell, but I had no idea how it could be used with SQL Server. I had seen the link in Management Studio and probably tried it once or twice, but without knowing what to do with it, I never bothered with it again.

Starting PowerShell from the Windows command line does not give you the same features as starting PowerShell from within SQL Server Management Studio. The new features are found in snap-ins, which contain SQL-specific cmdlets and an object explorer for navigating the SQL hierarchy. Fortunately, the snap-ins can be added from the standard PowerShell command line to enable the features outside of Management Studio.

I also learned that common Windows commands like ‘dir’ and ‘cd’ are not actual PowerShell commands, but aliases for Get-ChildItem and Set-Location respectively.

I was really looking forward to Greg Low’s Working with Spatial Data session as I am currently struggling with adapting a process that’s been running on Oracle Spatial for years to SQL Server. Unfortunately, the session happened at 3am; I woke up in time for it, but didn’t manage to stay awake for more than 15 minutes. Top of my list for viewing the recordings!

I tuned into Grant Fritchey’s Query Performance Tuning 101 at a more reasonable hour in the morning. The different methods of capturing data about performance will certainly come in handy in my work.

I regret now that I didn’t register in advance for Kalen Delaney’s What’s Simple about Simple Recovery Model presentation. Another recording to be reviewed carefully.

I am certainly looking forward to when PASS posts the recorded sessions online so that I can watch the ones that I missed and review the ones that I did watch. Bravo for providing this event to the SQL community! I hope this will become an annual event.

Disappointment but hope

A few months ago I was thrilled to get approval from my boss to attend the SQL PASS Summit in Seattle in November. It would have been my first opportunity to attend such a large and prestigious event, and I was so looking forward to the knowledge boost and renewed enthusiasm it would have given me. In recent months I’ve been finding and following more SQL experts on Twitter and reading more blogs, and I couldn’t believe that I’d actually have the opportunity to meet some of my new heroes.

Then along comes a recession to spoil my fun. I found out just this week that I would have to cancel my registration for PASS. Was I ever unhappy. Fortunately, SQLServerPedia is promoting a contest to win a trip to PASS for first-timers. One of the ways to win is by blogging, so this was finally the incentive I needed to start a blog. Do I think I have much chance of winning? Not really, but it beats not even making an attempt.

Otherwise, I may have to consider finding the money to go on my own (with flights via Aeroplan points). I know how important it is to invest in training to further a career. Had I known that I might be fishing the money out of my own pocket, I might have been more frugal on my summer vacation. So I may have to scale down my expectations and just go for the three main days, leaving the pre- and post-conference workshops for another time. I will also make the best use of the suggestions for getting to PASS on the cheap!

Steve Jones made a presentation during 24 Hours of PASS on building a better blog, so I now have some kind of plan for what I will do with this blog. I’m learning a lot about SQL Server 2008 right now, as I’m in the middle of a migration from Oracle, so I should be able to document some of the new things I discover along the way.

Thanks for reading! Don’t be surprised if the look of the blog changes frequently for the first while, as I try to settle on a theme that I like.

So how are we related, anyways?

Vésina descendants

Curious as to how we might be related to Jacques Vésina? From the descendant tree on the right, you can see that we are related through Emilda Nault.

So, Jacques, Marie, and family arrived in Nouvelle-France in 1659 and purchased some land around the Chutes Montmorency.

According to documents, Louise was entered into a marriage contract with Charles Garnier on December 21, 1664, when she was 12 years old. The dowry was 200 livres, plus dishes and cutlery. The actual marriage was delayed because of her young age, but it is not known exactly when it took place. Charles was from Tournebu in Normandy and was 28 years old.

We know Charles was in Nouvelle France because he witnessed a land deal on December 5, 1661. He was a farmer, and on January 1, 1664, he bought 2 arpents of land not far from Jacques’ property.

Louise went on to have 11 children. Five died before reaching the age of 21. Her 16-year-old son François died three weeks after her 20-year-old son Pierre; both probably drowned, possibly while fishing, in June 1700. Four daughters, including our ancestor Marie-Charlotte, and two sons survived to adulthood.

Louise died on December 1, 1714 around the age of 62. Charles was buried at Beauport on February 6, 1717 at the age of 81.

Software upgrade

I seem to have survived my first manual upgrade of WordPress. Not overly difficult, if you discount the problems with uploading via FTP. Hopefully there will be no problems as a result! This version is supposed to have some great new features, like CMS capabilities – time to explore!

I was thrilled to see that English ads finally showed up. It took a lot of convincing for Google to stop sending French ones!

Jacques Vésina, the early years

(The following details of Jacques Vésinat’s life are from the biographical essay written by Gérard Vézina, and I’m immensely grateful to GHS for spending a weekend reading and translating from the original French.)

According to Abbot J. B. Antoine Ferland, Jacques Vésina was a native of Puyravault, in the old French province of Aunis, now known as Charente-Maritime, near Rochefort, in the canton of Surgères, not far from the well-known city of La Rochelle. Although Aunis was the smallest province in France, it provided the largest number of colonists to Nouvelle-France (Québec).

The origin of the Vésina name is uncertain. It is thought by some to derive from “vésine,” which is a southwest wind referred to in certain regions in the Rhône. This wind was possibly followed by a downpour that’s called a “vézinée.” The family name Vézinat or Vésinat was common in Aunis in the 17th century.

Jacques was born around 1610 and although he was born in Puyravault, he spent many years in La Rochelle. His wife Marie Boisdon was born around 1617 and they were married June 10, 1640 in Puyravault. They initially lived in Saint-Rogatien, Marie’s birthplace. Their first child, François, was born in 1642, followed by Marie in 1649. It’s possible there were other children born during those seven years for which records do not exist.

During one census, Jacques listed his occupation as a cooper (barrelmaker). He later claimed to be a merchant, of what variety is unknown, but most likely related to his previous work as a cooper.

By 1659, when Jacques and his family left for Nouvelle-France, they had six children, including a second son also called François. Records of the time show that it was common for a godfather to ask that his godson be named after him, especially if he was important in society. The second François’ godfather was François Clément, one of the 100 Swiss guards of King Louis XIV.

Their reason for emigrating is not recorded, but some educated guesses can be made. In the seventeenth century, France was involved in numerous wars with England, Spain, the Netherlands and others, as well as experiencing violence between Catholics and Protestants. In 1627, Richelieu besieged La Rochelle, which was considered a Protestant bastion. The town resisted for more than a year before surrendering, after 15,000 of its 20,000 residents had died. Jacques was 17 years old when this happened.

In 1648, people were starving and unable to pay the royal taxes, and soldiers pillaged towns that didn’t contribute their share. It’s also likely that there was a heavy propaganda campaign to bolster the population of Nouvelle-France.

Of the six children, the records indicate that only five arrived with their parents in North America. The youngest, Jeanne, was only a few months old when they departed. Did she die during the trip? There are no passenger records to confirm this. Based on various other records, it can be deduced that the Vésina family was on the ship called the Saint-André, and that Jacques was a “free passenger”, meaning that he was not indebted in servitude to another.

It is presumed that the family arrived in the city of Québec on September 7, 1659, and Jacques bought land on January 11, 1660. It was two kilometres from the Chute Montmorency area on the Beaupré coast at Longue Point and he paid 120 livres.

More about their life in Canada to come!

A famous hockey player in our family tree?

Well, only very distantly.

My mom (Marthe) and I (Lee Anne) were recently in Québec City on vacation along with Marie and Derwyn Wilson. While visiting Montmorency Falls just outside the city, Marie wanted to stop at ‘Maison Vézina’ to see if it had anything to do with the hockey player after whom the Vezina Trophy is named. And yes, it does form part of the history of the family from which Georges Vézina is descended.

This property and the existing house were purchased in 1666 by François the Elder (there were two sons named François), oldest son of Jacques Vésina, and remained in the Vézina family for over 300 years. In the 1970s it was sold to a real estate developer who let it deteriorate. A few years ago, it was purchased by a historical society, which restored it to a near-original state.

In the house, you can view various archeological items that were found around the house during the restoration, as well as descriptions of the Vézina family life.

Fortunately I had brought with me my family tree as researched by La Société Historique de Saint-Boniface of Winnipeg in 2006, and discovered that we are also descended from Jacques Vésina, by way of his third daughter Louise.

I had the opportunity to purchase a “biography” of Jacques Vésinat (you’ll note the various name spellings) while visiting the house, and once translated, I will share interesting bits about how he and his family came to emigrate to Canada in the 1600s.