Sunday, July 18, 2010
Using the SQL Azure Migration Wizard v3.3.3 with the AdventureWorksLT2008R2 Sample Database
image George Huey’s SQL Azure Migration Wizard (SQLAzureMW) offers a streamlined alternative to the SQL Server Management Studio (SSMS) Script Wizard for generating SQL Azure schemas that conform to the service’s current Data Definition Language (DDL) limitations. You can download the binaries, source code, or both from CodePlex; a Microsoft Public License (Ms-PL) governs use of the software. Detailed documentation in *.docx format is available. The recent versions use SQL Server’s Bulk Copy Protocol (BCP) utility to populate SQL Azure tables with data from an on-premises database.
• Update 7/18/2010 9:00AM: George Huey updated SQLAzureMW from v.3.3.2 to v.3.3.3 on 7/17/2010 with 5,227 downloads of the new version on its first day. The last documentation release on 7/6/2010 covers v.3.3. Following are the changes to versions v3.1.4 and later (the previous version of this post covered v.3.1.3):
v3.3
- Added DateTime stamps on upload processing
- Added support for 50 GB SQL Azure databases
- Removed checks for HierarchyID and geography data types
- Refined check for the textptr function itself, not just the name
- Created SQLAzureMWUtils which is used by SQLAzureMW and SQLAzureMWBatch
- Added retry on 10054 SQL Azure connection errors
v3.2.2
- Added functionality to allow the user to specify where the BCP output files should be stored. The following items have been added to SQLAzureMW.exe.config file:
<add key="BCPFileDir" value="c:\SQLAzureMW\BCPData" /> <!-- Input / Output file location for BCP table data. If blank, then will use temp directory -->
<add key="DelOldBCPFiles" value="true" /> <!-- Deletes old BCP files if exists, otherwise, it will add a unique numeric ext -->
<add key="BCPFileExt" value="dat"/> <!-- BCP output file extension –>v3.2.1
- Added support for User Defined Table Types
v3.2
- Added validation of SQL Azure user name ext @SERVER to make sure that it matches server name SERVER.
v3.1.9
- Fixed code to not throw an exception when trying to generate a script for an encrypted stored procedure. A red error message will be displayed in the status page.
- Modified code to move non-clustered index creation until after data upload
- Put in check to make sure that @Server (where @Server is the SQL Azure Server) has been entered during login process.
v3.1.8
- On the DROP objects, put in a temporary fix to reverse the sort order of the objects. This means that as long as you have all of the dependent tables selected for the main table you want to drop, you will be OK. If you want to drop one table that has dependent tables on it, you will get errors. A fix is in process.
v3.1.7
- Fixed sorting routine to ignore case when comparing object names
v3.1.6
- Added functionality to create DROP scripts. You can set this option in SQLAzureMW.exe.config or dynamically during runtime.
v3.1.5
- Removed search for dbo.sysobjects from tables in NotSupportedByAzureFile.config
- Fixed script engine to handle a single TSQL file with no "GO" statements in it
v3.1.4
- Modified SQLAzureMW to put "GO" between Tables / Views and their indexes. This is needed when parsing SQLAzureMW TSQL output from a file. Without the "GO", the indexes were being parsed with Table / View rules, thus causing syntax errors in the generated TSQL.
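A minimal sketch of the batch-separator issue the v3.1.4 fix addresses (the table and index names here are hypothetical, for illustration only):

```sql
-- Without a GO between the table and its index, the file parser applied
-- Table/View rules to the CREATE INDEX statement, producing syntax errors.
CREATE TABLE [SalesLT].[Demo] (
    [DemoID] int NOT NULL PRIMARY KEY CLUSTERED,
    [Name] nvarchar(50) NOT NULL
)
GO  -- batch separator now emitted by SQLAzureMW
CREATE NONCLUSTERED INDEX [IX_Demo_Name] ON [SalesLT].[Demo] ([Name])
GO
```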
Most, if not all, workarounds in the step-by-step instructions below are unnecessary with v.3.3.3.
• Update 1/23/2010 4:00 PM PST: George Huey released v3.1.4 of his SQL Azure Migration Wizard (SQLAzureMW.exe) to CodePlex on 1/23/2010 with a fix for problems analyzing (parsing) T-SQL scripts from a file. If you don’t want to go through the entire process of analyzing and creating a T-SQL script from the on-premises AdventureWorksLT2008R2 database, I recommend using files from AdventureWorksLT.zip.
Update 1/22/2010 6:00 PM PST: George Huey has provided an AdventureWorksLT.zip archive with an updated T-SQL script and a set of BCP data files that run without errors in v3.1.3. To download and run these files, see the Temporary Workaround for v3.1.3 with a Modified AdventureWorksLT.sql File section at the end of this post.
Update 1/21/2010: Added workarounds for unsupported features (XML indexes and three-part column names) not handled by the Wizard’s NotSupportedByAzure.Config file and a bug related to importing TSQL scripts for execution, which will be fixed in the forthcoming v3.1.4. See the Dealing with Errors Resulting from Tables Containing xml Columns/Indexes and Three-Part Names section near the end of this lengthy post.
This post is an update of my Using the SQL Azure Migration Wizard with the AdventureWorksLT2008 Sample Database post last updated on 9/21/2009 for George’s v0.2.7 release and covers new requirements and features in v3.1.3 from the SQLAzureMW docx v2.0 documentation file:
Requirements: SQLAzureMW requires SQL Server 2008 R2 bits to run.
Project Details: The SQL Azure Migration Wizard (SQLAzureMW) gives you options to analyze databases, generate scripts, and migrate data (via BCP) from:
- SQL Server to SQL Azure
- SQL Azure to SQL Server
- SQL Azure to SQL Azure
It will also analyze SQL Profiler trace files and TSQL scripts for compatibility issues with SQL Azure.
In mid-January 2010, I updated the sections about SQLAzureMW (originally MigWiz) in Chapter 13, “Exploiting SQL Azure Database's Relational Features,” of my Cloud Computing with the Windows Azure Platform book for Wrox/Wiley to point to this post. SQLAzureMW is a moving target, as you can see from the workarounds at the end of this long article, so this post will be updated as George upgrades his Migration Wizard.
This post’s first section demonstrates the most common workflow for duplicating an on-premises SQL Server 2008 R2 database (AdventureWorksLT2008R2) in SQL Azure running in a Microsoft data center.
Note: This post is a preliminary attempt to find workarounds to problems observed in testing SQLAzureMW.exe v3.1.3 with the AdventureWorksLT2008R2 sample database when running on a VM with Windows Server 2008 R2 (no Hyper-V) as the guest OS. The host OS is Windows Server 2008 R2 Hyper-V.
Installing the AdventureWorksLT2008R2 November 2009 CTP Database
1. Download the source database as AdventureWorks2008R2_NovemberCTP.exe from the AdventureWorks 2008R2 November CTP release of 11/9/2009 on CodePlex.
2. Run AdventureWorks2008R2_NovemberCTP.exe and accept the default script destination folder: C:\Program Files\Microsoft SQL Server100円\Tools\Samples\.
3. Optionally, skip the Database Selection dialog’s Install operation by clicking Cancel and performing the following steps; otherwise click Install and you’re done.
4. Open C:\Program Files\Microsoft SQL Server100円\Tools\Samples\AdventureWorks2008R2LT\instawltdb.sql in SQL Server Management Studio 2008 R2 [Express] and connect to your SQL Server 2008 R2 [Express] instance’s master database.
5. Choose Query | SQLCMD Mode to execute the query in SQLCMD mode.
6. Uncomment the two :setvar statements highlighted in the following screen capture by removing the -- prefix to set the script and database file paths:
7. Execute the query and verify that all tables are present and contain data.
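After uncommenting, the two :setvar lines look something like the following sketch. The variable names match those used by the AdventureWorks install scripts, but the paths shown are assumptions for a default SQL Server 2008 R2 Express installation; use the values highlighted in your own copy of instawltdb.sql:

```sql
:setvar SqlSamplesSourceDataPath "C:\Program Files\Microsoft SQL Server100円\Tools\Samples\"
:setvar SqlSamplesDatabasePath "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\"
```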
Generating an SQL Azure Schema from an On-Premises SQL Server Database
1. Go to CodePlex’s Release page for SQL Azure Migration Wizard v3.1.3 (Beta) of 1/15/2010 and download SQLAzureMW v3.1.3 Release Binary (SQLAzure v3.1.3 Release Binary.zip).
2. Extract the zip files to a local folder, C:\Program Files\SQLAzureMW for this example.
3. Run SQLAzureMW.exe to open the ScriptWizard’s Select Process page.
4. Mark the Analyze and Migrate – SQL Database option, which enables migrating schemas and data to or from SQL Azure databases:
5. Click Next to open the Select Source page with the Connect To Server dialog active.
6. Select the on-premises SQL Server 2008 R2 server instance that contains the AdventureWorksLT2008R2 database from the Server Name list (the WINSRV2008SP2VM virtual machine for this example), and accept the Use Windows NT Integrated Security and 15-second Connection timeout defaults:
7. Click Connect to connect to the instance and open a list of available databases on the Select Source page, and select the source database (AdventureWorksLT2008R2 for this example):
8. Click Next to open the Choose Objects page, mark the Select Specific Database Objects option, and mark all object type check boxes except XML Schema Collections, which SQL Azure doesn’t support:
Optionally, click Advanced to display the Advanced Settings properties list.
9. Click Next to open the Script Wizard Summary page and expand the nodes to display individual database objects:
10. Click Next and click Yes in the Generate Script message box to open the end of the Result Summary Page:
Note: You can open the BCP output files, stored as tmp####.tmp in your \Users\UserName\AppData\Local\Temp1円 folder, in Notepad or a binary file editor, but native BCP files consist of length-prefixed data fields. Data in native BCP files is encoded as Unicode (UTF-16).
Click Save to save the Result Summary as the default .rtf file type in the \Users\UserName\Documents folder.
11. Click the SQL Script tab to display the end of the generated T-SQL script:
Click Save to save the Script as the default .rtf file type (AdventureWorksLT2008R2Script.rtf for this example) in the \Users\UserName\Documents folder.
12. Click Next to open the Setup Target Server Connection page with the Connect To Server dialog active, showing default SQL Azure Server Name, SQL Server security, User Name, and Connection timeout values.
13. Replace SERVER with a copy of your SQL Azure server name in two places and User Name with your administrative user ID, and type your password:
Note: You can avoid the need to retype the Server Name and your administrator credentials by replacing values of the following elements of the SQLAzureMW.exe.config file’s <appSettings> group:
<add key="TargetServerName" value="SERVER.database.windows.net"/>
<add key="TargetUserName" value="UserName@SERVER"/>
<add key="TargetPassword" value="YourPassword (optional)"/>
14. Click Connect and Create Database to open the Create Database dialog. Type the name of the SQL Azure database (AdventureWorksLT2008AZ for this example):
15. Click Create Database to add the new database to the list of the server’s databases, click Next to open the Execute Script message box and click Yes to execute the script against the SQL Azure database and display the Target Server Response page:
Click Save to save the Target Server Response as the default .rtf file type in the \Users\UserName\Documents folder.
Note: You can download SQLAzureMW RTF Files.zip from SkyDrive, which contains the three .rtf files shown above.
Errors Reported in the Target Server Response File
The Target Server Response file shows numerous red error messages, which require correction to enable migration of object types that SQL Azure supports. The most serious of these errors occur with the SalesLT.ProductModel table, which contains a CatalogDescription column of the xml datatype. SQL Azure supports the xml datatype, but not xml indexes, so creation of the table fails. The missing table then causes several other errors to cascade.
Error #: 40512 -- Deprecated feature 'More than two-part column name' is not supported in this version of SQL Server errors appear twice in the Target Server Response .rtf file. These three-part column name errors prevent creation of the SalesLT.uSalesOrderHeader and SalesLT.iduSalesOrderDetail triggers.
Dealing with Errors Resulting from Tables Containing xml Columns/Indexes and Three-Part Names
When using the SQL Azure Migration Wizard v3.1.3 with the SQL Azure Database December 2009 Release to Web version, the simplest approach appears to be deleting the PXML_ProductModel_CatalogDescription primary XML index from the source table (see this post’s first screen capture). In a production environment, however, deleting a primary XML index can seriously affect the performance of xml operations.
You must edit the T-SQL script to solve the three-part column name errors anyway, so removing the primary XML index generation statements at the same time is a practical alternative:
To edit the T-SQL Script file you created and saved in step 11 of the preceding process, do the following:
1. If you have an AdventureWorksLT2008AZ database, open the SQL Azure portal and delete it. (Although you can delete an existing database of the selected server in a Wizard dialog, you might encounter an unhandled exception.)
2. Launch SQLAzureMW.exe and complete steps 1 through 10 of the preceding section.
3. In the Results Summary dialog with the SQL Script tab selected, scroll to the
--~ Table [SalesLT].[ProductModel] -- CREATE PRIMARY XML INDEX is not supported in current version of SQL Azure
and delete the instructions shown selected here:
4. Scroll to the first instance of the
--~ Table [SalesLT].[SalesOrderHeader] -- Multi-part names ([SalesLT].[SalesOrderHeader].[RevisionNumber]) with Server or Database specifier is not supported in this version of SQL Azure.
comment and remove the three instances of the [SalesLT].[SalesOrderHeader]. prefix to change the code for the uSalesOrderHeader trigger to that shown here:
Also remove the red error line.
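The shape of the edit is sketched below. This is a hypothetical fragment, not the full trigger body; the WHERE clause shown is an assumption, but the RevisionNumber column comes from the wizard's error message:

```sql
-- Before (generated script): three-part column names trigger error 40512
UPDATE [SalesLT].[SalesOrderHeader]
SET [SalesLT].[SalesOrderHeader].[RevisionNumber] =
    [SalesLT].[SalesOrderHeader].[RevisionNumber] + 1
WHERE [SalesOrderID] IN (SELECT [SalesOrderID] FROM inserted);

-- After (edited): the [SalesLT].[SalesOrderHeader]. prefixes removed
UPDATE [SalesLT].[SalesOrderHeader]
SET [RevisionNumber] = [RevisionNumber] + 1
WHERE [SalesOrderID] IN (SELECT [SalesOrderID] FROM inserted);
```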
5. Scroll to the second instance of the Multi-part names error message and change the code for the iduSalesOrderDetail trigger to that shown here:
Also remove the red error line.
6. Click Save and save the script as an .rtf file (AdventureWorksLT2008R2FixScript.rtf for this example) to your Documents folder.
7. Continue with steps 12 through 15 of the preceding section. The dialog now appears as shown here:
8. Open the AdventureWorksLT2008AZ database in SQL Server 2008 R2 SSMS [Express] and verify that the SalesOrderHeader and SalesOrderDetail tables have a trigger.
Thanks to George Huey for providing details of the required workarounds.
To Create a T-SQL ScriptName.sql File from a Saved ScriptName.rtf File:
1. Open \Users\UserName\Documents\AdventureWorksLT2008R2FixScript.rtf (for this example) in WordPad or Microsoft Word (the version isn’t important), and save a backup copy.
2. Save the file as a text file (\Users\UserName\Documents\AdventureWorksLT2008R2FixScript.txt for this example) encoded as Unicode text.
3. Change the file extension from .txt to .sql.
To Create a T-SQL ScriptName.sql File by Copying Text from the Text Box:
1. Make the corrections to the Results Summary text box shown in steps 3, 4 and 5 of the preceding section.
2. With the text box selected, press Ctrl+A and Ctrl+C to copy the contents to the Clipboard.
3. Open a new query in SQL Server 2008 R2 SSMS [Express].
4. Press Ctrl+V to paste the text into the query.
5. Press Ctrl+F5, choose Query | Parse, or click the check mark on the toolbar to parse the query.
6. Save the script (as \Users\UserName\Documents\AdventureWorksLT2008R2FixScript.sql for this example.)
Download Demo Script Files for Testing:
The AdventureWorksLT2008R2FixScript.sql and AdventureWorksLT2008R2FixScriptNoBCP.sql files are available for download and inspection in AdventureWorksLT2008R2FixScript.zip from my SkyDrive folder.
AdventureWorksLT2008R2FixScriptNoBCP.sql omits the
-- BCPArgs:1:.dbo.BuildVersion" in "C:\Users\Administrator\AppData\Local\Temp1円\tmpEB01.tmp" -E -q -n -b 5000
instructions for adding data from temporary BCP native data files that are machine and instance specific. Errors reported as the result of temporary BCP files are non-fatal but time-consuming.
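For reference, each BCPArgs comment corresponds roughly to a bcp.exe command line such as the following sketch; the server name, credentials, and temporary file name are placeholders, and the -E (keep identity values), -q (quoted identifiers), -n (native format), and -b 5000 (batch size) switches come from the BCPArgs line above:

```
bcp AdventureWorksLT2008AZ.dbo.BuildVersion in "C:\Users\Administrator\AppData\Local\Temp1円\tmpEB01.tmp" -E -q -n -b 5000 -S SERVER.database.windows.net -U UserName@SERVER -P YourPassword
```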
To Use the Saved Script in v3.1.3 with a Temporary Workaround:
1. Launch SQLAzureMW.exe, select the Run TSQL without Analyzing – TSQL File option (rather than the Analyze and Migrate – TSQL File option), and browse to and select the AdventureWorksLT2008R2FixScript[NoBCP].sql file for this example:
2. Click Next twice, connect to your SQL Azure server instance. If the existing AdventureWorksLT2008AZ database contains objects, delete and recreate it.
3. Click Next to process the script.
Note: I investigated a problem importing a script file with the Analyze and Migrate – TSQL option, which causes many errors when analyzed, with George Huey. He reports that a bug in v3.1.3 removes some TSQL commands from the file that should be present. Until George releases v3.1.4 with the problem corrected, use the Run TSQL Without Analyzing option.
To Use Saved Scripts After the Bug Fix in v3.1.4:
1. Launch SQLAzureMW.exe, select the Analyze and Migrate – TSQL File option, browse to and select the AdventureWorksLT2008R2FixScript.sql file for this example:
Note: The script file with temporary BCP options included is shown above, but using the “NoBCP” version is recommended.
2. Click Next twice, connect to your SQL Azure server instance. If the existing AdventureWorksLT2008AZ database contains objects, delete and recreate it.
3. Click Next to process the script.
Temporary Workaround for v3.1.3 with a Modified AdventureWorksLT.sql File
George Huey has provided an AdventureWorksLT.zip archive with an updated T-SQL script and a set of BCP data files that run without errors in v3.1.3 and v3.1.4. To use these files with v3.1.3 or v3.1.4, do the following:
1. Create a C:\Temp folder if you don’t already have one.
2. Download the AdventureWorksLT.zip archive from SkyDrive, saving it in the C:\Temp folder.
3. Extract all files to the default subfolder: C:\Temp\AdventureWorksLT. The fixed path is required to process the BCP native data (*.DAT) files in the archive.
4. Follow the instructions in the preceding section, except substitute C:\Temp\AdventureWorksLT\AdventureWorksLT.sql for the path and file shown in the preceding screen capture.
5. Verify with SQL Server 2008 R2 SSMS that AdventureWorksLT2008AZ SQL Azure database contains the same objects, including the two database triggers, and data as the original AdventureWorksLT2008R2 on-premises database.
Posted by Roger Jennings (--rj) at 9:09 AM View Comments
Labels: Azure, Cloud Computing, SQL Azure, SQL Azure Data Migration, SQL Azure Database, SQL Azure Migration Wizard, T-SQL, Windows Azure
Monday, September 21, 2009
Using the SQL Azure Migration Wizard with the AdventureWorksLT2008 Sample Database
George Huey’s SQL Azure Migration Wizard (MigWiz) offers a streamlined alternative to the SQL Server Management Studio (SSMS) Script Wizard for generating SQL Azure schemas that conform to the service’s current Data Definition Language (DDL) limitations. You can download the binaries, source code, or both from CodePlex; a Microsoft Public License (Ms-PL) governs use of the software. You can learn more about MigWiz and watch a screencast from Wade Wegner’s SQL Azure Migration Wizard post of 9/1/2009.
•• Update 9/21/2009: There were still a few problems remaining with processing the raw AdventureWorksLT2008.sql script from SSMS that George’s v0.2.7 release solves:
v 0.2.7
- Modified code to parse index when loading from file
- Modified NotSupportedByAzureFile.config to correct Regex expression for index options.
You must manually add a clustered index primary key constraint to SSMS’s Script Wizard’s output for the AdventureWorksLT2008’s BuildVersion table as shown emphasized (bold) in the example below:
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[BuildVersion]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[BuildVersion](
    [SystemInformationID] [tinyint] IDENTITY(1,1) NOT NULL,
    [Database Version] [nvarchar](25) NOT NULL,
    [VersionDate] [datetime] NOT NULL,
    [ModifiedDate] [datetime] NOT NULL,
    /******* Added 9/21/2009 RJ ******/
    CONSTRAINT [PK_BuildVersion_SystemInformationID] PRIMARY KEY CLUSTERED
    (
        [SystemInformationID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
END
GO
If you don’t add the constraint, you will encounter many errors when attempting to generate the SQL Azure database.
The Attempting to Run a Raw SSMS-Created Script That Populates Table Data section has been renamed Running an SSMS-Created Script That Populates Table Data.
•• Update 9/19/2009: George’s v0.2.6 release solves the problems reported in the Attempting to Run a Raw SSMS-Created Script That Populates Table Data section. George’s Release Notes say:
v 0.2.6
- Implemented suggestion from rogerj to give the user the option to parse a TSQL file for incompatibilities
- Fixed parsing for Multi-part names with Server or Database specifier
- Modified parsing for ROWGUIDCOL to leave uniqueidentifier
- Modified the way a file is loaded (1000x performance increase)
Note that when parsing for “text” to replace with “varchar(max)”, if you have a comment in your code (e.g., Take the user’s text and save it), then text will be changed to varchar(max). If this causes you lots of problems, take the search for text out of the TSQL section of NotSupportedByAzureFile.config.
• Update 9/8/2009: Added promised Attempting to Run a Raw SSMS-Created Script That Populates Table Data section.
I wanted to add a few sections about the MigWiz to Chapter 13, “Exploiting SQL Azure Database's Relational Features,” of my forthcoming Cloud Computing with the Windows Azure Platform book for Wrox/Wiley, so I gave it a test drive with the AdventureWorksLT2008 sample database over the Labor Day weekend.
This post’s first section demonstrates the most common workflow for duplicating an on-premises SQL Server 2008 schema (but not data) for the AdventureWorksLT2008 in SQL Azure running in a Microsoft data center. In the process, I detected and worked around a few problems with MigWiz v.0.2.3 of 9/2/2009.
The second section, Attempting to Run a Raw SSMS-Created Script That Populates Table Data, describes an unsuccessful attempt to process a script file generated by SSMS’s Script Wizard that includes INSERT statements for adding data.
Note: Reducing the size of screen captures doesn’t cause a significant reduction in page size or loading time.
Generating an SQL Azure Schema from an On-Premises SQL Server Database
1. Run SQLAzureMW.exe to open the ScriptWizard page and click Connect to open the Connect to SQL Server dialog. With the default SQL Server tab active, select the server instance in the Server Name list, accept the default Use Windows NT Integrated Security option and 15-second Connection timeout, and click Connect to connect to the instance and open the Select Source page.
2. Select the on-premises server’s database that you want to script in the Select Source page and click Next to open the Choose Script Options page.
3. Accept the default options in the Choose Script Options page and click Next to display the Select Object Types page.
4. Click Select All and then clear the User Defined Data Types and XML Schema Collections check boxes, because the current SQL Azure CTP doesn’t support these object types:
Click Next to open the Choose Schemas page.
5. AdventureWorksLT2008 has only a single SalesLT schema, so select it.
Click Next to open the Choose Stored Procedures page, click Select All, and click Next to open the Tables page.
6. Click Select All to choose all tables for scripting.
Click Next to open the User Defined Functions page, click Select All, click Next to open the Views page, click Select All to choose all Views, click Next to open the Output Option page, accept the default Script to Window / SQL Azure option, and click Next to open the Script Wizard Summary page.
7. Expand the Script Wizard Summary page’s nodes to review your selections, options, and objects.
8. Click Script to generate the script, process it with the regular expressions from the NotSupportedByAzureFile.Config file, and display the corrected script with annotations in the Results Summary page:
Problems with SeverityLevel=”2”, if any, appear in red.
9. Click Next to open the Setup SQL Azure Connection page to set up for executing the script on SQL Azure running in the data center of your choice by pasting the server name from the connection string and entering the User Name and Password. Click Test Connection to display the combo list and select the previously created but empty AdventureWorksLT2008 database in the list:
If you didn’t create the database, type its name in the combo list. Click Script to execute the script against the SQL Azure cloud instance and display the SQL Azure Response page.
10. You’ll see an error in the SQL Azure Response page of v.0.2.3 of 9/2/2009 associated with an attempt to create an XML index on the ProductModel.CatalogDescription column, which SQL Azure doesn’t support:
There’s only a single instance of this error, so it’s easier to remove the offending instruction from the script than to write the regex.
11. You’ll also see another error near the end of the response caused by not detecting a PAD_INDEX = OFF directive in the CREATE UNIQUE CLUSTERED INDEX statement for the vProductAndDescription view:
This is another instance where simply removing the offending directives solves the problem: SORT_IN_TEMPDB = OFF must also be removed. The better choice is to remove the entire statement.
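The edit looks something like the sketch below. The index and column names are assumptions based on the AdventureWorksLT schema, and, as noted above, deleting the entire CREATE INDEX statement is the better choice:

```sql
-- Generated by SSMS's Script Wizard (index options rejected by SQL Azure):
CREATE UNIQUE CLUSTERED INDEX [IX_vProductAndDescription]
ON [SalesLT].[vProductAndDescription] ([Culture] ASC, [ProductID] ASC)
WITH (PAD_INDEX = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
GO

-- Edited (WITH options and filegroup clause removed):
CREATE UNIQUE CLUSTERED INDEX [IX_vProductAndDescription]
ON [SalesLT].[vProductAndDescription] ([Culture] ASC, [ProductID] ASC)
GO
```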
After fixing the two preceding issues, your response turns all green.
Running an SSMS-Created Script That Populates Table Data
Manually editing T-SQL scripts generated by SSMS’s Script Wizard is tedious at best. MigWiz v0.2.6 and later support T-SQL scripts created by SSMS’s Script Wizard. Here are the SSMS Script Wizard options I set to create the 5-MB, 6,590-line Raw_AdventureWorks2008LT_Azure.sql file, which is also included in the AdventureWorks2008LT_Azure.zip file on SkyDrive:
Here’s a list of the objects generated by the script:
Executing the script on SQL Azure running in the Production Fabric involves these steps:
1. Add the missing clustered primary key index constraint to the BuildVersion table as described near the beginning of this post, and save Raw_AdventureWorks2008LT.sql.
2. Click the Text File tab of the opening MigWiz dialog, mark the Parse TSQL for SQL Azure Incompatibles check box, click the Browse button, navigate to the location where you saved Raw_AdventureWorks2008LT.sql, and open it:
Click Next and wait while MigWiz crunches the file.
3. The script displayed in the SQL Script pane includes comments indicating unsupported T-SQL constructs, which have been removed by the regex instruction in the NotSupportedByAzureFile.Config file. The Result Summary pane is empty.
Click Next to open the Setup SQL Azure Connection page.
4. Complete the SQL Azure Connection page:
Click Next and wait for processing to complete to open the SQL Azure Response page. (Processing stops after encountering a large number of errors).
5. Making the modifications to Raw_AdventureWorks2008LT_Azure.sql results in only a few non-fatal errors near the end of the script:
Posted by Roger Jennings (--rj) at 10:28 AM View Comments
Labels: Azure, Cloud Computing, SQL Azure, SQL Azure Database, SQL Azure Migration Wizard, T-SQL, Windows Azure
Monday, June 16, 2008
LINQ and Entity Framework Posts for 6/12/2008+
Note: This post is updated daily or more frequently, depending on the availability of new articles.
Updated 6/15/2008 and 6/16/2008.
Illustrated Walkthroughs for Creating Web Apps and Web Sites with ASP.NET Dynamic Data
ASP.NET Dynamic Data projects come in two flavors:
- Dynamic Web applications or file-based Web sites that use either the Entity Data Model or LINQ to SQL as their data model.
- Static (Custom) Web sites that you create with the Dynamic Data Wizard. These sites currently are limited to LINQ to SQL only as their data source.
My forthcoming “Scaffold Web Apps with ASP.NET Dynamic Data” article for Visual Studio Magazine includes walkthroughs for creating the preceding two project types with Northwind as the backing database.
Publishing limitations precluded step-by-step screen captures, so I created two posts to illustrate the process:
- Dynamic Web application: ASP.NET Dynamic Data: Generate a Dynamic Administrative Web App from an Entity Data Model in Less than Five Minutes
- Static Wizard-generated Web site: ASP.NET Dynamic Data: Scaffold a Custom Administrative Web Site with a LINQ to SQL Data Model in Less than Five Minutes
An obvious drawback to the Wizard-generated site is lack of DynamicFilter dropdowns for filtering on EntityReference (foreign key) values.
Added: 6/15/2008
David Ebbo Explains ASP.NET Dynamic Data’s Handling of Foreign Keys in Scaffolded Dynamic Controls
David discusses how ASP.NET Dynamic Data displays EntityReferences from foreign keys in his Understanding foreign keys and relationship columns in Dynamic Data post of June 13, 2008. The discussion applies to grids and details views with Entity Data Model and LINQ to SQL as data models.
As David mentions, foreign key values (usually integers) appear when you use LINQ to SQL as the data model. The Entity Data Model replaces foreign keys with navigation properties (EntityRefs).
One reason folks become confused about foreign-key handling is that Dynamic Data requires LINQ-enabled data sources, such as the LinqDataSource and EntityDataSource, which implies using object terminology (e.g., classes, types, associations, entities, entity sets, entity references, etc.). This usage conflicts with relational terminology (primary and foreign keys.) The help files and blog posts have a tendency to favor the relational (data) dialect.
Mike Taulty Tackles Implementing IUpdatable() for LINQ to SQL DataContexts Backing ADO.NET Data Services
Mike’s ADO.NET Data Services - IUpdatable on LINQ to SQL post of June 14, 2008 discusses four possibilities for creating an updatable LINQ to SQL data model for use with ADO.NET Data Services. His final choice (#4) requires the developer to derive from <T> manually, as is the case with ADO.NET Data Service’s Web services.
Note: The Data Programmability group’s failure to update LINQ to SQL with an IUpdatable implementation is one more nail in the product’s coffin. (I neglected to mention this issue in my Is the ADO.NET Team Abandoning LINQ to SQL? post of May 23, 2008.)
asp.netPRO Magazine’s June Issue Carries an ASP.NET Dynamic Data Article That’s Finally Accessible Online
The aspnetNOW newsletter for June 12, 2008 says the “ASP.NET team's own Marcin Dobosz and Scott Hunter provide an introduction to Dynamic Data,” and “the June issue of asp.netPRO is now available as a FREE PDF.”
“An Introduction to ASP.NET Dynamic Data: Rapidly Create Rich Data-Bound Sites” is now accessible at http://www.aspnetpro.com/PDF/Issues/aspJUNE2008.pdf.
Bart De Smet Posts First Episode of His LINQ to MSI Implementation
Bart says in his LINQ to MSI - Part 1 – Interop post of June 13, 2008:
So, in this post of this series we'll take a look at some very simple MSI interop, giving us a basic data provider for MSI databases, mirrored after the typical structure of .NET data providers like System.Data.SqlClient.
And then goes on to discuss:
- Interop signatures
- Respecting well-known patterns
- The connection
- The command
- The data reader
- Sample
He adds a postscript to the preceding post in LINQ to MSI - Part 1 bis - Interop with SafeHandles of June 14, 2008.
Mike Taulty Diagnoses DataSvcUtil’s Namespace Errors When Creating Web Service Proxies
Mike solves issues with “The 'Namespace' attribute is invalid - The value '' is invalid according to its datatype 'http://schemas.microsoft.com/ado/2006/04/’” errors in his ADO.NET Data Services and LINQ to SQL - Errors Generating Proxy Code with DataSvcUtil.exe post of April 13, 2008.
Matthieu Mezil Illustrates Use of AsEnumerable() That Mystifies Me
Matthieu’s AsEnumerable: not only to use unsupported methods post of June 13, 2008 discusses the differences in the results of a pair of complex LINQ to SQL queries whose primary distinction is use of AsEnumerable() to solve a problem obtaining Northwind Order instances with OrderDates in 1998 having French customers together with their Customer instances.
So I wrote the following relatively simple query that returned what I thought Matthieu was after:
from o in Orders
where o.OrderDate.Value.Year >= 1998 && o.Customer.Country == "France"
select new { Order = o, Customer = o.Customer };
The result of the above query in LINQPad and SQL Server Management Studio appear in my Matthieu Mezil Illustrates Use of AsEnumerable() That Mystifies Me post of June 14, 2008. The post also includes results of running Matthieu’s T-SQL batches which 1) return all 270 orders for 1998 or 2) are a LEFT OUTER JOIN of all 1998 orders with 23 French Customer records.
I’m sure I must be missing a point, but I don’t know what it is.
Update 6/15/2008 1600 PST: Adding AsEnumerable() causes immediate execution of the inner query, which returns an OUTER JOIN resultset of 270 rows. Local (client-side) filtering reduces the number of rows to 23. Use of AsEnumerable() in this case unleashes bad mojo. See Matthieu’s comment and the second Update.
Eugenio Pace Posts a New SQL Server Data Services Sample: A Provider for BlogEngine.NET
Eugenio’s Another SSDS sample: BlogEngine.NET on SSDS post of June 11, 2008 describes his SQL Server Data Services (SSDS) provider’s source code for the BlogEngine.NET 1.3.1 ASP.NET blog project as follows:
- The SSDS based BlogProvider
- An SSDS based Membership & Role Providers for the web site
- Unit tests for all (with about 93% coverage for the provider)
- A simple tool for pre-loading the SSDS container with information that BlogEngine needs to start
He also says “The new SSDS BlogProvider also uses patterns & practices Unity application block to wire up dependencies.”
It’s related to the SSDS demo in Bill Gates’ Tech*Ed keynote and definitely worth a trial run.
Repeated from LINQ and Entity Framework Posts for 6/9/2008+ due to its importance.
Posted by Roger Jennings (--rj) at 1:44 PM 0 comments
Labels: ADO.NET Data Services, ASP.NET Dynamic Data, EDM, Entity Data Model, Entity Framework, Entity SQL, LINQ, LINQ to Entities, LINQ to MSI, LINQ to SQL, T-SQL
Saturday, June 14, 2008
Matthieu Mezil Illustrates Use of AsEnumerable() That Mystifies Me
Matthieu’s AsEnumerable: not only to use unsupported methods post of June 13, 2008 discusses the differences in the results of a pair of complex LINQ to Entities queries whose primary difference is use of AsEnumerable() to solve a problem obtaining Order instances with OrderDates in 1998 having French customers together with their Customer instances.
Updated 6/15/2008: See tests with Entity Framework below.
To retrieve the Order/Customer instances, I would have used a LINQ to SQL (or LINQ to Entities) query, with the LINQ to SQL results shown here in LINQPad:
The preceding query generated the following T-SQL batch:
SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry], [t2].[test], [t2].[CustomerID] AS [CustomerID2], [t2].[CompanyName], [t2].[ContactName], [t2].[ContactTitle], [t2].[Address], [t2].[City], [t2].[Region], [t2].[PostalCode], [t2].[Country], [t2].[Phone], [t2].[Fax]
FROM [Orders] AS [t0]
LEFT OUTER JOIN (
SELECT 1 AS [test], [t1].[CustomerID], [t1].[CompanyName], [t1].[ContactName], [t1].[ContactTitle], [t1].[Address], [t1].[City], [t1].[Region], [t1].[PostalCode], [t1].[Country], [t1].[Phone], [t1].[Fax]
FROM [Customers] AS [t1]
) AS [t2] ON [t2].[CustomerID] = [t0].[CustomerID]
WHERE (DATEPART(Year, [t0].[OrderDate]) >= @p0) AND ([t2].[Country] = @p1)
-- @p0: Input Int (Size = 0; Prec = 0; Scale = 0) [1998]
-- @p1: Input NVarChar (Size = 6; Prec = 0; Scale = 0) [France]
-- Context: SqlProvider(Sql2005) Model: AttributedMetaModel Build: 3.5.30428.1
That batch returned the following result in SQL Server Management Studio:
which looked correct to me.
I wasn’t able to execute Matthieu’s queries in LINQPad, but I did execute his two T-SQL queries in SQL Server Management Studio.
Here’s the first result, which doesn’t apply the Country = ‘France’ constraint:
And here’s the second batch’s result set from the query that added AsEnumerable():
Which is a LEFT OUTER JOIN and doesn’t look to me like what Matthieu said he was looking for.
I’ve asked Matthieu to clarify his objective for me.
Update: 6/15/2008: In a reply to my comment on Matthieu’s post, he says:
My post wasn't clear so I change it.
I think the difference is that you do a LINQ to SQL query and I do a LINQ To Entities query.
To see if there were differences using an Entity Data Model, I created a simple WinForm app with a Northwind ObjectContext.
My LINQ to Entities query:
using (NorthwindEntities ctxNwind = new NorthwindEntities())
{
    var query = from o in ctxNwind.Orders
                where o.OrderDate.Value.Year >= 1998
                    && o.Customers.Country == "France"
                select new
                {
                    OrderID = o.OrderID,
                    OrderDate = o.OrderDate,
                    CustomerID = o.Customers.CustomerID,
                    Country = o.Customers.Country
                };
    dgvQueryResult.DataSource = query.ToList();
}
gave the expected result: 23 rows of orders with customer data in a DataGridView. 17 of the 23 rows are shown here:
I then ran Matthieu’s LINQ to Entities queries with and without AsEnumerable():
using (NorthwindEntities ctxNwind = new NorthwindEntities())
{
    var query = from oc in
                    from o in ctxNwind.Orders
                    where o.OrderDate.HasValue
                        && o.OrderDate.Value.Year >= 1998
                    select new
                    {
                        Order = o,
                        Customer = ctxNwind.Customers
                            .Where(c => c.CustomerID == o.Customers.CustomerID
                                && c.Country == "FRANCE")
                            .FirstOrDefault()
                    }
                select new
                {
                    OrderID = oc.Order.OrderID,
                    OrderDate = oc.Order.OrderDate,
                    CustomerID = oc.Customer.CustomerID,
                    Country = oc.Customer.Country
                };
    dgvQueryResult.DataSource = query.ToList();
}
and
using (NorthwindEntities ctxNwind = new NorthwindEntities())
{
    var query = from oc in
                    (from o in ctxNwind.Orders
                     where o.OrderDate.HasValue
                         && o.OrderDate.Value.Year >= 1998
                     select new
                     {
                         Order = o,
                         Customer = ctxNwind.Customers
                             .Where(c => c.CustomerID == o.Customers.CustomerID
                                 && c.Country == "FRANCE")
                             .FirstOrDefault()
                     }).AsEnumerable()
                select new
                {
                    OrderID = oc.Order.OrderID,
                    OrderDate = oc.Order.OrderDate,
                    CustomerID = oc.Customer.CustomerID,
                    Country = oc.Customer.Country
                };
    dgvQueryResult.DataSource = query.ToList();
}
Both of the preceding queries returned the same data as shown in the last T-SQL example above: a 270-row outer join with 23 rows containing customer data. Adding the AsEnumerable() Standard Query Operator does nothing that I can determine.
My MezilEntitiesQueriesTest project is available for download here. The project expects to find the Northwind sample database attached to a local SQL Server 2005 Express instance (localhost\SQLEXPRESS).
2nd Update 6/15/2008 1545 PDT: The Light Dawns (sort of) … Bad Things Are Happening Here
Matthieu’s comment clarifies an invisible and probably unwanted effect of adding AsEnumerable() to the first query: My outer select clause returns scalar values from Customers and Orders. His outer select oc.Order clause returns only Orders entities and makes no request for Customer-related data.
Although Matthieu says near the beginning of his post that we “need to load into our context the right orders and customers,” and later that “the SQL request gets the orders we want and the customers we want,” the truth is, he doesn’t need or want the customers. If he had wanted the customers, he would have added oc.Customer[.Property] to his select clause.
LINQ to Entities’ (or LINQ to SQL’s) deferred execution implementation doesn’t write an extraordinary amount of T-SQL to return data that isn’t used. Adding AsEnumerable() caused immediate execution of Matthieu’s inner select clause, which includes a requirement for Customers. As a result, a very large number of cycles were wasted.
What’s worse, however, is that adding AsEnumerable() forced local filtering of the 270-row OUTER JOIN result set to include only 23 Customers+Orders instances in the query output.
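The effect is easier to see stripped to its essentials. In this minimal sketch (a hypothetical Northwind context; only the query boundary matters), everything before AsEnumerable() is translated to T-SQL, and everything after it runs as LINQ to Objects on the client:

```csharp
using (var ctx = new NorthwindEntities()) // hypothetical context
{
    // Both filters stay in the database: one SQL statement,
    // roughly 23 rows cross the wire.
    var serverSide = ctx.Orders
        .Where(o => o.OrderDate.Value.Year >= 1998)
        .Where(o => o.Customers.Country == "France");

    // AsEnumerable() ends SQL translation here: all 270 orders for 1998
    // cross the wire, and the Country filter runs in memory on the client.
    var clientSide = ctx.Orders
        .Where(o => o.OrderDate.Value.Year >= 1998)
        .AsEnumerable()
        .Where(o => o.Customers.Country == "France");
}
```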
Don’t try this at work. Also, it’s a good idea to execute the T-SQL batches of complex queries in Query Analyzer (or whatever) now and then to see if your LINQ to SQL or LINQ to Entities query is returning more rows than you’re using.
Posted by Roger Jennings (--rj) at 3:14 PM 6 comments
Labels: Entity Framework, Entity SQL, eSQL, LINQ, LINQ to Entities, LINQ to SQL, T-SQL
Sunday, May 11, 2008
LINQ and Entity Framework Posts for 5/7/2008+
Note: This post is updated daily or more frequently, depending on the availability of new articles.
Dinesh Kulkarni: Removing an Item from a LINQ to SQL Collection Doesn't Delete It
Dinesh observes in his LINQ to SQL Tips 4: Use DeleteOnNull if you want to delete object with null FK post of May 11 that:
If I remove one of the O[r]derDetails from Order.OrderDetails collection, I see that the reference OrderDetail.Order is set to null but this just orphans the OrderDetail; it is not deleted. How can I change that?
Dinesh provides a pair of alternative changes to the generated code to infer an invocation of the DataContext.DeleteOnSubmit() method.
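For reference, the relevant change is to the AssociationAttribute on the child class's relationship property. A rough sketch against Northwind follows (generated accessor bodies are simplified; DeleteOnNull applies only when the foreign key is non-nullable):

```csharp
// In the generated OrderDetail class: because OrderDetail.OrderID is a
// non-nullable foreign key, DeleteOnNull = true tells LINQ to SQL to
// delete the OrderDetail when its Order reference is set to null (which
// is what removing it from Order.OrderDetails does), instead of leaving
// an orphan that fails on SubmitChanges.
[Association(Name = "Order_OrderDetail", Storage = "_Order",
             ThisKey = "OrderID", OtherKey = "OrderID",
             IsForeignKey = true, DeleteOnNull = true)]
public Order Order { get; set; } // generated accessors simplified here
```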
Added: May 11, 2008
Ilan Assayag Finds LINQ's ElementAt Method is Slow
In his Linq: Composite keys don't work + Beware of ElementAt ... post of May 11, 2008, Ilan can't use one of Shital Shah's four approaches for LINQ to Object joins on composite keys to join a LINQ to SQL result and an in-memory List<T> of the same length. So he tried using the ElementAt() Standard Query Operator and found it took "AGES" to complete. The solution was to turn the inputs into arrays and process them.
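The slowdown follows from ElementAt()'s contract: on a sequence that doesn't implement IList<T>, each call re-enumerates from the beginning. A minimal sketch (names and sizes are illustrative):

```csharp
// 'results' is a deferred, non-indexed sequence, so ElementAt(i)
// walks i elements from the start on every call: O(n²) overall.
IEnumerable<int> results = Enumerable.Range(0, 10000).Select(n => n * 2);
for (int i = 0; i < 10000; i++)
{
    int slow = results.ElementAt(i); // re-enumerates each pass
}

// Buffering once makes each access O(1): O(n) overall.
int[] buffered = results.ToArray();
for (int i = 0; i < buffered.Length; i++)
{
    int fast = buffered[i];
}
```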
Added: May 11, 2008
EntitySpaces to Support LINQ to SQL Libraries in Next Release
EntitySpaces is a commercial O/RM tool that competes with LINQ to SQL and Entity Framework. The Entity Spaces blog's May 11, 2008 announcement, EntitySpaces 2008 - LINQ to SQL in Next Beta, states that the product's generated classes will include classes from System.Data.Linq and System.Data.Linq.Mapping if you mark the LINQ to SQL Support check box of the Advanced page of the configuration dialog.
Applications that take advantage of this option are limited to the SqlClient provider and SQL Server as the data store. EntitySpaces has its own providers for Microsoft Access, Oracle, MySQL, VistaDB and PostgreSQL.
The EntitySpaces 2008 - LINQ to SQL in Next Beta (Part 2) post of the same date says that you can use LINQ to SQL to populate data into an EntitySpaces collection directly.
Added: May 11, 2008
Beth Massi Demonstrates Finding XML Nodes and Changing their Value with LINQ to XML
Her Quickly Changing Values of XML Elements Using LINQ post of May 9, 2008 shows a simple LINQ to XML query to find a particular node and then change the value of two of its members within a For Each loop.
Applying the SingleOrDefault() Standard Query Operator would be a better alternative if you want to make sure that only one node is updated (see the next post).
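In C#, the safer version of that update might look like the sketch below (file and element names are invented). SingleOrDefault() throws if the query matches more than one element, so an accidental duplicate fails fast instead of being silently updated twice:

```csharp
// Load, locate exactly one matching node, update two values, save.
XDocument doc = XDocument.Load("customers.xml"); // hypothetical file

XElement customer = doc.Descendants("Customer")
    .SingleOrDefault(c => (string)c.Attribute("ID") == "ALFKI");

if (customer != null)
{
    customer.Element("Phone").Value = "030-0074322";
    customer.Element("City").Value = "Berlin";
}

doc.Save("customers.xml");
```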
Added: May 11, 2008
Jimmy Bogard: Minimize foreach Loops and Temporary Lists with LINQ Query Operators
Jimmy suggests transformations, aggregations, concatenations, and filtering operations as candidates for simplification with LINQ in his LINQ query operators: lose that foreach already! post of May 9, 2008.
Added: May 11, 2008
Darrell Mozingo Proposes Using Data Transfer Objects with LINQ to SQL
His Loading a DTO from LINQ To SQL post of May 9, 2008 starts by defining the Data Transfer Object (DTO) pattern:
A Data Transfer Object (DTO) is a plain old CLR object (POCO) mainly used to pass data between tiers and remote calls. They’re basically classes with getters, setters, and perhaps a constructor or two, but no methods. They’re dumb objects that simply hold data.
He then goes on to explain the benefits of a DTO as an intermediary between LINQ to SQL objects and other application layers or tiers and then shows the LINQ code to create a DTO for a PersonDTO object with a GetPerson() method. The Business layer then invokes the PersonDTO.GetPerson(personID) method and executes similar code in the business layer.
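A minimal sketch of the pattern (not Darrell's actual code; the context and property names are invented):

```csharp
// Dumb DTO: getters and setters only, no behavior.
public class PersonDTO
{
    public int PersonID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// The data layer projects the LINQ to SQL entity into the DTO, so
// upper tiers never see the mapped class or the DataContext.
public static class PersonData
{
    public static PersonDTO GetPerson(int personID)
    {
        using (var ctx = new PeopleDataContext()) // hypothetical DBML context
        {
            return (from p in ctx.Persons
                    where p.PersonID == personID
                    select new PersonDTO
                    {
                        PersonID = p.PersonID,
                        FirstName = p.FirstName,
                        LastName = p.LastName
                    }).SingleOrDefault();
        }
    }
}
```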
PersonDTO looks to me to be a DTO with a method and it's definitely read-only.
Martin Fowler recommends considering DTOs only for passing data to remote tiers. He quotes contributing author Randy Stafford:
Don't underestimate the cost of [using DTOs].... It's significant, and it's painful - perhaps second only to the cost and pain of object-relational mapping.
Added: May 11, 2008
Matt Hunter Releases RC1 of LINQ to Entity Base
LINQ to Entity Base is a base class that provides change tracking while disconnected from the entity context and supports WCF. Matt's Version 1.0 Release Candidate 1 of LINQ to SQL Entity Base released!!! post of May 9, 2008 incorporates the following changes:
- Added two static helper methods for serialization/de-serialization of entities.
- Now automatically returns KnownTypes if Entity Base class is in the same assembly as your Entity classes.
- You can now set the initial state of the root entity (e.g. New or Deleted)
- Demo is now in the form of a Client/Server architecture, with WCF used for communication.
- Added "LINQEntityState" property which returns an enum indicating the state of the entity.
You can download LINQ to Entity Base RC1 from CodePlex.
Added: May 9, 2008
Mark Blomsma Builds a Simple RSS Aggregator with LINQ to Objects and WCF
Most RSS/Atom aggregators use LINQ to XML, but Mark's Building a RSS aggregator using LINQ to Objects and WCF 3.5 post of May 9, 2008 takes advantage of WCF's SyndicationFeed.Load() and SyndicationFeed(items) methods to take the prize for code brevity.
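A rough idea of how little code that takes (the feed URLs are placeholders; SyndicationFeed lives in System.ServiceModel.Syndication, new in .NET 3.5):

```csharp
// Parse each feed with WCF's syndication classes, then let LINQ to
// Objects merge and sort the items into one aggregated list.
var urls = new[]
{
    "https://example.com/feed-a.rss",
    "https://example.com/feed-b.rss"
};

var latest = urls
    .Select(u => SyndicationFeed.Load(XmlReader.Create(u)))
    .SelectMany(feed => feed.Items)
    .OrderByDescending(item => item.PublishDate)
    .Take(20);

foreach (var item in latest)
    Console.WriteLine("{0:d}  {1}", item.PublishDate, item.Title.Text);
```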
Added: May 9, 2008
Julie Lerman Explains the "Platinum Rule" for Attaching an Object Graph
Julie explains that "an object graph (for example when a customer is attached to an order which is attached to some line items) must be completely in or completely outside of the ObjectContext" in her Entity Framework's Golden Rule and its Platinum Rule post of May 8, 2008.
She attached an Order instance to a Customer instance, attached the Customer (but not the Order) to an ObjectContext for change tracking, and found that the Order was also attached and being change tracked because the association created an object graph. But if you detach the Customer, the Order (and its LineItems, if present) remains attached.
Julie's Advanced Entity Framework: Take Charge of Object Services session for Tech*Ed North America Developers 2008 conference:
"... drills into how the ObjectContext manages relationships and how you can control its behavior. This is especially important in SOA scenarios where you may need to transport ObjectGraphs. We also look closely at change tracking, focusing on the challenges and solutions for dealing with data concurrency when moving objects across tiers in your enterprise applications."
She's also presenting Microsoft .NET Framework 3.5 Data Access Guidance.
Added: May 9, 2008
Jim Wooley Investigates Using Unmapped Properties in a Mapped Class in a LINQ to SQL Projection
In his Projecting into an unmapped property from a LINQ to SQL query post of May 8, 2008, Jim says:
On page 216 of LINQ in Action, I made a comment that unmapped properties in a mapped class cannot be used in a LINQ to SQL projection. This was true with the beta bits, but only partially true with the release bits.
However, he concludes:
So the question that came up asks if the next printing of the book needs to be adjusted to remove the statement that you can't project an unmapped property. While you can indeed project these properties, you can't use them elsewhere in the query. ... Because of this, I plan to leave the note in the chapter warning you of using the unmapped property in your query.
Added: May 9, 2008
PHP/Rails/MySQL Developer Finds ASP.NET and LINQ to SQL Fast, Cost-Effective and Reliable
Howard Baines, a UK Web design and development agency, says in a Building a web app the Microsoft way post of May 8, 2008:
We’ve worked with PHP, Rails, MySQL and others, and have experienced the highs and lows of doing so. Late last year we decided to fly in the face of popular opinion and build a web app of our own… using the Microsoft platform. ...
We decided to build a meeting organizer application which we called ‘Meet with Approval’. The application allows anyone who wants to arrange a meeting to create a dedicated event page with date and time options and send out email invites. ...
The real bonus came in the form of the InteliSense in Visual Studio which not only made learning and writing LINQ [to SQL] queries fast but also supported our underlying data model. InteliSense within Visual Studio included support for frontend JavaScript which again helped to speed up development. Moreover the InteliSense features really cut down on the number of time wasting runtime errors and helped to reduce bugs. In fact using the various prebuilt controls and functions of .NET helped us to significantly cut down the number of bugs. ..
Within three weeks we were able design, develop and deploy our application at extremely low cost. Since launching in late 2007 we have had only 2 bugs reported with over 2000 users in the system and over 500 meetings created. We felt that we were able to deliver on our objectives of fast, cost effective, reliable development that produced a great user experience which you can judge for yourselves at www.meetwithapproval.com.
Their case study is on the Microsoft website as part of the Heroes Happen Here Launch Evidence: Howard Baines: Start-Up Migrates from Open Source and Develops Web Solution in Just Three Weeks.
Added: May 9, 2008
Matt Hidinger Adds Audit Updates Feature to LINQ to SQL
In his LINQ to SQL Audit Trail post of May 8, 2008, Matt explains his approach to adding an audit trail to a LINQ to SQL application:
Natu[r]ally, there are many ways to tackle this [auditing] problem. In the past I have either relied on writing database Triggers for the tables, or wrote the Auditing logic right into the stored procedure that was doing the modification. But this time, since the project is utilizing LINQ to SQL, I had a more ambitious idea in mind.
The Goal
Create a flexible, and automated Auditing solution that would allow my team to add Audit tracking to any table in our database, with only a single line of C# code.
The sample code, which audits changes to the Northwind Products and Categories tables, and an ASP.NET demo project are available for download here.
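Matt's actual implementation is in the download, but one common way to centralize such a hook in LINQ to SQL (a sketch, not his code) is to override SubmitChanges and inspect the pending change set:

```csharp
public partial class NorthwindDataContext // hypothetical DBML-generated context
{
    public override void SubmitChanges(ConflictMode failureMode)
    {
        // GetChangeSet() exposes every pending insert, update, and delete
        // before they're flushed to the database.
        ChangeSet changes = GetChangeSet();

        foreach (object inserted in changes.Inserts)
            WriteAuditRecord("Insert", inserted); // hypothetical audit helper
        foreach (object updated in changes.Updates)
            WriteAuditRecord("Update", updated);
        foreach (object deleted in changes.Deletes)
            WriteAuditRecord("Delete", deleted);

        base.SubmitChanges(failureMode);
    }
}
```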
Added: May 9, 2008
LINQ, Entity Framework, ADO.NET Data Services, SQL Server Data Services, Data Synchronization, and SQL Server Compact at Tech*Ed
The following two posts of May 8, 2008 list and describe the sessions at Tech*Ed 2008 North America Developers, June 3-6, 2008, in the Orange County Convention Center, North/South Building, Orlando, Florida, that are related to this blog's primary topics:
- LINQ and Entity Framework Breakout and TLC Sessions at Tech*Ed Developers 2008
- ADO.NET Data Services, SQL Server Data Services and Synchronization Sessions at Tech*Ed Developers 2008
The preceding lists were compiled from the Tech*Ed Developers 2008 Session Catalog.
Added: May 8, 2008 16:00 PST Updated: May 9, 2008
Eugenio Pace Explains How to Enable LitwareHR's Offline Proxy
In his You don't have a beta account with SSDS, but you'd like to use the latest LitwareHR anyway post of May 8, 2008, Eugenio explains that the LitwareHR front-end for SQL Server Data Services also has a proxy to enable off-line development and shows you how to enable it if you don't have an SSDS beta account.
Added: May 8, 2008 16:00 PST
Doug Rothaus Continues his Visual Basic XML Odyssey
Doug's VB XML Cookbook, Recipe 4: Get Inner XML from the Nodes Property (Doug Rothaus) post of May 8, 2008 discusses the differences between System.Xml's InnerText and InnerXML properties:
If you’re familiar with the classes in the System.Xml namespace, you will immediately recognize this behavior. The solution is to embed the inner XML of the element rather than the inner text (the Value property). While the System.Xml classes expose both InnerText and InnerXml properties, LINQ to XML does not. Inner text is returned from the Value property, and inner XML is returned from the Nodes property. Why the difference? The InnerXml property returns a string of XML, which would have to be re-parsed into XML objects. The Nodes property returns a collection of LINQ to XML objects, which you can manipulate or embed as a whole into XML Literals making the Nodes property a powerful tool for transforming XML.
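Doug's recipes are in Visual Basic with XML literals; the same Value-versus-Nodes distinction in C# terms looks roughly like this:

```csharp
XElement source = XElement.Parse(
    "<Para>Some <Bold>rich</Bold> content</Para>");

// Value flattens everything to inner text: "Some rich content"
string innerText = source.Value;

// Nodes() returns the child XText and XElement objects themselves,
// so the markup survives when they're embedded in a new element:
// <Copy>Some <Bold>rich</Bold> content</Copy>
XElement copy = new XElement("Copy", source.Nodes());
```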
Added: May 8, 2008 16:00 PST
Justin Etheredge Explains the Workings of LINQ's SelectMany Operator
In his The Linq SelectMany Operator post of May 8, 2008, Justin describes how the less commonly used SelectMany Standard Query Operator flattens sequences to avoid nested results.
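A tiny LINQ to Objects illustration of the flattening (invented data):

```csharp
var customers = new[]
{
    new { Name = "ALFKI", Orders = new[] { 10643, 10692 } },
    new { Name = "ANATR", Orders = new[] { 10308 } }
};

// Select yields one sequence per customer (a nested result):
// { { 10643, 10692 }, { 10308 } }
var nested = customers.Select(c => c.Orders);

// SelectMany flattens them into a single sequence:
// { 10643, 10692, 10308 }
var flattened = customers.SelectMany(c => c.Orders);
```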
Added: May 8, 2008 16:00 PST
Randolph Cabral Explores Creating Business Entities from Data Entities
Randolph's Exploring N-Tier Architecture with LINQ to SQL (Part 3 of n) episode of May 8, 2008 explores object composition as practiced by the Gang of Four (GoF) by creating "a business entity layer that abstracts the data entities away and gives us a nice place to add domain logic."
Added: May 8, 2008 16:00 PST
Download Northwind.sdf with Column and Constraint Name Fixes
The version of Northwind.sdf for SQL Server Compact Edition (SSCE) v3.5 installed in the \Program Files\Microsoft SQL Server Compact Edition\v3.5\Samples folder by Visual Studio 2008 has spaces in column and foreign-key constraint names. My uploaded version corrects this defect so Entity Framework and LINQ to SQL sample projects can use the same queries.
The details are in Upgraded Northwind.sdf File for SSCE v3.5 Available for Download of May 8, 2008.
Steve Godbold Analyzes the Number of Connections for LINQ to SQL Queries
In his LINQ to SQL and Connections post of May 8, 2008, Steve counts three openings and closings of the connection to a test database for a query that returns a sequence of 100 entities and another query that adds a row to the table.
No surprises here.
Jimmy Bogard Replaces Filter, Map, and Reduce Functions with LINQ Where, Select and Aggregate
Jimmy shows how to replace function programming's "big three higher-order functions," Filter, Map, and Reduce with LINQ's Where(), Select() and Aggregate() Standard Query Operators (SQOs) in his Enhancing mappers with LINQ post of May 8, 2008.
Note: Map and Reduce are, as you'd expect, the functions behind Google's and others' MapReduce frameworks.
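The mapping between the functional "big three" and the SQOs is direct; a minimal sketch:

```csharp
int[] numbers = { 1, 2, 3, 4, 5 };

// Filter -> Where: keep only the elements that satisfy a predicate.
var evens = numbers.Where(n => n % 2 == 0);           // 2, 4

// Map -> Select: transform each element.
var squares = numbers.Select(n => n * n);             // 1, 4, 9, 16, 25

// Reduce -> Aggregate: fold the sequence into a single value.
int sum = numbers.Aggregate(0, (acc, n) => acc + n);  // 15
```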
Rob Conery Closes in on the MVC Storefront with Data Updates from the Shopping Cart
Rob says in his MVC Storefront, Part 9: The Shopping Cart post of May 8, 2008:
In this episode I dive into implementing the Shopping Cart in a basic way so I can run a spike to make sure my pattern can push data nicely back into the DB without problems.
He also provides brief descriptions of and links to the previous eight episodes.
Bill Horst Continues His Converting SQL to LINQ in Visual Basic Series
Part 10 is Converting SQL to LINQ, Part 10: Like (Bill Horst) of May 7, 2008. The Like operator is a keyword in VB.
The LINQ version is identical to the keyword's, but the SQL Server LIKE operator doesn't support all VB options, such as # to match any single digit.
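For C# readers, who have no Like keyword, the usual LINQ to SQL substitutes are the String methods, or SqlMethods.Like (System.Data.Linq.SqlClient) for a raw pattern. This sketch assumes a Northwind-style DataContext named ctx:

```csharp
// String methods translate to T-SQL LIKE patterns:
var q1 = ctx.Customers
    .Where(c => c.CompanyName.StartsWith("Al"));   // LIKE 'Al%'
var q2 = ctx.Customers
    .Where(c => c.CompanyName.Contains("mark"));   // LIKE '%mark%'

// SqlMethods.Like passes a LIKE pattern through directly. T-SQL has
// no VB-style # single-digit wildcard, but a [0-9] range works:
var q3 = ctx.Customers
    .Where(c => SqlMethods.Like(c.PostalCode, "0[0-9]___"));
```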
Julie Lerman Discusses Delete Stored Procedures' Need for Foreign-Key Parameters
The Entity Framework is very picky about the use of stored procedures for create, update, and delete (CUD) operations. Julie outlines these requirements in her Delete Stored Procs and Navigations in the Entity Data Model post of May 7, 2008 with emphasis on the need to include foreign-key parameters for associations in DELETE stored procedures.
Ryan Dunn Emphasizes Reducing Capital and Operating Expenditures with SQL Server Data Services
In The Business Value of SQL Server Data Services of May 7, 2008, Ryan says:
One of the greatest business values (as opposed to technical values) you get from SSDS (and cloud services in general) is the ability to redeploy the capital to other resources. In the case of CapEx, you can deploy this perhaps to marketing or content creation. In the case of OpEx, those expenses can be redeployed to more useful tasks in the enterprise (creating new systems, upgrading, and expanding operations perhaps?).
I think SQL Server Data Services will need several rounds of feature additions before it's ready to replace most on-premises SQL Server 2005+ installations.
FarPoint Adds LINQ Support to its Spread[sheet] Control for Windows and Web Forms
In a brief FarPoint Spread and LINQ post of May 7, 2008, the FarPoint blog says:
Spread for Windows Forms supports LINQ data sources and can operate in applications that make use of LINQ extensions. You can use LINQ in application development to do queries, use the built-in type checking and IntelliSense it affords, and with Spread for Windows Forms, bind to results retrieved from a LINQ query.
As for Spread for ASP.NET, the .NET framework provides a LinqDataSource control. And, it can be set as the DataSource for Spread for ASP.NET.
Pablo Castro Comments on Entity Framework's Database Independence
His Provider availability enables Entity Framework and Data Services over many database vendors post of May 7, 2008 says, inter alia:
I've been looking at all the buzz about adoption of the ADO.NET Entity Framework provider model and how it enables access to many databases. This is a big deal... we kind of slowed down on database independence with previous versions of ADO.NET. This new round restores the database independent capabilities.
Matt Warren mentions the opposite approach taken for LINQ to SQL in his Mocks Nix - An Extensible LINQ to SQL DataContext post of May 4, 2008:
LINQ to SQL was actually designed to be host to more types of back-ends than just SQL server. It had a provider model targeted for RTM, but was disabled before the release. Don’t ask me why. Be satisfied to know that i[t] was not a technical reason. Internally, it still behaves that way.
Obviously, it was a marketing reason. I'm glad the EF folks took a different approach.
John Papa Promises a Silverlight 2.0 Demo with LINQ to SQL and WCF
John's been concentrating on Entity Framework for quite a while, but says in his Silverlight, WCF and LINQ to SQL post of May 7, 2008:
I've had some requests lately to write some articles that work with LINQ to SQL. I've be[e]n focusing on the Entity Framework a lot due to its vast nature and th[e] fact that it[']s still relatively unknown to many people. So I am going to work on a LINQ to SQL project that ties into a multi tier model. I am also going to show how this can be hooked into a Silverlight 2 UI that connects through WCF to the lower layers.
David Sceppa Refreshes the Entity Framework's Third-Party Data Providers List for Beta 3
Dave's Recent ADO.NET Entity Framework provider news - Demos and downloads post of May 6 lists the status of third-party data providers that support Entity Framework Beta 3:
- Core Lab (Oracle, MySQL, PostgreSQL and SQLite)
- IBM (DB2, IDS and U2)
- MySQL (MySQL)
- Npgsql (PostgreSQL)
Oracle and DataDirect are conspicuous by their absence.
Avi Wortzell Shows You How to Add ADO.NET 2.0's SqlBulkCopy Feature to LINQ to SQL
In Implementing SqlBulkCopy in Linq to Sql of May 6, 2008, Avi shows you how to write a LinqBulkCopyReader as a generic class which inherits from the SqlBulkCopyReader class that lets you do bulk inserts with LINQ to SQL.
Roy Osherove Implements LINQ to Regex
Roy takes Josh Flanagan's Readable Fluent Regex Api and wraps it with LINQ in his Introducing LINQ To Regex post of May 6, 2008. Fluent Regex syntax, which Josh designed for readability, resembles LINQ's composable method syntax, which simplified the process. The source code and binaries are available here.
Very handy, especially if you're not a Regex guru.
Posted by Roger Jennings (--rj) at 9:19 AM 0 comments
Labels: ADO.NET Data Services, Astoria, EDM Designer, Entity Data Model, Entity Data Model Designer, Entity Framework Beta 3, Entity SQL, LINQ to Entities, LINQ to Entity Base, LINQ to SQL, SQL Server Compact Edition, SQL Server Data Services, SSCE, SSDS, Stored Procedures, T-SQL