Saturday, July 31, 2004

TechNet Scripts: Listing Properties and Methods of Win32 Classes

TechNet Scripts: Listing Properties and Methods of Win32 Classes: "Listing the Properties and Methods of the Win32 Classes

Description
Returns the properties and methods for all the WMI Win32 classes (for example, Win32_Service, Win32_Process, Win32_NTEventLog, etc.).
Script Code
strComputer = "."
Set objWMIService = GetObject("winmgmts:{impersonationLevel=impersonate}!\\" _
    & strComputer & "\root\cimv2")
For Each objClass in objWMIService.SubclassesOf()
    intCounter = 0
    If Left(objClass.Path_.Class, 5) = "Win32" Then
        ' Skip association classes; only report ordinary Win32_* classes.
        For Each Qualifier in objClass.Qualifiers_
            If UCase(Trim(Qualifier.Name)) = "ASSOCIATION" Then
                intCounter = 1
            End If
        Next
        If intCounter = 0 Then
            Set strClass = objWMIService.Get(objClass.Path_.Class)
            Wscript.Echo "PROPERTIES:"
            For Each strItem in strClass.Properties_
                Wscript.Echo objClass.Path_.Class & vbTab & strItem.Name
            Next
            Wscript.Echo "METHODS:"
            For Each strItem in strClass.Methods_
                Wscript.Echo objClass.Path_.Class & vbTab & strItem.Name
            Next
        End If
    End If
Next"

TechNet Scripts: Determining Local Time on a Computer

TechNet Scripts: Determining Local Time on a Computer: "Determining Local Time on a Computer

Description
Reports the local time on a computer.
For more information about the Win32_LocalTime class used in this script, see the class documentation on MSDN.
Supported Platforms
Windows XP: Yes
Windows Server 2003: Yes
Windows 2000: No
Windows NT 4.0: No
Windows 98: No

Script Code
strComputer = "."
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set colItems = objWMIService.ExecQuery("Select * from Win32_LocalTime")
For Each objItem in colItems
    Wscript.Echo "Day: " & objItem.Day
    Wscript.Echo "Day Of the Week: " & objItem.DayOfWeek
    Wscript.Echo "Hour: " & objItem.Hour
    Wscript.Echo "Milliseconds: " & objItem.Milliseconds
    Wscript.Echo "Minute: " & objItem.Minute
    Wscript.Echo "Month: " & objItem.Month
    Wscript.Echo "Quarter: " & objItem.Quarter
    Wscript.Echo "Second: " & objItem.Second
    Wscript.Echo "Week In the Month: " & objItem.WeekInMonth
    Wscript.Echo "Year: " & objItem.Year
Next"

TechNet Scripts: Schedule a Task

TechNet Scripts: Schedule a Task: "Schedule a Task

Description
Schedules Notepad to run at 12:30 PM every Monday, Wednesday, and Friday.
Script Code
strComputer = "."
Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
Set objNewJob = objWMIService.Get("Win32_ScheduledJob")
' Start time is 12:30 P.M. (UTC offset -420 minutes); 1 OR 4 OR 16 selects
' Monday, Wednesday, and Friday; True makes the job repeat every week.
errJobCreated = objNewJob.Create _
    ("Notepad.exe", "********123000.000000-420", _
    True, 1 OR 4 OR 16, , , JobID)
Wscript.Echo errJobCreated"

TechNet Scripts: Check Registry Key Access Rights

TechNet Scripts: Check Registry Key Access Rights: "Check Registry Key Access Rights

Description
Uses WMI to check access rights for the logged on user to HKLM\SYSTEM\CurrentControlSet.
Script Code
const KEY_QUERY_VALUE = &H0001
const KEY_SET_VALUE = &H0002
const KEY_CREATE_SUB_KEY = &H0004
const DELETE = &H00010000
const HKEY_LOCAL_MACHINE = &H80000002

strComputer = "."
Set StdOut = WScript.StdOut
Set oReg = GetObject("winmgmts:{impersonationLevel=impersonate}!\\" & _
    strComputer & "\root\default:StdRegProv")
strKeyPath = "SYSTEM\CurrentControlSet"

oReg.CheckAccess HKEY_LOCAL_MACHINE, strKeyPath, KEY_QUERY_VALUE, bHasAccessRight
If bHasAccessRight = True Then
    StdOut.WriteLine "Have Query Value Access Rights on Key"
Else
    StdOut.WriteLine "Do Not Have Query Value Access Rights on Key"
End If

oReg.CheckAccess HKEY_LOCAL_MACHINE, strKeyPath, KEY_SET_VALUE, bHasAccessRight
If bHasAccessRight = True Then
    StdOut.WriteLine "Have Set Value Access Rights on Key"
Else
    StdOut.WriteLine "Do Not Have Set Value Access Rights on Key"
End If

oReg.CheckAccess HKEY_LOCAL_MACHINE, strKeyPath, KEY_CREATE_SUB_KEY, bHasAccessRight
If bHasAccessRight = True Then
    StdOut.WriteLine "Have Create SubKey Access Rights on Key"
Else
    StdOut.WriteLine "Do Not Have Create SubKey Access Rights on Key"
End If

oReg.CheckAccess HKEY_LOCAL_MACHINE, strKeyPath, DELETE, bHasAccessRight
If bHasAccessRight = True Then
    StdOut.WriteLine "Have Delete Access Rights on Key"
Else
    StdOut.WriteLine "Do Not Have Delete Access Rights on Key"
End If"

TechNet Scripts: Add a Support URL to an Event Log Entry

TechNet Scripts: Add a Support URL to an Event Log Entry: "Add a Support URL to an Event Log Entry

Description
Writes an event to the Application event log that includes a support URL. Requires Windows XP or Windows Server 2003.
Script Code
Const EVENT_FAILED = 1
Set objShell = Wscript.CreateObject("Wscript.Shell")
objShell.LogEvent EVENT_FAILED, _
    "Payroll application could not be installed. " _
    & "Additional information is available from http://www.fabrikam.com.""

Friday, July 30, 2004

Adding a Buffer Around a MapPoint Web Service Map

When a user requests a map containing a certain set of points, you can use one of four classes to return a map view: ViewByBoundingLocations, ViewByHeightWidth, ViewByScale, or ViewByBoundingRectangle. Only one of these classes, ViewByBoundingLocations, provides a buffer. This buffer is meant to prevent the icon images from being truncated and from appearing on the edges of the map. The default buffer, which is fixed, can usually prevent icons from being truncated. However, in certain situations, the default buffer may not work.

This article describes how to use the ViewByBoundingRectangle class to add a buffer around MapPoint Web Service maps, ensuring that all map icons are shown in their entirety.

Thursday, July 29, 2004

Longhorn Developer Center Home: Understanding WinFS by Exploring the WinFS Type System

Longhorn Developer Center Home: Understanding WinFS by Exploring the WinFS Type System: "WinFS is a storage platform capable of multi-master synchronization, designed to run on hundreds of millions of computers, to be accessible, and to support international users. If this were not enough, WinFS bridges the worlds of file, relational, and XML data. This is about as far from a plain 'Hello World!' program as you can get.
One way to begin to understand a storage environment of this level of sophistication is to deconstruct the software architecture and explore its essential parts. One of the most fundamental parts of a storage system is its underlying data model. A data model is a complete system for describing how data can be represented and how it should be stored.
Programmers primarily interact with the WinFS data model by using the WinFS type system, rather than interacting directly with the data model. A type system, especially a type system like the .NET Common Type System (CTS), is essentially a model that defines the rules the runtime follows when creating, using, and managing types. The good news is that the WinFS data model and the WinFS type system have a lot in common; so much in common, in fact, that you can safely assume that they largely mirror each other.
This column explores the type system used when programming the WinFS platform. Programmers who program in .NET are already familiar with type systems; after all, the common language specification defines the CTS. It is impossible to write even a 'Hello World!' program in .NET without using at least one type. However, even in simple code, type systems are important. Moving beyond the world of developers, type systems are important to users who understand their data using types that developers do not normally work with, such as Word documents, PowerPoint slides, and e-mail. WinFS introduces a type system that allows the developer to model more complex data while hiding that complexity from the end user."

.NET Architecture Center: Microsoft Patterns: Describing the Enterprise Architectural Space

.NET Architecture Center: Microsoft Patterns: Describing the Enterprise Architectural Space: "This document presents an organizing table that describes the enterprise architectural space, shows relationships among artifacts in the space, and demonstrates how different roles in your enterprise view enterprise architecture. This document also demonstrates how pattern authors can use the organizing table to organize existing patterns and to identify areas where patterns are not currently documented."

Data Access in ASP.NET 2.0

ASP.NET 2.0 reduces the amount of code required to perform common data access tasks by adding a number of data-enabled controls. This article shows these new controls, and how you can use them in your applications.

Data access has always been a crucial aspect of Web application development. Data-driven Web pages are essential for almost every business application. Because data access is so prevalent, it makes little sense for developers to spend time continually re-generating complex code for simple database tasks. Developers need to be able to rapidly access data from a variety of sources in a variety of formats. Fortunately, ADO.NET 2.0 and the new data access controls in ASP.NET 2.0 help solve this problem.

For traditional ASP and ASP.NET 1.1 applications, the developer had to create code to access and update a database, as well as to format the retrieved data as browser-friendly HTML. Although Visual Studio .NET had a wizard to help with this task, advanced features such as paging and sorting still required intricate synchronization between the back-end code and the front-end display. Quite often, this code was difficult to maintain and synchronize, especially if the database changed or additional data had to be displayed on the page. In addition, the proliferation of XML as a data store required extensive lines of plumbing code intermixed with data access logic.

To increase developer productivity and the performance of Web applications, ASP.NET 2.0 reduces the code needed to access and display data, by encapsulating functionality in new data controls that give you more control and flexibility with your data. These controls can be linked to a wide variety of data sources, ranging from traditional databases to XML data stores. All of the data sources are treated in a similar fashion, greatly reducing the complexity of developing data-driven applications. The ASP.NET 2.0 internal support for these features required extensive architectural enhancements. The new data source objects incorporate a number of industry accepted best practices to augment a very robust foundation. The data access tools available in ASP.NET 2.0 can now be leveraged by the most complex applications. Binding and caching issues that limited ASP.NET 1.x implementations have been resolved in ASP.NET 2.0, both architecturally and mechanically.

Sunday, July 25, 2004

Split the Name field into First and Last name - Cold Fusion, Allaire

Ever want to take the name field in your table and turn it into first_name and last_name fields? Read on... This script alters a database table to add two new fields, first_name and last_name. It then takes the data in the "Name" field, rips out the first and last names, and inserts them into the new fields.

Inside Encrypting File System, Part 1

Securing a computer system entails employing measures that protect the computer's data from viewing or manipulation by unauthorized users. Security measures at the network interface prevent intruders from gaining entry to the computer, and file-system security prevents the computer's authorized users from accessing data they're not supposed to access. However, a computer that is isolated from the Internet behind a firewall and that has stringent file-system security policies in place remains unsecured if no strategy exists to guard the computer's physical security. If unauthorized users have physical access to a computer, they can remove the computer's hard disks and perform offline analysis of the disks' data. When users can view a hard disk's contents on a different computer, file-system security (e.g., the kind NTFS ACLs provide on Windows NT or Windows 2000— Win2K—systems) is of no value. This problem is especially acute for laptop computers because two NTFS file-system drivers that ignore NTFS security—NTFSDOS and an NTFS driver for Linux—let even casual thieves easily view NTFS files.

One way to address the physical security problem is to keep computers in locked rooms, but this solution is obviously not practical for laptop computers, whose main purpose is portability. Thus, to prevent access to file data in situations in which bypassing file-system security is a possibility, data encryption is necessary. Before Win2K, NT users have had to turn to third-party vendors for encryption solutions, but in Win2K a built-in encryption facility for NTFS files exists in the form of Encrypting File System (EFS). By building encryption into the OS, Microsoft can make the encryption and decryption process transparent to both applications and users.

Unfortunately, Microsoft has produced little documentation describing how EFS works. Because many people will undoubtedly rely on EFS to secure their sensitive data, having a solid understanding of what goes on under the hood is important. In this two-part series about EFS, I'll take you beneath the surface and let you know exactly how EFS works with NTFS and Win2K cryptography facilities to help you keep your data safe from prying eyes. This month, I provide an overview of EFS and begin walking you through the process by which EFS encrypts files. Next month, I'll finish the encryption walk-through, describe the decryption process, and introduce the data recovery mechanism EFS has built into the decryption process.

(.NET - VB.NET) How Do I...Read XML from a file in VB.NET

This sample illustrates how to read XML from a file using the XmlTextReader class. This class provides direct parsing and tokenizing of XML, and implements the W3C Extensible Markup Language (XML) 1.0 and the Namespaces in XML specifications.

The XmlReader class is the API that provides XML parsing; XmlTextReader is the implementation designed to handle byte streams.

Typically, you use the XmlTextReader if you need to access the XML as raw data without the overhead of a DOM. Avoiding the DOM makes reading the XML faster. For example, an XML document could have a header section used for routing the document for processing elsewhere. The XmlTextReader has different constructors to specify the location of the XML data. This sample loads XML from the books.xml file, as shown in the following code.

Dim reader As XmlTextReader = New XmlTextReader("books.xml")

Once loaded, the XmlTextReader moves across the XML data by using the Read method, sequentially retrieving the next node from the document. The Read method returns False when there are no more nodes to read.
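
To make the forward-only Read model concrete, here is a small self-contained VB.NET sketch of such a loop (my own illustration, not the article's sample); it walks books.xml and echoes the element names and text it finds.

Imports System
Imports System.Xml

Module ReadXmlDemo
    Sub Main()
        ' The reader pulls nodes forward-only, one Read call at a time.
        Dim reader As New XmlTextReader("books.xml")

        While reader.Read()
            Select Case reader.NodeType
                Case XmlNodeType.Element
                    Console.WriteLine("Element: " & reader.Name)
                Case XmlNodeType.Text
                    Console.WriteLine("  Text: " & reader.Value)
            End Select
        End While

        reader.Close()
    End Sub
End Module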

........

Friday, July 23, 2004

Download details: Lookout V1.2

Lookout
Steve Makofsky noticed that Lookout is available from the Microsoft Download Center. If you're an Outlook user, run, don't walk to get this thing. It has saved me at least an hour a day since I started using it. Here's some info and a link:
Lookout
Lookout is lightning-fast search for your email, files, and desktop integrated with Microsoft Outlook. Built on top of a powerful search engine, Lookout is the only personal search engine that can search all of your email from directly within Outlook - in seconds...

posted @ 7/23/2004 2:55 AM by Brian Johnson

Lookout download on Microsoft.com
Microsoft's acquisition of Lookout caused a bit of a stir in the community - in particular, because they pulled the product download off of the web site with no real explanation of the product's future or whether it would ever resurface (while allowing existing users to continue using the product).

Guess what just showed up on the Microsoft Download site. You guessed it.

Now, Microsoft has a history of posting stuff on the download site and then quickly removing it. So who can say how long it will stick around. But it's there for the moment.

People have also sleuthed out the (unadvertised) download URL on the Lookout site - presumably there for the Lookout auto-upgrade process. Its appearance on the Microsoft site, however, is potentially far more interesting.

posted @ 7/22/2004 6:33 PM by Kevin Dente

Macromedia - Developer Center : Caching Queries to Disk or to Memory with ColdFusion

As most ColdFusion developers know, Macromedia ColdFusion provides a built-in mechanism for caching queries. Adding a simple cachedWithin attribute to any cfquery tag in ColdFusion lets you cache a query in memory for a specified time.

If you have ever cached a query using this simple built-in functionality, you are certainly familiar with the challenges and limitations that result from implementing query caching with this method. This type of query caching uses a ColdFusion Administrator setting that limits the maximum number of queries that can exist in the cache at any given time. Additionally, clearing cached queries with this method is a much more difficult task than you would expect. In this article, I explain a custom tag approach for caching queries to memory or to disk, providing more flexibility than built-in ColdFusion methods.

Before continuing any further, note that there is no substitute for solid database design when it comes to query performance. Oftentimes you can improve query performance through simple revisions to your SQL statement, or by taking advantage of the power of your RDBMS and adding database enhancements such as indexes on queried columns. Additionally, the alternative method of caching queries to disk described in this article should not be considered a “best practice,” not to mention the fact that you can add memory cheaply to servers based on current RAM prices.

Consider my method of caching queries to disk (one of the options with my custom tag) as a workaround for certain scenarios you might be facing as a developer. I originally developed the custom tag in this article for a client of mine where, at the time, I did not have the option of altering the database design to improve query performance. Additionally, many of the queries where I applied this custom tag involved complex SQL logic including aggregate functions and a high number of nested inner/left joins across a number of tables. While the resulting final record sets were small to medium size, the query execution was unbearably slow and only got worse with heavier server load. In these cases, I was able to apply this custom tag and drastically improve upon the original execution time of the query. In other cases, however, I found that this custom tag actually executed slower than the original query because the WDDX packets written to disk were so large—as a result of very large record sets or even queries with many large columns. As a developer, always evaluate query performance on a case-by-case basis before and after applying this caching solution.

Macromedia - Developer Center: Video Tutorial Series: Building Your First Database Application with ColdFusion

In this six-part series, you'll learn how to build a database application with ColdFusion and Dreamweaver. To watch the tutorials before following along with the steps, download Macromedia Flash Player (in the Requirements section below). Otherwise, download and install all the software in the Requirements section so that you can watch and follow the steps within the tutorials.

Macromedia - ColdFusion MX : Feature Tour

This is a multimedia feature tour of Macromedia ColdFusion MX 6.1. It clearly explains the different new features of the product.

Thursday, July 22, 2004

Document Object Model (DOM) Level 3 Core Specification

Abstract
This specification defines the Document Object Model Core Level 3, a platform- and language-neutral interface that allows programs and scripts to dynamically access and update the content, structure and style of documents. The Document Object Model Core Level 3 builds on the Document Object Model Core Level 2 [DOM Level 2 Core].

This version enhances DOM Level 2 Core by completing the mapping between DOM and the XML Information Set [XML Information Set], including support for XML Base [XML Base], adding the ability to attach user information to DOM Nodes or to bootstrap a DOM implementation, providing mechanisms to resolve namespace prefixes or to manipulate "ID" attributes, giving access to type information, etc.

Status of this document
This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

This document contains the Document Object Model Level 3 Core specification and is a W3C Recommendation. It has been produced as part of the W3C DOM Activity. The authors of this document are the DOM Working Group participants. For more information about DOM, readers can also refer to DOM FAQ and DOM Conformance Test Suites.

It is based on the feedback received during the Proposed Recommendation period. Changes since the Proposed Recommendation version and an implementation report are available. Please refer to the errata for this document, which may include some normative corrections.

Comments on this document should be sent to the public mailing list www-dom@w3.org (public archive).

This is a stable document and has been endorsed by the W3C Membership and the participants of the DOM working group. The English version of this specification is the only normative version. See also translations.

Patent disclosures relevant to this specification may be found on the Working Group's patent disclosure page. This document has been produced under the 24 January 2002 CPP as amended by the W3C Patent Policy Transition Procedure. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) with respect to this specification should disclose the information in accordance with section 6 of the W3C Patent Policy.

Latest from W3.org on XML DOM

Document Object Model (DOM) Level 1 Specification

This is the URL for the official Document Object Model (DOM) Level 1 Specification at w3.org.

This is the 1998 release.

Tuesday, July 20, 2004

Going Virtual II: Remote Debugging

Learn how to set up remote debugging when you are running the Windows XP Embedded image inside Virtual PC.

Last month we took a look at booting and running Windows XP Embedded in Microsoft Virtual PC. There are a number of advantages to booting and running Windows XP Embedded in the Virtual PC environment. First, you don't need additional hardware. Second, you can build and test your Windows XP Embedded image on your development PC without needing to dual-boot your development PC. The other really neat thing about running the Windows XP Embedded image inside Virtual PC is the ability to remote debug applications on your development PC, which is going to be the focus for this month's article.

Why would you need to debug an application running on Windows XP Embedded? Since XPE is a componentized version of Windows XP Professional there should be binary compatibility for the applications you are already running on the desktop, right? The answer is "yes," your applications 'should' run, unmodified, on Windows XP Embedded. This assumes that you have included all the required operating system components, and have added all the support files your application needs (DLLs, fonts, bitmaps, audio files, movie files, data files, etc.).

For this month's article we're going to write a fully featured Win32 application (a typical Hello World application) and build and deploy this application to a Windows XP Embedded operating system image, then debug the application.

The steps outlined in this article are of course applicable to a Windows XP Embedded image running on "real" hardware. Time to buckle up, let's get cracking with writing and debugging an application.

Monday, July 19, 2004

(Visual Basic) Data Encryption and Decryption

As part of working on our application, I was searching for some encryption tools, and in the process I landed here at MSDN again: the built-in Cryptography feature.

Excerpt:

Data Encryption and Decryption

Encryption is the process of translating plain text data (plaintext) into something that appears to be random and meaningless (ciphertext). Decryption is the process of converting ciphertext back to plaintext.


To encrypt more than a small amount of data, symmetric encryption is used. The symmetric key or session key is used during both the encryption and decryption processes. To decrypt a particular piece of ciphertext, the key that was used to encrypt the data must be used. Essentially, a session key consists of a random number, from 40 to 2,000 bits in length. The longer the key, the more difficult it is to decrypt a piece of ciphertext without possessing the key.

The goal of every encryption algorithm is to make it as difficult as possible to decrypt the generated ciphertext without using the key. If a really good encryption algorithm is used, there is no technique significantly better than methodically trying every possible key. Even a key size of just 40 bits works out to just over one trillion possible keys.

It is difficult to determine the quality of an encryption algorithm. Algorithms that look promising sometimes turn out to be very easy to break, given the proper attack. When selecting an encryption algorithm, it is often a good idea to choose one that has been around for a while, and has successfully resisted all attacks.

For more information, see Data Encryption/Decryption Functions.
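
The excerpt points to the Win32 CryptoAPI functions; as a rough illustration of the same symmetric-key idea in VB.NET, here is a minimal sketch using the Framework's RijndaelManaged class (the sample string and variable names are mine, not Microsoft's):

Imports System.IO
Imports System.Security.Cryptography
Imports System.Text

Module SymmetricDemo
    Sub Main()
        ' A symmetric (session) key: the same key encrypts and decrypts.
        Dim algorithm As New RijndaelManaged()
        algorithm.GenerateKey()
        algorithm.GenerateIV()

        Dim plaintext As Byte() = Encoding.UTF8.GetBytes("Secret payroll data")

        ' Encrypt the plaintext into ciphertext.
        Dim encStream As New MemoryStream()
        Dim encCrypto As New CryptoStream(encStream, algorithm.CreateEncryptor(), CryptoStreamMode.Write)
        encCrypto.Write(plaintext, 0, plaintext.Length)
        encCrypto.FlushFinalBlock()
        Dim ciphertext As Byte() = encStream.ToArray()

        ' Decrypt the ciphertext with the same key and IV.
        Dim decStream As New MemoryStream()
        Dim decCrypto As New CryptoStream(decStream, algorithm.CreateDecryptor(), CryptoStreamMode.Write)
        decCrypto.Write(ciphertext, 0, ciphertext.Length)
        decCrypto.FlushFinalBlock()

        Console.WriteLine(Encoding.UTF8.GetString(decStream.ToArray()))
    End Sub
End Module

Without the key and IV held in the algorithm object, the ciphertext bytes are just noise, which is exactly the property the excerpt describes.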

(.NET - XML) An Introduction to the XML Tools in Visual Studio 2005

This article is an introduction to the XML Editor and the XSLT Debugger in Visual Studio 2005 (formerly known by the codename "Whidbey").

Introduction
With the wider adoption of XML, XSLT, XSD schemas, and related technologies, developers now touch XML at many points in their applications. This mainstreaming of XML requires that developers be supported with better development tools. Visual Studio 2005 significantly improves the XML editing and XSLT debugging experiences.

The XML editor includes the following functionality:

Design time well-formedness and validation errors.
Validation support for Schema, DTD, and XDR.
Inferring an XSD Schema from an XML instance.
Converting a DTD or XDR to XSD Schema.
Context-sensitive Intellisense.
XSLT editing, viewing the results of the transform.
Standard Visual Studio code-editing, such as outlining and commenting or un-commenting.
The XSLT debugger comprises the following functionality:

Invoking the debugger from the XML Editor.
The ability to set and remove breakpoints.
Standard Visual Studio debugger function key and menu bindings (F9 for setting/removing breakpoints, F11 for "Step In," and so on).
Viewing the output of the transform as it is being generated.
Locals, Watch, and Call stack windows.
Stepping into the XSLT from a C# (or any other CLR language) program.
The remainder of this article will explain the above listed features in more detail, and will also be a mini-tutorial for the XML Editor and the XSLT Debugger.

(.NET - VB.NET) A Sneak Preview of Visual Basic 2005

Provides an overview of the new features in Visual Basic 2005, including My, IntelliSense, Edit and Continue, AutoCorrect, Just My Code, Windows Forms enhancements, and more. (30 printed pages)

Introduction
The next release of Microsoft Visual Studio has been significantly improved for Visual Basic developers by adding innovative language constructs, new compiler features, dramatically enhanced productivity and an improved debugging experience. Visual Studio 2005 includes several productivity enhancements including IntelliSense code snippets, Windows Forms designer updates, IntelliSense filtering, debugger data tips, Exception Assistant, and more. In language innovations, the 2005 release of Visual Basic includes generics, unsigned types, operator overloading, and many other additions. This document samples some of the new capabilities available in the 2005 version of Visual Basic.

Productivity Enhancements
Visual Studio 2005 and Visual Basic 2005 add many features and tools that make your development experience more productive. The following sections outline just a few.

My
Imagine being able to immediately find the functionality you need within the huge set of classes available as part of the .NET Framework. Imagine using a single reference to accomplish goals that would otherwise require many lines of code. Imagine being more productive than you could ever have been in previous versions of Visual Basic or in Visual Basic 6. These goals, and more, have been met with the addition of My to Visual Basic 2005.
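
As a small, hypothetical taste of what the article describes, here is a VB 2005 sketch (the file path is a placeholder of my own, not from the article):

Module MyDemo
    Sub Main()
        ' One readable line replaces several System.IO calls.
        Dim contents As String = My.Computer.FileSystem.ReadAllText("C:\logs\today.txt")

        ' Machine and user information without digging through the Framework classes.
        Console.WriteLine("Running on " & My.Computer.Name & " as " & My.User.Name)
        Console.WriteLine("Read " & contents.Length & " characters.")
    End Sub
End Module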

(.NET - XML) The XML Diff and Patch GUI Tool .NET Framework 1.1

This article shows how to use the XmlDiff class to compare two XML files and show these differences as an HTML document in a .NET Framework 1.1 application. The article also shows how to build a WinForms application for comparing XML files.

Introduction
There is no good command-line tool that can be used to compare two XML files and view the differences. There is an online tool called XML Diff and Patch that's available on the GotDotNet website under the XML Tools section. For those who have not seen it, you can find it at Microsoft XML Diff and Patch 1.0. It is a very convenient tool for those who want to compare the differences between two XML files. Comparing XML files is different from comparing regular text files because one wants to compare logical differences in the XML nodes, not just differences in text. For example, one may want to compare XML documents and ignore white space between elements, comments, or processing instructions. The XML Diff and Patch tool allows one to perform such comparisons, but it is primarily available as an online web application. We cannot take this tool and use it from the command line.

This article focuses on developing a graphical Windows Forms tool by reusing code from the XML Diff and Patch installation and samples. The tool works much like the WinDiff utility; it presents the differences in a separate window and highlights them.

The XML Diff and Patch tool includes a library that contains an XmlDiff class, which can be used to compare two XML documents. The Compare method on this class takes two files and either returns true, if the files are equal, or generates an output file called an XML diffgram containing a list of differences between the files. The XmlDiff class can be supplied an XmlDiffOptions value that sets the various options for comparing files.
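
As a hedged sketch of that comparison step, assuming the Microsoft.XmlDiffPatch assembly from the XML Diff and Patch download is referenced (the file names, option choices, and the exact Compare overload are illustrative assumptions, not taken from the article):

Imports System.Text
Imports System.Xml
Imports Microsoft.XmlDiffPatch

Module DiffDemo
    Sub Main()
        ' Ignore cosmetic differences such as white space and comments.
        Dim diff As New XmlDiff(XmlDiffOptions.IgnoreWhitespace Or XmlDiffOptions.IgnoreComments)

        ' Write the diffgram describing the differences to diffgram.xml.
        Dim writer As New XmlTextWriter("diffgram.xml", Encoding.UTF8)
        Dim identical As Boolean = diff.Compare("books1.xml", "books2.xml", False, writer)
        writer.Close()

        Console.WriteLine("Files are identical: " & identical.ToString())
    End Sub
End Module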

Wednesday, July 14, 2004

Longhorn Developer Center Home: The Avalon Control Content Model

Jeff Bogdan explores Avalon's control content model, a concept that can easily be missed by the developer on first encounter, but one that provides extreme flexibility when meeting the demands of richer scenarios.

There's an obvious property that UI developers expect to find on the controls they're using: Text. Win32 provided WM_GETTEXT and WM_SETTEXT messages for HWNDs, and most frameworks built on top of HWNDs exposed this functionality. Yet in Avalon, we intentionally left Text out of the Control API. This has raised enough questions to warrant an explanation of our motivations behind this design. Obviously, it goes much deeper than one property. In Avalon, we've used two principles to shape our control content model:


The developer should experience a smooth progression when moving from simpler to richer content.
Data is a first class citizen for control content.

Monday, July 12, 2004

(COM+) .NET Enterprise Services Performance

.NET Enterprise Services Performance
Richard Turner, Program Manager, XML Enterprise Services
Larry Buerk, Program Manager, XML Enterprise Services
Dave Driver, Software Design Engineer, XML Enterprise Services

Microsoft Corporation

March 2004

Applies to:
COM+ components
Microsoft .NET Enterprise Services

Summary: See the performance of native COM+ and .NET Enterprise Services components when applied to different activation and calling patterns. Get guidelines to make .NET Enterprise Services components execute just as quickly as C++ COM+ components, and get key recommendations to help you create high-performance .NET Enterprise Service components.

Introduction
Developers who consider moving their COM+ code from "native" Visual C++® or Visual Basic® 6 to managed .NET Enterprise Services components sometimes raise concerns such as:

Why should I switch to managed code?
How much change will be required to my code?
How will my Enterprise Services components perform?
What is the future roadmap for COM+ and .NET Enterprise Services?
This paper discusses the points above, and particularly focuses on the performance question. Resources listed in Appendix 4: Further Reading discuss these subjects in more detail.

This document is targeted at developers and architects who have developed COM+ components and are considering migrating their code to .NET Enterprise Services.

Web Development - About Element Behaviors

Element behaviors are one of the most significant new capabilities in Microsoft® Internet Explorer 5.5. They provide the capability to define custom elements, which can be used in the same way as normal HTML elements in a Web page. An element behavior can be written in script using an HTML Component (HTC) file, or it can be implemented with a binary Dynamic HTML (DHTML) behavior. Element behaviors are encapsulated components, so they can add new and interesting functionality to a Web page while improving the organization of content, functionality, and style.

DHTML behaviors were introduced in Internet Explorer 5 and made it possible to modify the behavior of standard HTML elements by setting the behavior attribute of a Cascading Style Sheets (CSS) entry or by using the addBehavior method in script. DHTML behaviors, in the form introduced in Internet Explorer 5, are now referred to as attached behaviors, to distinguish them from element behaviors—which use a different binding mechanism and have other unique characteristics.

Element behaviors can be used to implement anything from a simple rollover effect to a complex interactive component. A special processing instruction is used to import an element behavior into a Web page, where it is synchronously bound to a custom element. Once an element behavior has been downloaded and parsed, it exists as a first-class element in the document hierarchy and remains permanently bound to the custom element. Element behaviors differ significantly from attached behaviors in this respect. An attached behavior binds asynchronously to an element and modifies its behavior, and it can be attached or removed programmatically.

Element behaviors bring several enhancements to the Behavior Component Model in Internet Explorer 5.5. They provide new features that complement the existing capabilities of attached behaviors and provide additional robustness and reliability. However, attached behaviors are not superseded by element behaviors and remain useful in many scenarios.

A special feature of element behaviors, called viewlink, enables a document tree to be encapsulated in an HTC file, separate from the content of the main Web page. This opens up another realm of possibilities that are covered separately in the ViewLink Overview.

An element behavior can be implemented with an HTC file or a binary DHTML behavior, and the techniques used in each approach are quite different. This article is focused on writing element behaviors with HTC files.

Web Development - About Client Capabilities

Web developers are constantly searching for ways to design and implement sites that deliver the best user experience possible. One way to enhance user experience is to customize content based on capabilities that the client browser supports. For example, when a client-side script detects a low-bandwidth modem connection to the server, it may choose to request low-resolution images from the server to minimize bandwidth consumption.

Client capabilities consist of information about the browsing environment, such as screen resolution, screen dimensions, color depth, CPU, or connection speed. Microsoft® Internet Explorer 4.0 exposed client capabilities as properties through the Dynamic HTML (DHTML) Object Model. Internet Explorer 5 enhanced this further to include a means to install browser components on demand. Beginning with Internet Explorer 5, all this information was encapsulated into DHTML behaviors and made available as one of the browser's default behaviors.

By making this information available on the client, pages can be cached, server roundtrips minimized, server resources freed up as content generation shifts back to the client, and overall performance improved.

This article outlines the benefits introduced by a client-side solution and discusses the details involved in obtaining client capabilities information from the client. Realizing that a client-side solution is not suited for every Web site or Web application, a solution for server-side developers is also provided.

(.NET - ASP.NET) 317515 - HOW TO: Dynamically Create Controls in ASP.NET with Visual Basic .NET

Dynamically creating controls is one of the main features in ASP.NET. This is a KB article from Microsoft Support that explains the process of creating controls in ASP.NET at run time using VB.NET.

Friday, July 09, 2004

(.NET - ADO.NET) Asynchronous Command Execution in ADO.NET 2.0

Get an overview of the new asynchronous execution functionality in ADO.NET 2.0, the scenarios it was developed to enable, plus some of the issues to keep in mind when using this feature.

In the 2.0 release of ADO.NET, we not only wanted to make existing scenarios easier, but also enable new scenarios that either were just not possible before, or were far from ideal.

Asynchronous command execution is a good example of that. In releases of ADO.NET before 2.0, it wasn't possible to execute a command and not wait for it to complete before continuing execution. The addition of an asynchronous API enables scenarios where it is important for the application to continue execution without waiting for the database to complete the operation.

In this article, I'll cover the basics of the asynchronous database API and a couple of scenarios where this API is useful. Although we designed the API to work with any data access provider, SqlClient—the .NET data access provider for SQL Server—is the only one of the four providers included with .NET that actually supports it. Because of that, I'll use SqlClient throughout the rest of the article in the samples and descriptions of methods and classes. Bear in mind that third-party provider writers can implement this asynchronous API as well, so we may see more databases that can be accessed asynchronously. Simply change the class and method names accordingly to use these samples with other databases.
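
Since the article stays at a high level, here is a hedged VB.NET sketch of the polling pattern this API enables, using SqlClient's BeginExecuteReader and EndExecuteReader; the connection string, database, and query are placeholders, and note that the connection string has to request asynchronous processing:

Imports System.Data.SqlClient

Module AsyncDemo
    Sub Main()
        Dim connectionString As String = _
            "Server=.;Database=Northwind;Integrated Security=SSPI;Asynchronous Processing=true"
        Dim connection As New SqlConnection(connectionString)
        Dim command As New SqlCommand("SELECT CompanyName FROM Customers", connection)

        connection.Open()

        ' Start the query and keep working while SQL Server executes it.
        Dim result As IAsyncResult = command.BeginExecuteReader()
        While Not result.IsCompleted
            Console.WriteLine("Doing other work while the query runs...")
            System.Threading.Thread.Sleep(100)
        End While

        ' Harvest the results once the command has finished.
        Dim reader As SqlDataReader = command.EndExecuteReader(result)
        While reader.Read()
            Console.WriteLine(reader.GetString(0))
        End While

        reader.Close()
        connection.Close()
    End Sub
End Module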

(.NET - ADO.NET) ADO.NET 2.0 Feature Matrix

ADO.NET 2.0 includes a new base-class provider model, features for all providers, and changes to System.Data.SqlClient. Get an overview of these new features, examples of their use, and a chart of which features are provider-neutral and SqlClient-specific.

ADO.NET 2.0 comes with a plethora of new features. This includes a new base-class–based provider model and features that all providers can take advantage of, as well as changes that are specific to System.Data.SqlClient. Because the .NET Framework 2.0 is being released in conjunction with SQL Server 2005, some of these features require SQL Server 2005 to be usable. This article is meant to serve as an overview and roadmap of the new features, give examples of their use, and includes a chart of which features are provider-neutral and which are SqlClient-specific. In future articles in this series, I'll be going over some of the features in greater detail. In addition, there are many new features of the DataSet and friends; these will be covered in future articles.

(.NET - ASP.NET) Personalization with ASP.NET 2.0

Create personalized applications faster and build entirely new classes of applications with the new personalization features in ASP.NET 2.0.

The variety of business solutions that can now be serviced by Web based applications is continually increasing. With added functionality and faster network connectivity, the infrastructure of the Internet now supports increasing numbers and types of users. These users access the Internet for varying reasons through many different devices, making the traditional 'anonymous' single user Web architecture insufficient.

To keep pace with the new demands for more flexible, user-oriented Web applications, Microsoft ASP.NET 2.0 includes an extensive personalization framework. The ASP.NET 2.0 personalization features include new mechanisms for identifying and registering users, tailoring a Website to a specific user, and storing user information automatically and transparently. Web Parts empower the user to include the information they deem most valuable. Through layout control and inclusion of relevant tools, users can create a site that streamlines their experience. These customizations can be persisted and made available to those users on subsequent visits to the site.

This white paper provides a technical overview of new features, including Web Parts, authentication controls, and personalization providers. If you are a beginning or mid-level developer, or just interested in the new personalization features of ASP.NET 2.0, you will benefit from the code examples and feature descriptions in this white paper.

(.NET - ASP.NET) More and Less: How ASP.NET 2.0 Features Compare Against the Starter Kit Reference Applications

The ASP.NET Starter Kits are five sample ASP.NET applications that provide code to accomplish common Web development tasks. This article looks at how the new features in ASP.NET 2.0 can be used to simplify the Starter Kits.

The ASP.NET Starter Kits are a collection of five sample applications that show Web developers how to take advantage of ASP.NET 1.x features. The Starter Kits include reference implementations for important features such as interactive HTML reports; skinning and themes; authentication and authorization; mobile Web browser support; and much more. An hour spent exploring the source code from a starter kit can save many hours of programming and debugging.

In ASP.NET 2.0, Microsoft has introduced a significant number of enhancements and new features to greatly improve developer productivity and dramatically reduce the lines of code that a Web developer must generate. A number of ASP.NET 2.0 features actually make several key features of the starter kits trivial to implement. This article introduces some of the most prominent new features in ASP.NET 2.0, and contrasts the simplicity of using the ASP.NET 2.0 features against the amount of code required to implement similar features in the ASP.NET Starter Kits.

(.NET - ASP.NET) Improved Caching in ASP.NET 2.0

Stephen Walther looks at the new caching features included in ASP.NET 2.0, and how you can use them to improve the performance and scalability of your ASP.NET applications.

The most dramatic way to improve the performance of a database driven Web application is through caching. Retrieving data from a database is one of the slowest operations that you can perform in a Web site. If, however, you can cache the database data in memory, then you can avoid accessing the database with every page request, and dramatically increase the performance of your application.

The one and only drawback to caching is the problem of stale data. If you cache the contents of a database table in memory, and the records in the underlying database table change, then your Web application will display old, inaccurate data. For certain types of data you might not care if the data being displayed is slightly out of date, but for other types of data—such as stock prices and auction bids—displaying data that is even slightly stale is unacceptable.

The initial release of the Microsoft ASP.NET framework did not provide a good solution to this problem. When using the ASP.NET 1.0 framework, you just had to live with this tradeoff between performance and stale data. Fortunately, the Microsoft ASP.NET 2.0 framework includes a new feature called SQL Cache Invalidation that solves this very problem.

In this article, you'll learn about many of the new caching enhancements in the ASP.NET 2.0 framework. First, you'll learn how caching support has been integrated into the new DataSource controls. Next, you'll learn how to configure and take advantage of SQL Cache Invalidation. Finally, we'll take a look at a new control introduced with the ASP.NET 2.0 framework, which enables you to inject dynamic content into a cached page: the Substitution control.
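
To give a feel for the SQL Cache Invalidation feature described above, here is a hedged VB.NET fragment. It assumes the Northwind database and its Products table have already been enabled for notifications and registered in web.config; the cache key and names are placeholders of mine, not the article's code.

Imports System.Web
Imports System.Web.Caching

Public Class ProductCache
    ' Cache an object until the underlying Products table changes in the database.
    Public Shared Sub CacheProducts(ByVal context As HttpContext, ByVal products As Object)
        Dim dependency As New SqlCacheDependency("Northwind", "Products")
        context.Cache.Insert("Products", products, dependency)
    End Sub
End Class

When a row in Products is inserted, updated, or deleted, the dependency is invalidated and the cached entry is dropped, which is how the stale-data problem described above is avoided.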

(.NET - ASP.NET) Changes to the Validation Controls in ASP.NET 2.0

While ASP.NET 1.x supported validating user input, ASP.NET 2.0 increases the flexibility of the validation through the addition of validation groups. This article looks at this new feature, and shows you how you can use it in a number of common scenarios.

ASP.NET 2.0 includes a new feature which enables you to group form controls into distinct validation groups. This article discusses how you can take advantage of validation groups in complex form validation scenarios.

Two common validation scenarios are discussed in this article. First, you learn how to take advantage of validation groups when adding a search box to a Web page. Next, we'll discuss methods for using validation groups with databinding controls such as the GridView control.

(.NET - ASP.NET) New Security Features in ASP.NET 2.0

ASP.NET 2.0 builds on ASP.NET 1.x to enable you to more easily create and manage users, and to password-protect pages in a Web application. The new framework includes new features for working with authentication and authorization, which were designed to appeal to both Web site administrators and developers.

Web site administrators can take advantage of the new Web Site Administration Tool to create new users and roles, and to control access to pages in a Web application. The Web Site Administration Tool is a set of prewritten ASP.NET pages that can be used by individuals with no programming skills to configure a Web application.

Developers can take advantage of the new Login controls in order to quickly build security related pages in a Web application. For example, a developer can create a login page simply by dragging a Login control onto an .aspx page. By taking advantage of the Login controls, a developer can build a login page, a registration page, or a password recovery page without writing any code.

Finally, the ASP.NET 2.0 framework contains new security related features which will appeal to advanced developers. The new Membership API is a set of classes that contains methods for creating and retrieving information about application users. In addition, the new framework contains classes that make it easier to work with custom user roles.
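
As a small, hypothetical illustration of the Membership and Roles APIs the article mentions (the user name, password, and role are made-up values, and the default providers are assumed to be configured):

Imports System.Web.Security

Public Class AccountSetup
    Public Shared Sub CreateManagerAccount()
        ' Create a user through the Membership API instead of hand-written data access code.
        Membership.CreateUser("janedoe", "P@ssw0rd!", "jane@example.com")

        ' Place the new user in a custom role used for authorization checks.
        If Not Roles.RoleExists("Managers") Then
            Roles.CreateRole("Managers")
        End If
        Roles.AddUserToRole("janedoe", "Managers")
    End Sub
End Class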

(.NET - ASP.NET) Introducing the ASP.NET 2.0 GridView and DetailsView Controls

The ASP.NET 2.0 framework introduces two new controls for working with database data: the GridView control and the DetailsView control. You use the GridView control when working with a set of database records. The DetailsView control is used when working with individual records.

The GridView control is the successor to the DataGrid control. While Microsoft ASP.NET 2.0 still includes the DataGrid control, you are encouraged to take advantage of the new features of the GridView control.

The GridView control enables you to perform many of the same tasks as you would previously perform with the DataGrid control. The advantage of the GridView control is that, in many cases, you can perform these tasks without writing any code. The GridView control enables you to:

Display a set of database records.
Sort a set of database records.
Page through a set of database records.
Edit a set of database records.
In addition, unlike the DataGrid control, the GridView control enables you to sort and page through database records without requiring a postback to the server. The GridView control, optionally, uses client-side script to enable you to sort and page database records without performing a form post.

The DetailsView control is an entirely new control introduced with ASP.NET 2.0 that enables you to work with individual database records. You can use the DetailsView control on its own, to display or edit a single database record. When used in conjunction with the GridView control, you can use the DetailsView control to quickly build master/detail forms.

Wednesday, July 07, 2004

(Tidbits) Quality in Life

My wife works for a CMM Level 5 organization. We woke up at 6 AM this morning and things got off to a pretty rough start right away. She asked me, "What time is it, dear?" I told her it was 5 AM on 5/05/2005, to which she looked terribly worried.

"Why?? What is wrong?"

"You see, yesterday we decided to wake up at 5.30 AM and it is only 5 AM now!!"

"So what?" I asked very innocently.

"What kind of an organization do you work for? Quality is in my blood. People working in a CMM level 5 organization will tell you what implications this can have on our day and our life!!. We need to do a CAUSAL ANALYSIS for this blunder right away".

I tried to take the 'knowledge-free common-sense approach' and told her "The reason is very obvious. Power supply went off by 6 AM and the mosquitoes and the Chennai heat / moisture woke us up.. hee.. hee"

"Shut up" she said. "You people are always looking for an excuse, putting the blame on others. Why don't you take some OWNERSHIP and do something about it? OK. The power went off. Did you at least call the electricity board?"

Before I could ask what she had done about it, she shot off to the kitchen to make coffee. I went in to the kitchen with a request "Dear, I know we had decided that today was my turn to cook, but my bike has not been washed for the last 2 weeks as I was busy entertaining guests for the marriage.

Can you please cook today as I get my bike cleaned?". She gave a very considerate expression and said "No probs! Just raise a CHANGE REQUEST in the IHMS and carry on".

"IHMS?? What IHMS ?" I asked. "Oh! You do not know this stuff. I have to teach you everything ... People working in our organization use an 'Integrated Home Management System' to organize our personal lives. You can see it in the PC in the living room. It is pretty straight forward. Even YOU can use it?

BTW, it is even web enabled and you can do this work from your office too".

I had had enough. I simply did what she said without uttering a word. As I sat to have breakfast, she brought some Idly with all the love in the world. I was pleased. "My marriage will not break after all" I thought to myself as I tasted it ... and.. it tasted terrible. I almost broke my teeth trying to take a bite... and asked her "You call this Idly?"

She put this 'I know it all' look and said "I know you people raise such issues. That is why I have a DOCUMENTED PROCEDURE to make idlys. Look .. it has even been reviewed and approved by YOUR Mom... and I have once again documented everything I did this morning... even the quantity of salt I added.. can you find anything wrong with this?" she asked... showing off all her documentation.

"I certainly cannot find anything wrong with your documentation, but can find a lot wrongs in the Idly you have made" I thought to myself and headed for work.

Hardly had I switched on my machine at work when I had to pick up the phone. "I am your wife calling and I have some big news for you". 'Not again' I thought.

"CMM people have come up with a new level. It is called level 6. Our company is planning to be level 6 certified by 06/06/2006. How great .....right??"

"Look my dear wife!" I told her. "I have news for you too. RMG has allocated me to an undisclosed project in an undisclosed location for .....err... indefinite duration..!"

"What??..."

"The only information I could gather was that I have to travel ALONE.."

Tuesday, July 06, 2004

(.NET - VB.NET) Visual Basic Developer Center: Introducing the Visual Basic .NET Power Pack

Visual Basic Developer Center: Introducing the Visual Basic .NET Power Pack

This article discusses the Visual Basic Power Pack, a collection of custom controls that provides enhanced user interface elements to client based applications.

The Visual Basic Power Pack provides a collection of custom controls that you can use to add an extra element of visual appeal to your client-based applications. The controls are designed to be fairly simple to use, both at design-time and runtime. And since the controls ship with the source code, you can extend or change the behavior of the controls as you see fit.

Monday, July 05, 2004

(.NET) .NET Tools: Ten Must-Have Tools Every Developer Should Download Now -- MSDN Magazine, July 2004

.NET Tools: Ten Must-Have Tools Every Developer Should Download Now -- MSDN Magazine, July 2004

This article provides a view of some of the tools available for integration with .NET, along with the URLs where they can be downloaded.

(.NET - ASP.NET) GridView: Move Over DataGrid, There's a New Grid in Town! -- MSDN Magazine, August 2004

An article from MSDN Magazine, it explains the different view controls that are available in ASP.NET, the difference between the DataGrid and the GridView, the programming interfaces of these controls, and other details.

Despite the richness and versatility of its programming interface, the ASP.NET 1.x DataGrid control requires you to write a lot of custom code to handle common operations such as paging, sorting, editing, and deleting data. For example, while the DataGrid control can raise events when the user clicks to save or cancel changes, it doesn't offer much more than that. If you want to store changes to a persistent medium, such as a database, you have to handle the UpdateCommand event yourself, retrieve changed values, prepare a SQL command, and then proceed from there to commit the update.
The reason the DataGrid control limits the raising of events for common data operations is that it's a data source-agnostic control that can be bound to any data object that is enumerable. Implementing data operations such as update or delete would require a direct link with one particular data source. In ASP.NET 1.x, you work around this limitation by writing ADO.NET code that is specific to your application.
ASP.NET 2.0 enhances the data-binding architecture, introducing a new family of components—the data source objects—which act as a bridge between data-bound controls and ADO.NET objects. These source objects promote a slightly different programming model and provide for new features and members. For data reporting purposes, your ASP.NET 2.0 applications should use the newest grid control, the GridView. The familiar DataGrid control is still supported, but it doesn't take full advantage of the specific capabilities of data source components.
The GridView control is the successor to the DataGrid and extends it in a number of ways. First, it fully supports data source components and can automatically handle data operations, such as paging, sorting, and editing, provided its bound data source object supports these capabilities. In addition, the GridView control offers some functional improvements over the DataGrid. In particular, it supports multiple primary key fields and exposes some user interface enhancements and a new model for handling and canceling events.
The GridView comes with a pair of complementary view controls: DetailsView and FormView. By combining these controls, you can easily set up master/detail views using very little code and sometimes no code at all.
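
For contrast with the excerpt's point about hand-written update code, here is a hedged sketch of the kind of ASP.NET 1.x UpdateCommand handler the DataGrid requires; the page, grid, column positions, table, and connection string are all invented for illustration:

Imports System.Data.SqlClient
Imports System.Web.UI
Imports System.Web.UI.WebControls

Public Class EmployeesPage
    Inherits Page

    ' In a real project this control is declared in the .aspx markup.
    Protected WithEvents EmployeesGrid As DataGrid

    Private Sub EmployeesGrid_UpdateCommand(ByVal source As Object, _
            ByVal e As DataGridCommandEventArgs) Handles EmployeesGrid.UpdateCommand

        ' Pull the edited value out of the TextBox rendered in the second column.
        Dim newName As String = CType(e.Item.Cells(1).Controls(0), TextBox).Text
        Dim key As Integer = CInt(EmployeesGrid.DataKeys(e.Item.ItemIndex))

        ' Hand-build and execute the UPDATE statement yourself.
        Dim connection As New SqlConnection("Server=.;Database=Northwind;Integrated Security=SSPI")
        Dim command As New SqlCommand( _
            "UPDATE Employees SET LastName = @Name WHERE EmployeeID = @ID", connection)
        command.Parameters.Add("@Name", newName)
        command.Parameters.Add("@ID", key)
        connection.Open()
        command.ExecuteNonQuery()
        connection.Close()

        ' Leave edit mode; the page would then rebind the grid to show the saved data.
        EmployeesGrid.EditItemIndex = -1
    End Sub
End Class

With the ASP.NET 2.0 GridView bound to a data source control, most of this plumbing disappears.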

Sunday, July 04, 2004

(.NET - VB.NET) Visual Basic: Community

Visual Basic: Community

The new Community page for VB, which lists the different resources and tools available for VB at locations other than Microsoft.

(.NET) Adding Preferential and Random Sorting to MapPoint Web Service Applications

With Microsoft MapPoint Web Service 3.5, you can now sort search results based on the values of specific entity properties. The new SortProperties property satisfies most locator scenarios for sorting search results. Some locator applications, however, such as an insurance agent locator or a hotel finder, require advanced sorting to give preference to particular agents or hotels. These applications may also require that a portion of the search results be presented in random order, so that no preference is given to any result.

This article first describes how to use the SortProperties property, and then illustrates how to do preferential and random sorting, using a hotel-finder application for a travel Web site as an example.

(.NET Architecture Center Home) Dealing with Concurrency: Designing Interaction Between Services and Their Agents (Building Distributed Applications)

Maarten Mullender
Microsoft Corporation


Create and work with software services and service interactions using design principles that will give you several ways to deal with the challenges of keeping data valid when consumers retrieve and work with the data across a network, while the service carries on with other work.

Introduction
In this article I will discuss some of the challenges created by working with services. I define software services as discrete units of application logic that expose message-based interfaces suitable for being accessed across a network. Consumers (which can be client applications or other services) retrieve data from services and work with that data while the service carries on with other work, thus possibly invalidating that data. I will highlight some of the design principles that you can use to deal with such challenges.

I will not try to provide guidance on building offline applications, nor on preparing the client for offline use by pre-populating the local cache. Rather, I will concentrate on designing service interactions.

The Service Model
Typically, services provide both the business logic and the state management relevant to the problem they are designed to solve. When designing services, the goal is to effectively encapsulate the logic and data associated with real-world processes, while making intelligent choices about what to include and what to implement as separate services.

Services are necessarily very protective of the state that they manage, taking great care to authorize both read and write access, and to validate updates against integrity rules. Services are strongholds for the state they manage and are the definitive authorities on how to manipulate that state. They don't allow direct access to their data, nor will they expose their complete internal state. Instead, they provide copies of the data that they maintain. Services may be said to maintain a "healthy distrust" of outsiders seeking access.

(.NET - ADO.NET) Data Access and Storage Developer Center: ADO.NET 2.0 Feature Matrix

ADO.NET 2.0 includes a new base-class provider model, features for all providers, and changes to System.Data.SqlClient. Get an overview of these new features, examples of their use, and a chart of which features are provider-neutral and SqlClient-specific. (14 printed pages)

Contents
The Base-Class-Based Provider Model
Connection Pooling Enhancements
Asynchronous Commands
Bulk Import
Provider Statistics
AttachDbFileName
SQL Server 2005-Specific Features in SqlClient
Conclusion

ADO.NET 2.0 comes with a plethora of new features. This includes a new base-class–based provider model and features that all providers can take advantage of, as well as changes that are specific to System.Data.SqlClient. Because the .NET Framework 2.0 is being released in conjunction with SQL Server 2005, some of these features require SQL Server 2005 to be usable. This article is meant to serve as an overview and roadmap of the new features, give examples of their use, and includes a chart of which features are provider-neutral and which are SqlClient-specific. In future articles in this series, I'll be going over some of the features in greater detail. In addition, there are many new features of the DataSet and friends; these will be covered in future articles.

(.NET - VB.NET) How Long Now?

Duncan Mackenzie describes how to calculate the difference between two dates in Visual Basic .NET, and builds an application that counts down to the release of Halo 2.

Introduction
In the past few days on GotDotNet's forums, I've seen the same question come up in at least 5 different ways:

"How do I figure out how many days, weeks, hours, or minutes there are between two dates?"
People have been pretty quick to answer each question with a snippet of source code, usually showing how to use the TimeSpan class or the DateDiff function, but I thought I'd try to discuss the matter in a more general fashion. In this article, I'm going to first quickly run down the ways you can determine the difference between two dates. Then I'll walk through the creation of a simple "countdown" application (see Figure 1) that you can run on your desktop to give you a running update of the time remaining until a specific event occurs.
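
As a quick illustration of the two techniques the article goes on to cover, TimeSpan subtraction and the classic DateDiff function (the target date below is only an example):

Imports Microsoft.VisualBasic

Module Countdown
    Sub Main()
        Dim release As Date = #11/9/2004#
        Dim current As Date = Date.Now

        ' Approach 1: subtract two dates to get a TimeSpan.
        Dim remaining As TimeSpan = release.Subtract(current)
        Console.WriteLine("{0} days, {1} hours, {2} minutes to go", _
            remaining.Days, remaining.Hours, remaining.Minutes)

        ' Approach 2: the classic Visual Basic DateDiff function.
        Console.WriteLine("{0} whole days to go", DateDiff(DateInterval.Day, current, release))
    End Sub
End Module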

(.NET - VB.NET) Operator Overloading in Visual Basic 2005

Operator Overloading in Visual Basic 2005

Operator overloading is a common feature of object-oriented languages. Though VB was designed as an object-oriented language, operator overloading is new to Visual Basic 2005. It simplifies the use and development of complex types by allowing you to specify your own implementation for standard operations such as addition and subtraction.

This Article from MSDN explains the concept in detail with examples.
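
Here is a minimal sketch of the VB 2005 syntax involved; the Money type is my own made-up example, not one from the article:

Public Structure Money
    Public ReadOnly Cents As Long

    Public Sub New(ByVal amountInCents As Long)
        Me.Cents = amountInCents
    End Sub

    ' Custom implementation of the + operator for Money values.
    Public Shared Operator +(ByVal left As Money, ByVal right As Money) As Money
        Return New Money(left.Cents + right.Cents)
    End Operator

    ' Custom implementation of the - operator.
    Public Shared Operator -(ByVal left As Money, ByVal right As Money) As Money
        Return New Money(left.Cents - right.Cents)
    End Operator
End Structure

With the operators defined, code such as total = price + tax reads naturally instead of calling a helper method.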

Thursday, July 01, 2004

(Tidbits) TakingITGlobal

This is one place where I meet many different individuals, each with a persona of their own. I learn a lot here and have made a whole bunch of new friends.

This is a major community with people from more than 200 countries taking an active part in different activities all over the world.

A must for all those who are interested in developing new friendships and getting to know people all over the world.