Saturday, February 28, 2009

Using EXEC() AT with Oracle DB

My previous blog post was about using OPENQUERY() with an Oracle database. The advantage of that method is that it passes a query through to a linked server, such as an Oracle server, and lets that server do the work. The great benefit of this strategy is much faster performance than running a T-SQL query against an Oracle database object directly from SQL Server.

However, when I tried OPENQUERY the next day, I was disappointed. What I found is that the query passed to OPENQUERY() must be a literal string; no variable or expression is supported. That means I cannot build a query string dynamically with different constraints. This greatly limits the usefulness of OPENQUERY, and I just cannot use it in my projects.

Fortunately, I found another pass-through way to leverage Oracle's power to query data, and this one can pass results back. I spent about a day and a half figuring it out. I posted several questions to the StackOverflow web site but did not get any answer there. Eventually I found the solution through Google searching, and I posted my answer back to StackOverflow as well.

The method is to use "EXEC (...) AT ..." to do the job:

EXEC (@sql) AT linkedOracleServer;

Actually, there are some good examples of using this method. However, all of those examples target Microsoft SQL Server or SQL Server Express. With a pass-through query, the syntax is Oracle's, which is quite different from Microsoft SQL Server's. That's why it took me a while to figure it out.

I can build an Oracle SQL query that combines all the inputs into a query string. That's the easy part. The most challenging issue is passing results back as output parameters. For example, I need to get the total count of rows in an Oracle table with a WHERE clause, and pass the result back to my SQL Server.

Here is what I did:

DECLARE @id INT;
DECLARE @myCount INT;
DECLARE @sql NVARCHAR(MAX);
SET @id = 123;  -- example input value
-- Initialize setting
SET @myCount = NULL;
-- Build the query as an Oracle PL/SQL anonymous block
SET @sql = N'BEGIN
  SELECT COUNT(*) INTO :myCount
  FROM owner.myTable
  WHERE id = ' + CAST(@id AS VARCHAR(20)) + N';
END;';
EXEC (@sql, @myCount OUTPUT) AT linkedOracleServer;
IF (@myCount IS NOT NULL)
    PRINT 'I get the count from my Oracle table: ' + CAST(@myCount AS VARCHAR(20));
ELSE
    PRINT 'It must be a query syntax error';

The key points for building an Oracle query with an output parameter are:
  • make sure the query is a valid Oracle PL/SQL statement, such as SELECT ... INTO ..., so the result is assigned to a variable or parameter;
  • use ":" before a parameter name;
  • wrap the query in an anonymous block (BEGIN ... END); and
  • append ";" after the END keyword.
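Following the same pattern, the four points above can be combined in one more sketch. The table, columns, and server name here are hypothetical, and the input value is concatenated in as a literal, as in the example earlier:

```sql
-- Sketch: fetch a single value from a hypothetical Oracle table
-- owner.customers through the linked server linkedOracleServer.
DECLARE @custId INT;
DECLARE @lastName VARCHAR(100);
DECLARE @sql NVARCHAR(MAX);
SET @custId = 1234;
-- The anonymous block assigns into the :lastName output parameter.
SET @sql = N'BEGIN
  SELECT last_name INTO :lastName
  FROM owner.customers
  WHERE customer_id = ' + CAST(@custId AS VARCHAR(20)) + N';
END;';
EXEC (@sql, @lastName OUTPUT) AT linkedOracleServer;
PRINT @lastName;
```

Note the ";" inside the string after END, plus the normal T-SQL ";" that terminates the SET statement itself.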

One lesson I learned from this exercise is that exceptions raised by "EXEC () AT" may carry misleading error messages. For example, when I used "@" for a parameter, or missed the ";" after END, I got something like "illegal characters in expression". You have to look at your query from all different angles.


Wednesday, February 25, 2009

Using OPENQUERY for Oracle DB

Today I googled an alternative way to run an Oracle PL/SQL query or stored procedure from Microsoft SQL Server 2005, and I am very impressed by the result.

What I mean by an alternative way is that I have been running T-SQL queries directly against linked servers defined in Microsoft SQL Server. A linked server is a server added under Server Objects -> Linked Servers with the Microsoft SQL Server Management Studio tool.

The linked servers that Microsoft SQL Server supports cover a variety of database products, including Oracle. With T-SQL queries, you can easily query data from any database object, such as a table or view on that server, in the same way as querying a local table or view, as in the following example:

SELECT COUNT(*) AS MyCount
FROM myOracleServer..owner.tableName
WHERE ...;

where myOracleServer is a linked server pointing to an Oracle server.
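For reference, a linked server like myOracleServer can also be created in T-SQL instead of through Management Studio. A minimal sketch, where the TNS alias and the Oracle login are placeholders:

```sql
-- Create a linked server using the Oracle OLE DB provider.
EXEC sp_addlinkedserver
    @server = N'myOracleServer',
    @srvproduct = N'Oracle',
    @provider = N'OraOLEDB.Oracle',
    @datasrc = N'MyTnsName';        -- TNS alias, placeholder

-- Map local logins to an Oracle account (placeholders).
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'myOracleServer',
    @useself = N'False',
    @rmtuser = N'oracle_user',
    @rmtpassword = N'oracle_password';
```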

One problem that troubles me is that when a table on the Oracle side is very big (millions of rows), a query may take a while to execute, sometimes more than a minute for a single call. I have had cases where updating data from my Microsoft SQL Server to an Oracle server took more than 10 hours, with thousands of T-SQL calls (checking, updating, and inserting data).

The alternative way I found today is to use OPENQUERY call like this:

SELECT *
FROM OPENQUERY(myOracleServer,
    N'SELECT COUNT(*) AS MyCount FROM owner.tableName WHERE ...');

The execution time was 00:00:00, compared to 00:01:09 for the similar direct T-SQL query above. It was stunning to see the result come back right away.
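Because OPENQUERY acts like a table source, the pass-through result can also feed local tables directly; a sketch, where dbo.oracleCounts is a hypothetical local table:

```sql
-- Store the remote count locally; dbo.oracleCounts is hypothetical.
INSERT INTO dbo.oracleCounts (tableName, cnt, checkedAt)
SELECT 'owner.tableName', MyCount, GETDATE()
FROM OPENQUERY(myOracleServer,
    N'SELECT COUNT(*) AS MyCount FROM owner.tableName');
```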

I found this information in a discussion on the Bytes web forum, where one person recommended OPENQUERY in a Google Groups discussion. From there I worked out the solution I needed to improve my current T-SQL.

The reason for such a big difference, as I understand it, is that SQL Server may create a temporary table or allocate a cache for the database object referenced through a linked server in a T-SQL query, while OPENQUERY is just an OLE DB call that hands the SQL query to the linked server, so the Oracle server does the job with its full power.


Sunday, February 22, 2009

T-SQL Tip: Combine Column Names as a String

I found the following T-SQL query to build a string of selected column names, separated by commas, for a table. The string can then be used as a column list in another SQL query.

The first trick is to use the COALESCE function to combine a list of row values into a string, appending ',' before each value when the accumulated string is not NULL, or '' when it is.

The second trick is to use information_schema.columns to get the column names. Note that table_name and column_name are the columns of this system view holding the table and column names.

DECLARE @v_FieldList NVARCHAR(MAX);
-- Use COALESCE to combine rows into a string
SELECT @v_FieldList = COALESCE(@v_FieldList + ',', '') + column_name
FROM (
    -- Use information_schema to get the table's column names
    SELECT column_name FROM information_schema.columns
    WHERE table_name = 'myTable' AND
          column_name NOT LIKE 'keyCol' -- filter column 'keyCol' out
) AS A;
PRINT 'Field list: ' + @v_FieldList;
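Once built, the list can be spliced into a dynamic query; a small sketch, assuming @v_FieldList has been populated as above:

```sql
-- Run a SELECT that uses the combined column list.
DECLARE @dynSql NVARCHAR(MAX);
SET @dynSql = N'SELECT ' + @v_FieldList + N' FROM myTable;';
EXEC sp_executesql @dynSql;
```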

You can get more detailed information, such as data type, length, and so on, from information_schema.columns:
SELECT column_name, *
FROM information_schema.columns
WHERE table_name = 'myTable';


Tuesday, February 17, 2009

jQuery Tutorial Videos

Today Ajaxian posted a group of tutorial videos on jQuery. I just finished the first one, by John Resig, the designer and developer of jQuery. I have seen a number of his training videos. The first one in Ajaxian's blog post is for the entry level:

I am not sure what is used for this embedded video, Adobe Flash perhaps? The quality of the video is quite good compared to similar ones on

By the way, the editor John used in the video is SubEthaEdit, an interesting editor that lets multiple users work on one text file.


Thursday, February 12, 2009

Vimperator Tip of the Day: xall Command

In my Firefox preferences, I have set up my tab settings to save all the tabs as a session when the browser is closed and to reopen the session when it starts.

However, sometimes this feature does not work. I occasionally get a blank tab and lose my previous tabs. I don't know why.

Today I found a Vimperator command to resolve the issue: the xall command. Just type :x and then Tab, and the command is available for use. The description of this command is:

    Save the session and quit

I just need to remember to use this command to close my Firefox from now on. I have added this key to My Favorite Vimperator Keys.

The Power of Muscle Memory with Vimperator!


Tuesday, February 10, 2009

Command Line Tool: curl

Today I watched a video on the Atom Publishing Protocol at GoogleDevelopers. Reading further on this topic, I found an IBM article: Getting to know the Atom Publishing Protocol, Part 1.

One thing that got my attention is the command line tool curl. It is available in my Mac terminal; that's a benefit of the Unix world. According to the manual page for curl (man curl):

curl is a tool to transfer data from or to a server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, DICT, TELNET, LDAP, or FILE). The command is designed to work without user interaction.

That's a great tool. I like it because I can interact with a web server from the command line. For example, I tried this command to get a feed from a server:

curl -s -X GET

I immediately got the returned strings back in the terminal. I tried to use Firefox to open the feed but could not get the result back in the browser; Firefox prompted me to add it to my bookmarks. With curl, I can get the result back for analysis. It's a great tool for testing and verifying an Atom Publishing server, or any web server.
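As a side note, the FILE protocol from the man page's list makes it easy to try curl without any network at all; a small sketch (the file path and contents are arbitrary):

```shell
# Write a tiny Atom-like document, then fetch it with curl over file://
printf '<feed><title>demo feed</title></feed>' > /tmp/demo.atom
curl -s file:///tmp/demo.atom
```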

I don't think the Windows cmd terminal has this tool available. There might be something similar in PowerShell.


Saturday, February 07, 2009

Caching Data and IFactory

Recently I have been working on a project to extract data from a huge database server for a daily reporting service. The process generates data in a required format based on complicated business logic.

The structure of this project contains a Repository service as a gateway providing interfaces to get domain objects as domain classes. To retrieve data, I use an IFactory pattern:

interface IFactory<T>
{
    T CreateObject(IDBService db);
}

where IDBService is an interface providing methods to get data from the database by using SQL.

The only problem with this strategy is that the process is too slow. It constantly issues SQL queries against the database, thousands of calls in a run.

I need a way to cache data at the Repository to reduce those SQL calls. Since the amount of data is not predictable, I have to limit the cache. I use a maximum cache count in the application configuration, 10,000 for example. If the count of required data is below this limit, the Repository caches all the data for later use. Otherwise, I fall back to fetching one item at a time.

The more I got into this caching feature, the better the process performed. With a maximum caching number, I reduced the SQL calls down to 8, and the run time from hours to minutes, or even under 1 minute! That's a great improvement.

For example, I created two factory classes for retrieving data:

public class SpectInfoListFac : IFactory<List<SpectInfo>>
{
    public List<SpectInfo> CreateObject(IDBService db) { ... }
}

public class SpectInfoFac : IFactory<SpectInfo>
{
    public SpectInfo CreateObject(IDBService db) { ... }
}

For caching, I use the first factory to get a whole list of data back when there is not too much of it. The constructor of the factory takes information about the data and the date range, as well as the maximum count. If there is too much data, over the limit, the factory returns null, and I use the second factory class to get one specific item.

However, what about different levels of caching? Take finding people as another example: if I want to find people named like "David Chu", male, from Canada, I may get too many. I could try one more condition, province = "AB", then city = "Calgary", then district area in "NW"... This would result in too many factory classes. Can I combine all of these into one?

Here I changed the IFactory interface to a more generic way:

interface IFactory<T>
{
    IEnumerable<T> CreateObject(IDBService db, int level);
}

public class SpectInfoFac : IFactory<SpectInfo>
{
    public IEnumerable<SpectInfo> CreateObject(IDBService db, int level) { ... }
}

There is only one method to create or retrieve objects. The level parameter controls the granularity of the data range; it is up to the implementing factory class to decide what each level means. For example, 0 for all, 1 for AB, 2 for Calgary...

The result is an enumerable collection. It could be a collection with many items, a single specified item, or null when nothing is found.

I still have options: various implementations of the IFactory class, or one implementation to cover all the cases. My simplified factory pattern provides a caching option.
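One way the level-based factory might be wired to a caching repository. This is only a sketch under assumptions: IFactory and IDBService are restated from the post, while Repository, its fields, and the caching rule are my illustration of the idea, not the project's actual code:

```csharp
using System.Collections.Generic;
using System.Linq;

public interface IDBService { /* data-access methods, per the post */ }

public interface IFactory<T>
{
    IEnumerable<T> CreateObject(IDBService db, int level);
}

// Hypothetical repository: cache a level's whole result set when it is
// small enough, otherwise return it uncached and query again next time.
public class Repository<T>
{
    private readonly IFactory<T> factory;
    private readonly IDBService db;
    private readonly int maxCacheCount;   // e.g. 10,000 from app config
    private List<T> cache;                // null until a level is cached

    public Repository(IFactory<T> factory, IDBService db, int maxCacheCount)
    {
        this.factory = factory;
        this.db = db;
        this.maxCacheCount = maxCacheCount;
    }

    public IEnumerable<T> Get(int level)
    {
        if (cache == null)
        {
            var items = factory.CreateObject(db, level).ToList();
            if (items.Count > maxCacheCount)
                return items;             // too big: do not cache
            cache = items;                // small enough: keep it all
        }
        return cache;
    }
}
```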

I love refactoring my code!


Sunday, February 01, 2009

Vimperator Tip of the Day: Show Sidebar Window

If you have not set wildoptions to auto in your .vimperatorrc (for Mac) or _vimperatorrc (Windows), you may type the command to enable command auto-completion. Read the details in my WordPress blog ....