When Azure was first announced at the PDC in 2008, Microsoft wasn’t a recognized player in the cloud industry. It was the underdog to the giants Google and Amazon, which had been offering cloud services for years by that time. Building and deploying Azure was a big bet for Microsoft and a major change in the company’s direction, a shift from where Microsoft had been to where it needed to go in the future.
Up until that time, Microsoft had been a product company. It designed and built a product, burnt it to CD, and sold it to customers. Over time the product was enhanced, but it was still installed and operated in the client’s environment. The trick was to build the right product at the right time, for the right market. With the addition of Ray Ozzie to the Microsoft culture, there was a giant shift toward services. Microsoft wasn’t abandoning the selling of products, but it was expanding its expertise and portfolio to offer its products as services. Every product team at Microsoft was asked whether what it was doing could be enhanced and extended with services. Microsoft wanted to do much more than just put Exchange in a data center and rent it to customers. This became a fundamental Continue Reading

Since the growth of the web, distributed models have become the de facto standard of the corporate software industry. N-tiered models allowed for the simple decoupling of layers, both physically and conceptually. This is so partly because of the frameworks available to support each layer’s technical purpose, and partly because of the lower cost of the hardware needed to house these systems. The web model took a firm architectural hold in the early 2000s. Even systems requiring or preferring thick client installations shifted toward an application server to handle business rules external to the database. Yet as these distributed models grew, another technical dilemma surfaced: interoperability.

The world was smaller now, and systems were being developed with communication in mind. Systems were expected to transmit data between servers and PCs to perform even the simplest of business tasks. Businesses increased the number of applications they built to support business needs, and those applications now needed to communicate with each other to share logic and services. Even the exchange of data had become more complicated, as each application required a number of dedicated data servers for redundancy and high availability. Which server was the right one to use? How would a process running on server X know how to access logic from a class on server Y? The answer was the creation of enterprise middleware.

Enterprise middleware can be considered any model Continue Reading

Integrating WCF with WF

Posted: May 2, 2011 in 4.0, DotNet, Microsoft, WCF, WF

Workflows are presented in software development as an effective tool for solving problems and accomplishing business goals: a problem is split into smaller, more manageable pieces that are then coordinated into a single process. Workflows also bring reusability and maintainability, two core quality principles in software engineering, because those smaller pieces can be reused across several processes and are easier to maintain. These pieces, or reusable assets, are called activities, and they usually describe the work performed by the people or software involved in the process. When designing workflows within an enterprise application using a bottom-up approach, the workflows are first built as a set of primitive activities that control the execution flow, communicate with the outside world, or execute actions that change the workflow’s internal state. These workflows can later be converted to activities and used as compositional units for creating other workflows that coordinate work at a higher level. In this way, the design process can be applied recursively to as many levels of abstraction as we find necessary.
The same design paradigm can be applied in the service-oriented world. A service can be implemented through Continue Reading
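To make the bottom-up composition concrete, here is a minimal sketch assuming .NET 4.0 and the System.Activities API; the GreetCustomer activity and its Customer argument are illustrative names invented for the example, not part of any shipped library. A primitive activity is authored once and then composed into a higher-level workflow.

    using System;
    using System.Activities;
    using System.Activities.Statements;

    // A primitive, reusable activity (hypothetical example).
    public sealed class GreetCustomer : CodeActivity
    {
        public InArgument<string> Customer { get; set; }

        protected override void Execute(CodeActivityContext context)
        {
            // The unit of work this activity contributes to the process.
            Console.WriteLine("Hello, " + Customer.Get(context));
        }
    }

    public static class Program
    {
        public static void Main()
        {
            // The primitive activity is composed with built-in activities
            // into a higher-level workflow.
            Activity workflow = new Sequence
            {
                Activities =
                {
                    new GreetCustomer { Customer = "Contoso" },
                    new WriteLine { Text = "Order process started." }
                }
            };

            WorkflowInvoker.Invoke(workflow);
        }
    }

The same Sequence could later be packaged as an activity and reused inside a larger workflow, or hosted as a service with WorkflowServiceHost, which is where the integration of WCF with WF comes in.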

The .NET Platform

Posted: May 1, 2011 in DotNet, Microsoft

The .NET platform is Microsoft’s development platform for Windows and the cloud. This includes support for desktop applications, smart device applications, server and database applications, web applications and services, and cloud applications deployed to Windows Azure.
In addition, the technology is available in more exotic corners of computer science, ranging from operating system implementation (for example, Microsoft Research’s Singularity research operating system) to gaming with the XNA Framework and robotics with Microsoft Robotics Studio. The figure below gives an overview of the .NET platform.

Let’s point out the key features and advantages Continue Reading >

Business Logic Layer

The business logic layer is an abstract layer that contains all the rules, workflows, and validation that define and deal with the business complexities your software has been designed to meet. Typically, the business logic layer will fit between your user interface, service, or presentation layer and your data access layer. By separating the business logic from the other layers in your application, you separate your concerns, keep your code loosely coupled, and allow different implementations to be used with the business logic layer; for example, changing the UI from a web application to a WPF application.

WPF / WinForms / Web Forms / Other Services
-------------------------------------------
Service Layer
-------------------------------------------
Business Logic Layer
-------------------------------------------
Data Access Layer
-------------------------------------------
Data Sources
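As a rough illustration of that separation, here is a minimal sketch; the IProductCatalog interface and InMemoryProductCatalog class are hypothetical names made up for the example. The presentation layers at the top of the diagram depend only on the business logic contract, so the UI can change without touching the rules behind it.

    using System;
    using System.Collections.Generic;

    // Contract exposed by the business logic layer (hypothetical example).
    public interface IProductCatalog
    {
        IEnumerable<string> FindProducts(string nameFilter);
    }

    public sealed class InMemoryProductCatalog : IProductCatalog
    {
        private readonly List<string> products =
            new List<string> { "Keyboard", "Monitor", "Mouse" };

        public IEnumerable<string> FindProducts(string nameFilter)
        {
            // A business rule: reject an empty filter instead of
            // returning the whole catalog.
            if (string.IsNullOrEmpty(nameFilter))
                throw new ArgumentException("A filter is required.", "nameFilter");

            return products.FindAll(p =>
                p.IndexOf(nameFilter, StringComparison.OrdinalIgnoreCase) >= 0);
        }
    }

    public static class Program
    {
        public static void Main()
        {
            // Any front end (web, WPF, WinForms, service) consumes the same contract.
            IProductCatalog catalog = new InMemoryProductCatalog();
            foreach (string product in catalog.FindProducts("mo"))
                Console.WriteLine(product);
        }
    }

Swapping the web UI for a WPF client, or the in-memory implementation for one backed by the data access layer, requires no change to the callers, because they know only the interface.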

Depending on the complexities of the business model that your application has been designed for, the structure of your business logic layer may differ considerably. For instance, a complex banking application will have a rich Domain Model that represents the real banking domain, with hundreds of small business entities representing loans, customers, and accounts. For a simpler application, such as a blogging engine, you may have a number of fairly simple business objects that map very closely to your underlying Data Model, with little or no business logic, simply acting as a means to add and retrieve data. In the next section, you will look at the various patterns at your disposal for structuring your business logic layer.

Patterns for Your Business

There are a number of design patterns that you can follow to help you organize your business logic. In this section, you will be introduced to three of the main patterns found in the business logic layer: Transaction Script, Active Record, and the Domain Model pattern.

Transaction Script

Transaction Script is a very simple business design pattern that follows a procedural style of development rather than an object-oriented approach. You simply create a single procedure for each of your business transactions, with each procedure containing all of the business logic required to complete the transaction, from the workflow, business rules, and validation checks to persistence in the database.
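As a rough sketch of the pattern (the ApplyDiscount procedure, its Orders table, and its SQL are assumptions made up for the example, not a recommended production design), a single method carries the whole transaction from validation through business rules to persistence:

    using System;
    using System.Data.SqlClient;

    public static class OrderScripts
    {
        // One procedure = one business transaction (Transaction Script).
        public static void ApplyDiscount(string connectionString, int orderId, decimal percent)
        {
            // Validation check.
            if (percent <= 0m || percent > 50m)
                throw new ArgumentOutOfRangeException("percent", "Discount must be between 0 and 50.");

            // Business rule and persistence live in the same procedure.
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "UPDATE Orders SET Total = Total * (1 - @Percent / 100) WHERE OrderId = @OrderId",
                connection))
            {
                command.Parameters.AddWithValue("@Percent", percent);
                command.Parameters.AddWithValue("@OrderId", orderId);

                connection.Open();
                if (command.ExecuteNonQuery() == 0)
                    throw new InvalidOperationException("Order " + orderId + " was not found.");
            }
        }
    }

Everything the transaction needs lives in this one script, which is what makes the pattern easy to follow for small systems and increasingly repetitive as the number of transactions grows.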
One of the strengths of the Transaction Script pattern is that it is very simple to understand and clear to see what is going on without any prior knowledge of the pattern. If a new business case needs to be handled, it is straightforward Continue Reading >

Preprocessor Directives C# 4.0

Posted: April 15, 2011 in DotNet
What Are Preprocessor Directives?
The source code specifies the definition of a program. The preprocessor directives instruct the compiler how to treat the source code. For example, under certain conditions, you might want the compiler to ignore portions of the code, and under other conditions, you might want that code compiled. The preprocessor directives give you those options and several others.

In C and C++ there is an actual preprocessor phase, in which the preprocessor goes through the source code and prepares an output stream of text to be processed by the subsequent compilation phase. In C# there is no actual preprocessor. The “preprocessor” directives are handled by the compiler. The term, however, remains.
Preprocessor List (a short usage sketch follows the list):

#define identifier: Defines a compilation symbol

#undef identifier: Undefines a compilation symbol

#if expression: If the expression is true, compiles the following section

#elif expression: If the preceding #if or #elif expression was false and this expression is true, compiles the following section

#else: If the previous #if or #elif expression is false, compiles the following section

#endif: Marks the end of an #if construct

#region name: Marks the beginning of a region of code; has no compilation effect

#endregion name: Marks the end of a region of code; has no compilation effect

#warning message: Displays a compile-time warning message

#error message: Displays a compile-time error message

#line indicator: Changes the line numbers displayed in compiler messages

#pragma text: Specifies information about the program context
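Here is a short, self-contained sketch showing several of these directives working together; the TRIAL and BETA symbols are arbitrary examples.

    #define TRIAL              // defines the TRIAL compilation symbol for this file
    using System;

    class Program
    {
        static void Main()
        {
    #if TRIAL
            Console.WriteLine("Trial build");
    #elif BETA
            Console.WriteLine("Beta build");
    #else
            Console.WriteLine("Release build");
    #endif

    #warning Remember to remove the TRIAL symbol before shipping.

            #region Console output
            Console.WriteLine("Done");
            #endregion
        }
    }

Compiling this file takes the #if TRIAL branch and emits the compile-time warning; removing the #define line, or adding #undef TRIAL immediately after it, switches the build to the #else branch.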

The #define and #undef Directives

A compilation symbol is an identifier that has only two possible states. It is either defined or undefined. A compilation symbol has the following characteristics:  Continue Reading >

Business intelligence (BI) refers to computer-based techniques used in identifying, extracting, and analyzing business data, such as sales revenue by products and/or departments, or by associated costs and incomes.
BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies are reporting, online analytical processing, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.
Business intelligence aims to support better business decision-making; thus a BI system can be called a decision support system (DSS). Though the term business intelligence is sometimes used as a synonym for competitive intelligence, because they both support decision making, BI uses technologies, processes, and applications to analyze mostly internal, structured data and business processes, while competitive intelligence gathers, analyzes, and disseminates information with a topical focus on company competitors. Business intelligence, understood broadly, can include the subset of competitive intelligence (the preceding definition is adapted from Wikipedia).

Business Challenges:

- Data is stored in a number of different systems on different platforms, such as inventory and logistics in SAP, financials in Oracle, web analytics in SQL Server, and manufacturing on the mainframe. This can make data difficult to get to, require multiple accounts for access, and keep teams and departments at arm’s length from each other.

- Pockets of knowledge about the data are spread throughout teams that don’t regularly interact. This spread causes data to be analyzed in different ways and metrics to be calculated inconsistently, which, in turn, leads to unpredictable analysis and inappropriate actions being taken based on the data.

- Documentation is limited or nonexistent. Many times documentation is not created for reports or the metadata underneath them, and this lack of documentation is a critical problem. If you don’t know where the data for a report is coming from, or how certain metrics are being calculated, then you can’t truly understand or communicate the value of the data and calculations.

- Consolidated reporting is very time-consuming, when it is possible at all. With reports coming from so many different places, you run into the same problems mentioned in the previous point. These challenges require more people to know different reporting systems, lead to more administrative headaches, and so on.

- Reporting teams spend significant time finding and aligning data instead of analyzing and mining it for actionable information. If reporting teams constantly need to go out and gather data from across the company, that doesn’t leave much time for analyzing and interpreting the data. These challenges cause many reporting teams to rework large portions of their reports several times, instead of spending that time understanding what users are asking for and delivering more actionable information.

How Intelligent Is Your Organization?

Business Intelligence (BI) is a term that encompasses the process of getting your data out of the disparate systems and into a unified model, so you can use the tools in the Microsoft BI stack to analyze, report, and mine the data. Once you organize your company’s data properly, you can begin to find information that will help you make actionable reports and decisions based on how the data from across your organization lines up. For instance, you can answer questions like, “How do delays in my manufacturing or distribution affect my sales and customer confidence?” Answers like this come from aligning logistics data with sales and marketing data, which, without a Business Intelligence solution, would require you to spend time exporting data from several systems and combining it into some form that you could consume with Excel, or another reporting tool.

Business Intelligence systems take this repetitive activity out of your life. BI automates the extract, transform, and load (ETL) process and puts the data in a dimensional model that sets you up to use cutting-edge techniques and everyday tools like Microsoft Excel to analyze, report on, and deliver results from your data.

Getting Intelligence from Data

How do you get information from data? First, you need to understand the difference. As you learned earlier, data can come from many different places, but information requires context and provides the basis for action and decision-making. Identifying your data, transforming it, and using the tools and techniques you learn here will enable you to turn the mass of data your organization stores into actionable information. There are several ways to transform your data into actionable information, and each has its pros and cons.
Typical solutions for reporting include a few different architectures:

Departmental reporting: In many organizations, each department maintains its own reporting environment. This situation leads to a significant increase in licensing costs, since using different vendors for each department and reporting environment increases spending on hardware and software licensing, end-user training, and ramp-up time.

Individual data access: Some organizations find it easier to grant lots of individual users access to the data. This is not only dangerous from a security perspective, but also likely to lead to performance problems, because users are not the most adept at writing their own queries. Also, with all the industry and federal compliance and regulation governing data access, widespread access can quickly lead to a security audit failure, especially in a publicly held company.

BI add-on from each vendor: When teams seek out and apply different strategies, it exacerbates the original problem of data being scattered across the organization. The data will still be segmented, and the analysis of it will be inconsistent, applied according to each team’s individual understanding of how its data fits into the enterprise rather than a correct view based on organizational goals.

Automated reports from different systems: Automated reports may be nice to get, and they likely serve a purpose, but they usually cannot be counted on to run an enterprise.
Strategic reporting, dashboard drill-through, and detailed analysis require a BI implementation to support them and provide the “at your fingertips” data and analysis that your end users, managers, and executives are craving.

You have likely seen some form of all of these problems in your organization. These are the opposite of what you want to accomplish with a great BI infrastructure.

BI to the Rescue

A well-thought-out BI strategy will mitigate the problems inherent to each of the previously listed approaches. A good BI approach should provide the targeted departmental reporting that is required by those end users while adjusting the data so it can be consumed by executives through a consolidated set of reports, ad hoc analysis using Excel, or a SharePoint dashboard. Business Intelligence provides a combination of automated reporting, dashboard capabilities, and ad hoc capabilities that will propel your organization forward.

BI provides a single source of truth that can make meetings and discussions immediately more productive.
How many times have you gotten a spreadsheet via e‑mail before a meeting and shown up to find that everyone had his or her own version of the spreadsheet with different numbers? Business Intelligence standardizes organizational calculations, while still giving you the flexibility to add your own and enhance the company standard. These capabilities allow everyone to speak the same language when it comes to company metrics and to the way the data should be measured across the enterprise or department.
Integrating Business Intelligence with your organization’s current reporting strategy will improve the quality of the data as well as the accuracy of the analysis and the speed at which you can perform it. Using a combination of a data warehouse and BI analytics from Analysis Services and Excel, you can also perform in-depth data mining against your data. This enables you to utilize forecasting, data-cluster analysis, fraud detection, and other great approaches to analyze and forecast actions. Data mining is incredibly useful for things like analyzing sales trends, detecting credit fraud, and filling in empty values based on historical analysis. This powerful capability is delivered right through Excel, using Analysis Services for the back-end modeling and mining engine.

BI = Business Investment

A focused Business Intelligence plan can streamline the costs of reporting and business analytics. The Microsoft BI stack does a great job of providing you with the entire tool set for success within SQL Server Enterprise Edition. We provide more details on that shortly, but the most important bit of information you should take away right now is that the cost of managing multiple products and versions of reporting solutions to meet departmental needs is always higher than the cost of a cohesive strategy that employs one effective licensing policy from a single vendor. When teams cannot agree on a data strategy, you need to bring them together for the good of the organization. In the authors’ experience, this single, cohesive approach to reporting is often a gateway to a successful BI implementation. Realizing the 360-degree value of that approach and seeing what it can do for your organization are the two most important first steps.

Microsoft’s Business Intelligence Stack

Microsoft’s Business Intelligence stack comes with SQL Server and is greatly enhanced with the addition of SharePoint Server.
We will discuss SQL Server Integration Services (SSIS) in upcoming articles; meanwhile, you have the topic and can research it more deeply on your own. Stay tuned…