An Alternative to Stored Procedure in Big Data Computation

Easier than stored procedure

As we know, the stored procedure is designed to handle computations involving complex business logic.

In the past, data structures and business logic were simple enough that a single SQL statement could achieve the user's computational goal. With the rapid growth of the information industry, users increasingly find they must meet ever more complex computational goals to outperform their competitors. For such computations, SQL alone is far from enough. Database programmers also need conditional and loop statements, multi-level branching, and finer-grained data traversal, as well as a way to decompose an obscure goal into several clear, actionable steps connected by complex logic. It was to meet these demands that the stored procedure was introduced.

The stored procedure has since become the preferred tool for complex data computations and plays quite an important role. However, stored procedures still cause various inconveniences: many operations are awkward to express, they are hard to debug and migrate, and some databases offer only rather weak stored procedure support. These problems can seriously hurt the productivity of database developers.

The awkward aspects of stored procedures include incomplete step-by-step computation, weak support for set-oriented computing, no sequence numbers for data sets, and no object-reference mechanism. Let's look at a simple example: find the "top N best-selling products in every state" by analyzing the regional sales table. In this scenario, those inconveniences make the stored procedure quite complex to write.

01 create or replace package salesPkg
02 as
03   type salesCur is ref cursor;
04 end;
05 CREATE OR REPLACE PROCEDURE topPro(io_cursor OUT salesPkg.salesCur)
06 is
07   strCreate varchar2(2000);
08   tb_count integer;
09 BEGIN
10   select count(*) into tb_count from dba_tables where table_name='TOPPROTMP';
11   if tb_count=0 then
12     strCreate:='CREATE GLOBAL TEMPORARY TABLE TOPPROTMP (
           stateTmp NUMBER not null,
           productTmp varchar2(10) not null,
           amountTmp NUMBER not null
         )
         ON COMMIT PRESERVE ROWS';
13     execute immediate strCreate;
14   end if;
15   execute immediate 'truncate table TOPPROTMP';
16   insert into TOPPROTMP(stateTmp,productTmp,amountTmp)  -- filter duplicates and load the temporary table
     select state,product,amount from stateSales a
     where not(
       (a.state,a.product) in (
         select state,product from stateSales group by state,product having count(*)>1
       )
       and rowid not in (
         select min(rowid) from stateSales group by state,product having count(*)>1
       )
     )
     order by state,product;
17   OPEN io_cursor for
18   SELECT productTmp FROM (  -- products ranked in the top 10 in every state
       SELECT stateTmp,productTmp,amountTmp,rankorder
       FROM (SELECT stateTmp,productTmp,amountTmp,
                    RANK() OVER(PARTITION BY stateTmp ORDER BY amountTmp DESC) rankorder
             FROM TOPPROTMP
       )
       WHERE rankorder<=10 order by stateTmp
     )
     GROUP BY productTmp
     HAVING COUNT(*)=(SELECT COUNT(DISTINCT stateTmp) FROM TOPPROTMP);
END;

Here, the code at line 16 filters out duplicates and writes the de-duplicated data to a temporary table. Since it is difficult to retrieve the distinct rows directly, the trick is to find the duplicate rows first, then use "not" to reverse the condition; what remains is the distinct data. Implementing this requires embedding two sub-queries.

Line 18, for another example, finds the products that rank among the top 10 in every state. First, use the window function to compute each product's rank (rankorder) within its state; second, keep the top 10 products of each state; last, find the products that appear in the top 10 of every state. SQL provides no function for intersecting these per-state sets, so another trick is needed: group by product and check whether the number of times a product appears equals the number of states; if it does, that product ranks among the top 10 in every state.
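To make the counting trick concrete, here is a minimal sketch of the same check outside SQL. It is illustrative only; top10_per_state is an assumed structure mapping each state to the set of its top-10 products, and the sample figures are hypothetical.

# Hypothetical input: state -> set of its top-10 products.
top10_per_state = {
    "Ohio":  {"pen", "ink", "paper"},
    "Texas": {"pen", "paper", "glue"},
}

def products_in_every_state(top10_per_state):
    # Count, per product, how many states list it in their top 10.
    counts = {}
    for products in top10_per_state.values():
        for p in products:
            counts[p] = counts.get(p, 0) + 1
    # A product counted once per state appears in the top 10 of every state;
    # this mirrors the HAVING COUNT(*) = COUNT(DISTINCT stateTmp) check at line 18.
    return {p for p, c in counts.items() if c == len(top10_per_state)}

print(products_in_every_state(top10_per_state))  # {'pen', 'paper'} (order may vary)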

Besides these awkward operations, weak debugging support is another inconvenience of stored procedures.

Although Oracle, DB2, and other databases offer debuggers for their stored procedures, those debug functions are incomplete. When running a stored procedure, no matter how long a SQL statement is, and no matter how many nested loops or computational steps it contains, programmers can only view the result of the whole statement; the intermediate process is completely invisible to them. This largely defeats the purpose of step-by-step debugging and undermines the value of "run to cursor" or "step over". In practice, programmers can only inspect cursors and simple variables. Such variables are useful, but nowhere near as important as the intermediate stages of the SQL itself. Another trouble is that a lot of setup and preparation is required just to launch the debugging tools; beginners can hardly manage it without guidance.

The third inconvenience is that stored procedures are hard to migrate. Generally speaking, SQL can be migrated with a few simple modifications: despite slight differences in syntax details, the SQL dialects of the various vendors are all supersets of the ANSI standard. The stored procedure is quite another matter. Migrating a stored procedure is often tantamount to rewriting it, because the vendors' procedural dialects differ greatly. In this situation, users have little choice but to stick rigidly to one database vendor, leaving no room to negotiate if that vendor overcharges them for upgrading servers, storage, and user licenses.

SQL is an essential feature of any database, while the stored procedure is not. Some databases offer only relatively poor stored procedures, and others do not provide them at all. Take MySQL's stored procedures for example: their functionality and performance are worse than those of MS SQL Server, Oracle, and some other databases, and MySQL may throw many exceptions under intensive concurrency. SQL Server Compact, SQLite, Hive, Access, and other databases do not support stored procedures at all.

Obviously, the inconveniences of stored procedures compromise the computational capability of the database and give programmers plenty of trouble: considerable development difficulty, inefficient development, and awkward maintenance. They also undermine the implementation of business logic, the achievement of complex computational goals, and ultimately smart business decisions. So how can the stored procedure be empowered?

esProc is a database computing script built specifically for complex computational goals. With computational performance as good as, or better than, that of stored procedures, esProc additionally provides an intuitive grid style, step-by-step computation, professional debugging features, agile syntax, a complete computational system, and seamless support for computations that span multiple databases.

esProc is a scripting tool with a grid style: computational logic can be laid out conveniently in a 2D grid, so a business algorithm can be expressed in the language more easily. esProc advocates step-by-step computation. Concretely, it lets users decompose a complex goal into several simple steps in its grid, and achieve the complex goal by accomplishing those simple steps. Built around this step-by-step design, esProc offers a genuinely practical debugger, including breakpoints, stepping, run to cursor, start, and stop. Unlike the limited debugging available for SQL stored procedures, esProc can debug the basic steps directly, with no need to build special intermediate tables, and a breakpoint can be set at any position without altering the code.

As a big data computing script, esProc supports a true set data type. A member of a set can be a value of any simple data type, a record, or another set. esProc supports ordered sets, which means users can access members by position and perform sequence-number-related computations such as ranking, sorting, year-over-year comparison, and link relative ratio comparison. Sets of sets can represent equal grouping, alignment grouping, and enumeration grouping. In addition, users can operate on individual records the same way they would operate on objects. With its agile syntax, esProc can express complex computations more easily, for example computing relative positions within multi-level groupings, or grouping and summarizing by a specified set.
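For instance, ordered access to members makes sequence-number computations a one-step affair. A minimal Python sketch of the idea (not esProc syntax; the monthly figures are hypothetical):

# Hypothetical monthly sales, already in chronological order.
monthly_sales = [120, 150, 130, 180, 200]

# Link relative ratio: each month's sales divided by the previous month's.
link_relative_ratio = [None] + [
    cur / prev for prev, cur in zip(monthly_sales, monthly_sales[1:])
]
# [None, 1.25, 0.8666..., 1.3846..., 1.1111...]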

esProc can empower stored procedures, boosting the computational capability of the database, reducing development difficulty, improving development efficiency, and facilitating code maintenance and migration. It implements complex data algorithms and business logic with ease.

Finally, let's return to the example used above to illustrate the inconveniences of stored procedures and see how the same computation breaks down into simple, inspectable steps:
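The original esProc grid is not reproduced here. As a stand-in, the following Python sketch shows the kind of step-by-step decomposition described above, using the same hypothetical stateSales columns (state, product, amount); it illustrates the approach, not esProc syntax.

from functools import reduce

def top10_products_in_every_state(rows):
    """rows: iterable of (state, product, amount) tuples from stateSales."""
    # Step 1: filter duplicate (state, product) pairs, keeping the first row seen.
    seen, distinct = set(), []
    for state, product, amount in rows:
        if (state, product) not in seen:
            seen.add((state, product))
            distinct.append((state, product, amount))

    # Step 2: group the distinct rows by state.
    by_state = {}
    for state, product, amount in distinct:
        by_state.setdefault(state, []).append((product, amount))

    # Step 3: take each state's top 10 products by amount, kept as real sets.
    top10_per_state = [
        {p for p, _ in sorted(items, key=lambda x: -x[1])[:10]}
        for items in by_state.values()
    ]

    # Step 4: products in the top 10 of every state are simply the intersection.
    return reduce(set.intersection, top10_per_state)

Each step produces a named intermediate result, which is what makes it possible to inspect or debug the computation one step at a time instead of through one opaque statement.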

More Stories By Jessica Qiu

Jessica Qiu is the editor of Raqsoft. She provides press releases for data computation and data analytics.
