Search results for: streaming analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 418

358 Advancing Sustainable Futures: A Study on Low Carbon Ventures

Authors: Gaurav Kumar Sinha

Abstract:

As the world grapples with climate challenges, this study highlights the instrumental role of AWS services in amplifying the impact of low carbon ventures (LCVs). Their ability to harness the cloud, data analytics, and scalable infrastructure offered by AWS empowers LCVs to innovate, scale, and drive meaningful change in the quest for a sustainable future. This study serves as a rallying cry, urging stakeholders to recognize, embrace, and maximize the potential of AWS-powered solutions in advancing sustainable and resilient global initiatives.

Keywords: low carbon ventures, sustainability solutions, AWS services, data analytics

Procedia PDF Downloads 35
357 Data Analytics of Electronic Medical Records Shows Age-Related Differences in Diagnosis of Coronary Artery Disease

Authors: Maryam Panahiazar, Andrew M. Bishara, Yorick Chern, Roohallah Alizadehsani, Dexter Hadleye, Ramin E. Beygui

Abstract:

Early detection plays a crucial role in enhancing outcomes for patients with coronary artery disease (CAD). We utilized a big data analytics platform on ~23,000 patients with CAD from a total of 960,129 UCSF patients over 8 years. We traced the patients from their first encounter with a physician to the diagnosis and treatment of CAD. Characteristics such as demographic information, comorbidities, vital signs, lab tests, medications, and procedures are included. There are statistically significant gender-based differences in patients younger than 60 years old in the time from the first physician encounter to coronary artery bypass grafting (CABG), with a p-value of 0.03. There are no significant differences for patients between 60 and 80 years old (p-value = 0.8) or older than 80 (p-value = 0.4) at a 95% confidence level. This recognition should prompt significant changes to the guidelines for referring patients for diagnostic tests expeditiously, improving outcomes by avoiding delays in treatment.
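
A minimal illustration of the kind of stratified comparison described above (a sketch, not the authors' analysis): a non-parametric two-sample test of time from first encounter to CABG between genders within one age stratum, using hypothetical durations.

```python
# Hedged sketch: compare time from first physician encounter to CABG by gender
# in the under-60 stratum. The duration values below are hypothetical.
import numpy as np
from scipy import stats

days_to_cabg_female = np.array([412, 380, 455, 390, 510, 470, 365, 440])
days_to_cabg_male = np.array([300, 320, 290, 355, 310, 340, 305, 330])

# Mann-Whitney U test; p < 0.05 would indicate a significant gender-based difference.
stat, p_value = stats.mannwhitneyu(days_to_cabg_female, days_to_cabg_male,
                                   alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```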

Keywords: electronic medical records, coronary artery disease, data analytics, young women

Procedia PDF Downloads 123
356 The Impact of Temporal Impairment on Quality of Experience (QoE) in Video Streaming: A No Reference (NR) Subjective and Objective Study

Authors: Muhammad Arslan Usman, Muhammad Rehan Usman, Soo Young Shin

Abstract:

Live video streaming is one of the most widely used services among end users, yet it is a big challenge for network operators in terms of quality. The only way to provide an excellent Quality of Experience (QoE) to end users is continuous monitoring of the live video stream. For this purpose, several objective algorithms are available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms. As human perception is considered the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on end users. In this paper, we conduct subjective evaluation tests on a set of video sequences containing a temporal impairment known as frame freezing. Frame freezing can arise from transmission errors as well as hardware errors and results in the loss of video frames at the receiving side of a transmission system. In our subjective tests, we evaluate videos that contain a single freezing event as well as videos that contain multiple freezing events. We record our subjective test results for all the videos in order to provide a comparison of the available No Reference (NR) objective algorithms. Finally, we report the performance of the NR algorithms used for objective evaluation of the videos and suggest the algorithm that performs best. The outcome of this study shows the importance of QoE and its effect on human perception. The results of the subjective evaluation can serve to validate objective algorithms.
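
As a simple illustration of no-reference freeze detection (a sketch, not one of the NR algorithms evaluated in the paper), consecutive frames whose pixel difference is nearly zero can be flagged as frozen:

```python
# Hedged NR sketch: flag frames that are (nearly) identical to the previous one.
import cv2
import numpy as np

def detect_freezes(video_path, diff_threshold=0.5):
    """Return indices of frames whose mean absolute difference to the previous frame is tiny."""
    cap = cv2.VideoCapture(video_path)
    frozen, prev, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None and float(np.mean(cv2.absdiff(gray, prev))) < diff_threshold:
            frozen.append(idx)
        prev, idx = gray, idx + 1
    cap.release()
    return frozen

# Example with a hypothetical file name:
# print(detect_freezes("stream_with_freezing.mp4"))
```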

Keywords: objective evaluation, subjective evaluation, quality of experience (QoE), video quality assessment (VQA)

Procedia PDF Downloads 579
355 Visual Analytics of Higher Order Information for Trajectory Datasets

Authors: Ye Wang, Ickjai Lee

Abstract:

Due to the widespread use of mobile sensing, there is a strong need to handle the trails of moving objects, i.e., trajectories. This paper proposes three visual analytic approaches for higher order information of trajectory data sets based on the higher order Voronoi diagram data structure. The proposed approaches reveal geometrical, topological, and directional information. Experimental results demonstrate the applicability and usefulness of the three proposed approaches.
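
To make the underlying data structure concrete, the sketch below (an illustration under our own assumptions, not the authors' system) assigns query points to order-k Voronoi cells by brute force: two points belong to the same order-k cell when they share the same set of k nearest sites.

```python
# Hedged sketch of order-k (higher order) Voronoi membership via brute force.
import numpy as np

def order_k_cell(query, sites, k):
    """Return the sorted indices of the k sites nearest to a query point."""
    distances = np.linalg.norm(sites - query, axis=1)
    return tuple(sorted(np.argsort(distances)[:k]))

rng = np.random.default_rng(0)
sites = rng.random((20, 2))      # e.g., significant locations extracted from trajectories
queries = rng.random((5, 2))
for q in queries:
    print(q.round(2), "-> order-3 cell", order_k_cell(q, sites, k=3))
```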

Keywords: visual analytics, higher order information, trajectory datasets, spatio-temporal data

Procedia PDF Downloads 382
354 Structured Access Control Mechanism for Mesh-based P2P Live Streaming Systems

Authors: Chuan-Ching Sue, Kai-Chun Chuang

Abstract:

Peer-to-Peer (P2P) live streaming systems still face a challenge when thousands of new peers want to join the system within a short time, a situation called a flash crowd, in which most new peers suffer a long start-up delay. Recent studies have proposed a slot-based user access control mechanism, which periodically admits a certain number of new peers into the system, and a user batch join mechanism, which divides new peers into several tree structures of fixed size. However, with the slot-based mechanism it is difficult to accurately determine the optimal time slot length, and with the batch join mechanism it is hard to determine the optimal tree size. In this paper, we propose a structured access control (SAC) mechanism, which organizes new peers into a multi-layer mesh structure. The SAC mechanism builds new peer connections layer by layer to replace periodic access control and determines the number of peers in each layer according to the system’s remaining upload bandwidth and the average video rate. Furthermore, we propose an analytical model to represent the growth of the system when the upload bandwidth is utilized efficiently. The analytical result shows a trend in system growth similar to that of the SAC mechanism. Additionally, extensive simulations show that the SAC mechanism outperforms the two previously proposed methods in terms of system growth and start-up delay.
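
The sketch below illustrates one possible reading of the layer-sizing rule (our assumption, not the paper's exact formula): the next layer admits as many peers as the remaining upload bandwidth can serve at the average video rate, and the admitted peers' upload capacity then serves the layer after it.

```python
# Hedged sketch of layer-by-layer admission sizing for a flash crowd.
def sac_layer_sizes(server_upload_kbps, peer_upload_kbps, video_rate_kbps, new_peers):
    layers, serving_capacity, admitted = [], server_upload_kbps, 0
    while admitted < new_peers:
        layer = min(int(serving_capacity // video_rate_kbps), new_peers - admitted)
        if layer == 0:
            break  # no capacity left to admit another layer
        layers.append(layer)
        admitted += layer
        serving_capacity = layer * peer_upload_kbps  # next layer is served by this one
    return layers

# Example: 20 Mbps server upload, 1 Mbps upload per peer, 500 kbps video, 300 arrivals.
print(sac_layer_sizes(20000, 1000, 500, new_peers=300))   # e.g. [40, 80, 160, 20]
```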

Keywords: peer-to-peer, live video streaming system, flash crowd, start-up delay, access control

Procedia PDF Downloads 291
353 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries

Authors: Gaurav Kumar Sinha

Abstract:

In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
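
As a cloud-agnostic illustration of one analysis step mentioned above (anomaly detection over energy consumption data), the sketch below flags unusual hourly readings with a rolling z-score; wiring it to specific AWS data sources is assumed and not shown.

```python
# Hedged sketch: rolling z-score anomaly detection on hypothetical hourly energy data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
load_kwh = pd.Series(100 + 10 * rng.standard_normal(24 * 30))  # 30 days of hourly readings
load_kwh.iloc[200] = 180                                        # injected anomaly

rolling_mean = load_kwh.rolling(window=24, min_periods=24).mean()
rolling_std = load_kwh.rolling(window=24, min_periods=24).std()
z_score = (load_kwh - rolling_mean) / rolling_std

print(load_kwh[z_score.abs() > 3])  # readings more than 3 sigma from the daily pattern
```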

Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency

Procedia PDF Downloads 37
352 The Digital Desert in Global Business: Digital Analytics as an Oasis of Hope for Sub-Saharan Africa

Authors: David Amoah Oduro

Abstract:

In the ever-evolving terrain of international business, a profound revolution is underway, guided by the swift integration and advancement of disruptive technologies like digital analytics. In today's international business landscape, where competition is fierce, and decisions are data-driven, the essence of this paper lies in offering a tangible roadmap for practitioners. It is a guide that bridges the chasm between theory and actionable insights, helping businesses, investors, and entrepreneurs navigate the complexities of international expansion into sub-Saharan Africa. This practitioner paper distils essential insights, methodologies, and actionable recommendations for businesses seeking to leverage digital analytics in their pursuit of market entry and expansion across the African continent. What sets this paper apart is its unwavering focus on a region ripe with potential: sub-Saharan Africa. The adoption and adaptation of digital analytics are not mere luxuries but essential strategic tools for evaluating countries and entering markets within this dynamic region. With the spotlight firmly fixed on sub-Saharan Africa, the aim is to provide a compelling resource to guide practitioners in their quest to unearth the vast opportunities hidden within sub-Saharan Africa's digital desert. The paper illuminates the pivotal role of digital analytics in providing a data-driven foundation for market entry decisions. It highlights the ability to uncover market trends, consumer behavior, and competitive landscapes. By understanding Africa's incredible diversity, the paper underscores the importance of tailoring market entry strategies to account for unique cultural, economic, and regulatory factors. For practitioners, this paper offers a set of actionable recommendations, including the creation of cross-functional teams, the integration of local expertise, and the cultivation of long-term partnerships to ensure sustainable market entry success. It advocates for a commitment to continuous learning and flexibility in adapting strategies as the African market evolves. This paper represents an invaluable resource for businesses, investors, and entrepreneurs who are keen on unlocking the potential of digital analytics for informed market entry in Africa. It serves as a guiding light, equipping practitioners with the essential tools and insights needed to thrive in this dynamic and diverse continent. With these key insights, methodologies, and recommendations, this paper is a roadmap to prosperous and sustainable market entry in Africa. It is vital for anyone looking to harness the transformational potential of digital analytics to create prosperous and sustainable ventures in a region brimming with promise. In the ever-advancing digital age, this practitioner paper becomes a lodestar, guiding businesses and visionaries toward success amidst the unique challenges and rewards of sub-Saharan Africa's international business landscape.

Keywords: global analytics, digital analytics, sub-Saharan Africa, data analytics

Procedia PDF Downloads 37
351 Lamb Wave-Based Blood Coagulation Measurement System Using Citrated Plasma

Authors: Hyunjoo Choi, Jeonghun Nam, Chae Seung Lim

Abstract:

Acoustomicrofluidics has gained much attention for clinical and biological applications due to advantages such as noninvasiveness and easy integration with other miniaturized systems. However, a limitation of acoustomicrofluidics is the complicated and costly fabrication process of the electrodes. In this study, we propose a low-cost, lithography-free device that uses Lamb waves for blood analysis. Using a Lamb wave, calcium-ion-removed blood plasma and coagulation reagents can be rapidly mixed for a blood coagulation test. As coagulation proceeds, the viscosity of the sample increases, and this change can be monitored via the internal acoustic streaming of microparticles suspended in the sample droplet. The time at which the acoustic streaming of the particles stops due to the viscosity increase is defined as the coagulation time. With the addition of calcium ions at 0-25 mM, the coagulation time was measured and compared with the conventional index for blood coagulation analysis, prothrombin time, with which it was highly correlated (correlation coefficient of 0.94). Therefore, our simple and cost-effective Lamb wave-based blood analysis device has strong potential to be utilized in clinical settings.

Keywords: acoustomicrofluidics, blood analysis, coagulation, lamb wave

Procedia PDF Downloads 313
350 "Exploring the Intersection of Accounting, Business, and Economics: Bridging Theory and Practice for Sustainable Growth

Authors: Stephen Acheampong Amoafoh

Abstract:

In today's dynamic economic landscape, businesses face multifaceted challenges that demand strategic foresight and informed decision-making. This abstract explores the pivotal role of financial analytics in driving business performance amidst evolving market conditions. By integrating accounting principles with economic insights, organizations can harness the power of data-driven strategies to optimize resource allocation, mitigate risks, and capitalize on emerging opportunities. This presentation will delve into the practical applications of financial analytics across various sectors, highlighting case studies and empirical evidence to underscore its efficacy in enhancing operational efficiency and fostering sustainable growth. From predictive modeling to performance benchmarking, attendees will gain invaluable insights into leveraging advanced analytics tools to drive profitability, streamline processes, and adapt to changing market dynamics. Moreover, this abstract will address the ethical considerations inherent in financial analytics, emphasizing the importance of transparency, integrity, and accountability in data-driven decision-making. By fostering a culture of ethical conduct and responsible stewardship, organizations can build trust with stakeholders and safeguard their long-term viability in an increasingly interconnected global economy. Ultimately, this abstract aims to stimulate dialogue and collaboration among scholars, practitioners, and policymakers, fostering knowledge exchange and innovation in the realms of accounting, business, and economics. Through interdisciplinary insights and actionable recommendations, participants will be equipped to navigate the complexities of today's business environment and seize opportunities for sustainable success.

Keywords: financial analytics, business performance, data-driven strategies, sustainable growth

Procedia PDF Downloads 17
349 A Basic Metric Model: Foundation for an Evidence-Based HRM System

Authors: K. M. Anusha, R. Krishnaveni

Abstract:

More than a decade into the 21st century, the human resources paradigm can be seen evolving with a strategic gene induced into it. A radical shift is under way as the corporate sector calls on its HR teams to become strategic rather than administrative. This transition requires the metrics employed by these HR teams to be not just operationally reactive but aligned with evidence-based strategic thinking. Recognizing the growing need for a prescriptive metric model for effective HR analytics, this study designs a conceptual framework for a basic metric model that can assist IT-HRM professionals in transitioning to a practice of evidence-based decision-making to enhance organizational performance.

Keywords: metric model, evidence based HR, HR analytics, strategic HR practices, IT sector

Procedia PDF Downloads 373
348 Studies on the Teaching Pedagogy and Effectiveness of Multi-Channel Storytelling for Social Media, Cinema, Games, and Streaming Platforms: Case Studies of Squid Game

Authors: Chan Ka Lok Sobel

Abstract:

The rapid evolution of digital media platforms has given rise to new forms of narrative engagement, particularly through multi-channel storytelling. This research focuses on exploring the teaching pedagogy and effectiveness of multi-channel storytelling for social media, cinema, games, and streaming platforms. The study employs case studies of the popular series "Squid Game" to investigate the diverse pedagogical approaches and strategies used in teaching multi-channel storytelling. Through qualitative research methods, including interviews, surveys, and content analysis, the research assesses the effectiveness of these approaches in terms of student engagement, knowledge acquisition, critical thinking skills, and the development of digital literacy. The findings contribute to understanding best practices for incorporating multi-channel storytelling into educational contexts and enhancing learning outcomes in the digital media landscape.

Keywords: digital literacy, game-based learning, artificial intelligence, animation production, educational technology

Procedia PDF Downloads 55
347 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives into the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by the IoT devices is analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data can differentiate them from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions. The results of the research provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of it, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts which define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: data analytics, smart cities, competitive advantage, internet of things

Procedia PDF Downloads 82
346 Analyzing Competitive Advantage of Internet of Things and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by the IoT devices is analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data can differentiate them from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions. The results of the research contribution provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of it, can create a competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts which define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: internet of things, data analytics, smart cities, competitive advantage

Procedia PDF Downloads 70
345 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG

Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna

Abstract:

The proposed healthcare system enables quadriplegic patients and people with severe motor disabilities to send commands to electronic devices and to monitor their vitals. The growth of the Brain-Computer Interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A Brain-Computer Interface is capable of reading an individual's brainwaves and analysing them to obtain meaningful data. This processed data can be used to assist people with speech disorders and, in some cases, people with limited locomotion to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information, such as heart rate, blood pressure, ECG, and body temperature, is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on an Intel Edison system-on-chip (SoC). Patient metrics are displayed via the Intel IoT Analytics cloud service.
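
The command-mapping step can be pictured with the hedged sketch below: a window of EEG samples is classified into one of the pre-defined commands by its dominant frequency band. The band-to-command mapping, thresholds, and 128 Hz sampling rate are illustrative assumptions; acquisition via the Emotiv SDK and the Intel IoT Analytics upload are not shown.

```python
# Hedged sketch of mapping an EEG window to a pre-defined command by band power.
import numpy as np

SAMPLING_HZ = 128                                                       # assumed EPOC rate
BANDS = {"rest": (8, 12), "command_a": (13, 30), "command_b": (4, 7)}   # assumed mapping

def classify_window(samples):
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLING_HZ)
    band_power = {name: spectrum[(freqs >= lo) & (freqs <= hi)].sum()
                  for name, (lo, hi) in BANDS.items()}
    return max(band_power, key=band_power.get)

window = np.random.default_rng(2).standard_normal(SAMPLING_HZ * 2)  # 2 s of fake EEG
print(classify_window(window))
```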

Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram

Procedia PDF Downloads 162
344 Efficient Storage and Intelligent Retrieval of Multimedia Streams Using H.265

Authors: S. Sarumathi, C. Deepadharani, Garimella Archana, S. Dakshayani, D. Logeshwaran, D. Jayakumar, Vijayarangan Natarajan

Abstract:

The need of the hour for customers who use a dial-up or low-bandwidth broadband connection for their internet services is access to HD video data. This can be achieved by developing a new video format based on H.265, the latest video codec standard, developed by the ISO/IEC Moving Picture Experts Group (MPEG) and the ITU-T Video Coding Experts Group (VCEG) in April 2013. This new standard for video compression has the potential to deliver higher performance than earlier standards such as H.264/AVC. In comparison with H.264, HEVC offers a clearer, higher quality image at half the original bitrate. At this lower bitrate, it is possible to transmit high definition video over low-bandwidth links. It doubles the data compression ratio, supporting 8K Ultra HD and resolutions up to 8192×4320. In the proposed model, we design a new video format which supports the H.265 standard. Major application areas in the coming years include enhancements to the performance of digital television services such as Tata Sky and Sun Direct, Blu-ray discs, mobile video, video conferencing, and internet and live video streaming.
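
As a minimal sketch of the kind of re-encoding this enables (assuming an ffmpeg build with libx265 is installed; the CRF and preset values are illustrative, not the paper's settings):

```python
# Hedged sketch: transcode a source file to H.265/HEVC for low-bandwidth delivery.
import subprocess

def encode_hevc(src_path, dst_path, crf=28):
    cmd = ["ffmpeg", "-y", "-i", src_path,
           "-c:v", "libx265", "-crf", str(crf), "-preset", "medium",
           "-c:a", "copy", dst_path]
    subprocess.run(cmd, check=True)

# Example with hypothetical file names:
# encode_hevc("input_h264.mp4", "output_hevc.mp4")
```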

Keywords: access HD video, H.265 video standard, high performance, high quality image, low bandwidth, new video format, video streaming applications

Procedia PDF Downloads 331
343 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain

Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel

Abstract:

The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. At the same time, supply chain risk management is crucial for practitioners, as risk can potentially disrupt supply chain operations. Predicting and addressing the risk caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of the research is to explore the application of big data predictive analytics in successfully mitigating supply chain social risk and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic, and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and a survey. Later, we use a case study to illustrate the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that the company can predict various social issues through big data predictive analytics and mitigate the social risk. We also discuss the implications of this research for the body of knowledge and practice.

Keywords: big data, sustainability, supply chain social sustainability, social risk, case study

Procedia PDF Downloads 370
342 An Empirical Study of the Impacts of Big Data on Firm Performance

Authors: Thuan Nguyen

Abstract:

In the present time, data is to the data-driven, knowledge-based economy what oil was to the industrial age hundreds of years ago. Data is everywhere, in vast volumes! Big data analytics is expected to help firms not only improve performance efficiently but also completely transform how they run their business. However, employing the emergent technology successfully is not easy, and assessing the role of big data in improving firm performance is even harder. There has been a lack of studies examining the impacts of big data analytics on organizational performance. This study aimed to fill that gap. The present study suggested using firms’ intellectual capital as a proxy for big data in evaluating its impact on organizational performance. The present study employed the Value Added Intellectual Coefficient method to measure firm intellectual capital via its three main components: human capital efficiency, structural capital efficiency, and capital employed efficiency, and then used the structural equation modeling technique to model the data and test the models. The financial fundamentals and market data of 100 randomly selected publicly listed firms were collected. The results of the tests showed that only human capital efficiency had a significant positive impact on firm profitability, which highlights the prominent human role in the impact of big data technology.
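
For readers unfamiliar with the proxy used, the sketch below computes the Value Added Intellectual Coefficient components in Pulic's standard formulation; the figures are hypothetical and not drawn from the study's sample.

```python
# Hedged sketch of the VAIC components (Pulic's formulation) on made-up figures.
def vaic(value_added, human_capital, capital_employed):
    hce = value_added / human_capital                  # human capital efficiency
    sce = (value_added - human_capital) / value_added  # structural capital efficiency
    cee = value_added / capital_employed               # capital employed efficiency
    return hce, sce, cee, hce + sce + cee

hce, sce, cee, total = vaic(value_added=50.0, human_capital=20.0, capital_employed=200.0)
print(f"HCE={hce:.2f}, SCE={sce:.2f}, CEE={cee:.2f}, VAIC={total:.2f}")
```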

Keywords: big data, big data analytics, intellectual capital, organizational performance, value added intellectual coefficient

Procedia PDF Downloads 212
341 Droplet Impact on a High Frequency Vibrating Surface

Authors: Maryam Ebrahimiazar, Parsia Mohammadshahi, Amirreza Amighi, Nasser Ashgriz

Abstract:

Ultrasonic atomization is used to generate micron-size aerosols. In this work, aerosol formation by the atomization of a parent droplet dripping from a capillary needle onto the surface of a Teflon-coated piezoelectric transducer vibrating at 2.5 MHz is studied, and the different steps of atomization are categorized. After the droplet impacts the piezoelectric surface, acoustic streaming deforms the droplet into a fountain shape. This fountain soon collapses and forms a liquid layer. The breakup of the liquid layer generates both large drops (on the order of 100 microns) and small drops (a few microns). Next, the residual drops from the liquid layer start to be atomized, generating droplets of a few microns in size. The high velocity and explosive aerosol formation in this step are better explained in terms of cavitation theory. However, the combination of capillary waves and cavitation seems to be responsible for few-micron droplet generation. The current study focuses on both qualitative and quantitative aspects of fountain formation for both ethyl alcohol and water. Even though the general steps of atomization are the same for both liquids, the quantitative results indicate noticeable differences between them.
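
A back-of-the-envelope check of the "few micron" droplet scale can be made with Lang's correlation for ultrasonic atomization, d = 0.34·(8πσ/(ρf²))^(1/3); the sketch below applies it to water and ethyl alcohol at 2.5 MHz as an illustrative estimate, not a calculation from the paper.

```python
# Hedged sketch: Lang's correlation for the characteristic droplet diameter.
import math

def lang_droplet_diameter(sigma_n_per_m, rho_kg_per_m3, freq_hz):
    return 0.34 * (8 * math.pi * sigma_n_per_m / (rho_kg_per_m3 * freq_hz ** 2)) ** (1.0 / 3.0)

d_water = lang_droplet_diameter(0.072, 1000.0, 2.5e6)   # water at room temperature
d_ethanol = lang_droplet_diameter(0.022, 789.0, 2.5e6)  # ethyl alcohol
print(f"water: {d_water * 1e6:.1f} um, ethanol: {d_ethanol * 1e6:.1f} um")  # a few microns
```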

Keywords: droplet breakup, ultrasonic atomization, acoustic streaming, droplet oscillation

Procedia PDF Downloads 145
340 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, this study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors associated, for example, with sociodemographics, habits, and activities. Some are intentionally included to obtain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective), as in this study, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of the observations, such as bootstrap resampling with an appropriate sample size.
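
The co-occurrence structure itself is easy to picture; the hedged sketch below builds a small co-occurrence graph from toy "baskets" of predictor=value items and ranks item pairs by how often they co-occur. It illustrates the data structure only, not the platform used in the study.

```python
# Hedged sketch: build a co-occurrence graph from baskets and rank frequent pairs.
from collections import Counter
from itertools import combinations
import networkx as nx

baskets = [
    {"smoker=yes", "bmi=high", "metsyn=yes"},
    {"smoker=yes", "bmi=high", "metsyn=yes"},
    {"smoker=no", "bmi=normal", "metsyn=no"},
    {"smoker=yes", "bmi=high", "vaccinated=yes", "metsyn=yes"},
]

graph = nx.Graph()
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):   # items in a basket are fully connected
        pair_counts[(a, b)] += 1
        graph.add_edge(a, b)

for (a, b), freq in pair_counts.most_common(3):    # candidate rules, ranked by frequency
    print(freq, a, "--", b)
```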

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 244
339 Field-Programmable Gate Arrays Based High-Efficiency Oriented Fast and Rotated Binary Robust Independent Elementary Feature Extraction Method Using Feature Zone Strategy

Authors: Huang Bai-Cheng

Abstract:

When deploying the Oriented FAST and Rotated BRIEF (Binary Robust Independent Elementary Features) (ORB) extraction algorithm on field-programmable gate arrays (FPGAs), access to global storage for the 31×31 pixel patches around the features becomes the bottleneck of system efficiency. Therefore, a feature zone strategy is proposed. Zones are searched as features are detected. Pixels around the feature zones are extracted from global memory and distributed into patches corresponding to the feature coordinates. The proposed FPGA structure targets a Xilinx Zynq UltraScale+ series development board, and multiple datasets are tested. Compared with the streaming pixel patch extraction method, the proposed architecture obtains at least a two-times acceleration while consuming an extra 3.82% of flip-flops (FFs) and 7.78% of look-up tables (LUTs). Compared with the non-streaming one, the proposed architecture saves 22.3% of LUTs and 1.82% of FFs, at the cost of only 0.2 ms of latency and a frame-rate drop of 1. Compared with related works, the proposed strategy and hardware architecture strike a balance between FPGA resources and performance.
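
For reference, the operation being accelerated corresponds in software to ORB keypoint detection plus descriptor extraction over the patches around each keypoint; the hedged OpenCV sketch below shows that baseline on a hypothetical test image, not the hardware implementation.

```python
# Hedged software reference for ORB feature extraction (OpenCV), not the FPGA design.
import cv2

img = cv2.imread("test_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
if img is None:
    raise FileNotFoundError("test_frame.png not found")

orb = cv2.ORB_create(nfeatures=500)            # descriptors are computed from 31x31 patches
keypoints, descriptors = orb.detectAndCompute(img, None)
print(len(keypoints), "keypoints,",
      None if descriptors is None else descriptors.shape, "descriptor matrix")
```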

Keywords: feature extraction, real-time, ORB, FPGA implementation

Procedia PDF Downloads 89
338 Achieving High Renewable Energy Penetration in Western Australia Using Data Digitisation and Machine Learning

Authors: A. D. Tayal

Abstract:

The energy industry is undergoing significant disruption. This research outlines that, whilst challenging, this disruption is also an emerging opportunity for electricity utilities. One such opportunity is leveraging developments in data analytics and machine learning. As the uptake of renewable energy technologies and complementary control systems increases, electricity grids will likely transform towards dense microgrids with a high penetration of renewable generation sources, rich in network and customer data, and linked through intelligent, wireless communications. Data digitisation and analytics have already impacted numerous industries, and their influence on the energy sector is growing as computational capabilities increase to manage big data and as machines develop algorithms to solve the energy challenges of the future. The objective of this paper is to address how far the uptake of renewable technologies can go given the constraints of existing grid infrastructure, and to provide a qualitative assessment of how higher levels of renewable energy penetration can be facilitated by incorporating even broader technological advances in the fields of data analytics and machine learning. Western Australia is used as a contextualised case study, given its abundant and diverse renewable resources (solar, wind, biomass, and wave) and isolated networks, which make high penetration of renewables a feasible target for policy makers over the coming decades.

Keywords: data, innovation, renewable, solar

Procedia PDF Downloads 328
337 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change is causing change in all aspects of society. While the expansion of renewable energies proceeds, industry has not been convinced by general studies of the potential of demand-side management to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is demonstrated on the strategic decision problem of integrating renewable energies into industrial energy management. This question arises from considering a change of the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data model, analysis, simulation, and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.
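
The core of the contract decision can be illustrated with a hedged sketch: compare the cost of a measured load profile under a flat tariff versus hourly spot prices (all numbers below are made up for illustration).

```python
# Hedged sketch: flat-rate vs. spot-price cost comparison for one day of load data.
import numpy as np

rng = np.random.default_rng(3)
load_kwh = 50 + 20 * rng.random(24)          # hypothetical hourly load of a machine
spot_price = 0.10 + 0.08 * rng.random(24)    # hypothetical hourly spot prices, EUR/kWh
flat_price = 0.16                            # hypothetical standard rate, EUR/kWh

cost_flat = float(load_kwh.sum() * flat_price)
cost_spot = float(load_kwh @ spot_price)
print(f"flat tariff: {cost_flat:.2f} EUR, spot market: {cost_spot:.2f} EUR")
```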

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 412
336 Estimation of Service Quality and Its Impact on Market Share Using Business Analytics

Authors: Haritha Saranga

Abstract:

Service quality has become an important driver of competition in manufacturing industries of late, as many products are sold in conjunction with service offerings. With increases in computational power and data capture capabilities, it has become possible to analyze and estimate various aspects of service quality at a granular level and determine their impact on business performance. In the current study, dealer-level, model-wise warranty data from one of the top two-wheeler manufacturers in India is used to estimate the service quality of individual dealers and its impact on warranty-related costs and sales performance. We collected primary data on warranty costs, number of complaints, monthly sales, types of quality upgrades, etc. from the two-wheeler automaker. In addition, we gathered secondary data, such as petrol and diesel prices and the geographic and climatic conditions of the regions where the dealers are located, to control for customer usage patterns. We analyze this primary and secondary data with a variety of analytics tools such as Auto-Regressive Integrated Moving Average (ARIMA), Seasonal ARIMA, and ARIMAX. Study results, after controlling for a variety of factors such as the size, age, and region of the dealership and customer usage patterns, show that service quality does influence sales of the products in a significant manner. A more nuanced analysis reveals the dynamics between product quality and service quality, and how their interaction affects sales performance in the Indian two-wheeler industry context. We also provide various managerial insights using descriptive analytics and build a model that can provide sales projections using a variety of forecasting techniques.
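
A minimal sketch of the named modeling step, assuming synthetic monthly sales with a service-quality covariate (the model orders and data are illustrative, not the study's):

```python
# Hedged sketch: seasonal ARIMA with an exogenous service-quality regressor (ARIMAX/SARIMAX).
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
n = 48
service_quality = pd.Series(rng.random(n))          # e.g., a complaint-rate proxy per month
sales = pd.Series(200 + 30 * np.sin(np.arange(n) * 2 * np.pi / 12)
                  - 50 * service_quality + 5 * rng.standard_normal(n))

model = SARIMAX(sales, exog=service_quality, order=(1, 0, 1), seasonal_order=(1, 0, 0, 12))
result = model.fit(disp=False)
# Reuse recent covariate values as a stand-in for future ones in this toy forecast.
print(result.forecast(steps=6, exog=service_quality.iloc[-6:]).round(1))
```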

Keywords: service quality, product quality, automobile industry, business analytics, auto-regressive integrated moving average

Procedia PDF Downloads 99
335 Sensing to Respond & Recover in Emergency

Authors: Alok Kumar, Raviraj Patil

Abstract:

The ability to respond to an incident of a disastrous event in a vulnerable area is a crucial aspect of emergency management. The ability to constantly predict the likelihood of an event, along with its severity in an area, and to react to those significant events which are likely to have a high impact allows the authorities to respond by allocating resources optimally and in a timely manner. It provides measuring, monitoring, and modeling facilities that integrate the underlying systems into one solution to improve operational efficiency, planning, and coordination. We were particularly involved in this innovative incubation work on the current state of research and development in collaboration technologies and systems for disasters.

Keywords: predictive analytics, advanced analytics, area flood likelihood model, area flood severity model, level of impact model, mortality score, economic loss score, resource allocation, crew allocation

Procedia PDF Downloads 290
334 Intellectual Property in Digital Environment

Authors: Balamurugan L.

Abstract:

Artificial intelligence (AI) and its applications in intellectual property rights (IPR) have been growing significantly in recent years. In the last couple of years, AI tools for patent research and patent analytics have stabilized in terms of the accuracy of references and the representation of identified patent insights. However, AI tools for patent prosecution and patent litigation are still at a nascent stage, and there may be significant potential if this market is explored further. Our research is primarily focused on identifying potential whitespaces and schematic algorithms to automate the patent prosecution and patent litigation processes of intellectual property. The schematic algorithms may assist leading AI tool developers in exploring such opportunities in the field of intellectual property. Our research is also focused on identifying pitfalls of AI, for example, information security and its impact on IPR, and potential remediations to sustain IPR in the digital environment.

Keywords: artificial intelligence, patent analytics, patent drafting, patent litigation, patent prosecution, patent research

Procedia PDF Downloads 42
333 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions

Authors: K. Hardy, A. Maurushat

Abstract:

Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically drawn on rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of big data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is not why people did this for the last 10 years, but why they are doing it now, whether this is undesirable, and how we can have an impact to promote change immediately. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data and how these barriers might be overcome.

Keywords: big data, open data, productivity, data governance

Procedia PDF Downloads 341
332 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing

Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh

Abstract:

Due to a lack of continual assessment or grade-related data, identifying first-year engineering students in a polytechnic education who are at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be done on the basis of entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was determined to be the strongest predictor, with an Area Under the Curve (AUC) value of 0.994. Correspondingly, its Accuracy, Precision, Recall, and F-Score were also the highest among the three classifiers. Using this Random Forest classification technique, students at risk of failure could be identified at the end of term one. They could then be assigned to a Learning Support Programme at the beginning of term two. This paper gathers the results of our findings and proposes further improvements that can be made to the model.
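
A minimal sketch of the strongest classifier reported above, trained on a hypothetical feature set (two term-one assessment scores and a study-habits score); the real features, data, and tuning are the authors'.

```python
# Hedged sketch: Random Forest on synthetic "at risk" data, scored by AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 500
X = np.column_stack([rng.normal(60, 15, n),    # continual assessment 1 score
                     rng.normal(55, 18, n),    # continual assessment 2 score
                     rng.random(n)])           # study-habits score in [0, 1]
y = ((X[:, 0] < 45) & (X[:, 2] < 0.4)).astype(int)   # synthetic "at risk" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```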

Keywords: continual assessment, predictive analytics, random forest, student psychological profile

Procedia PDF Downloads 98
331 A Formal Approach for Instructional Design Integrated with Data Visualization for Learning Analytics

Authors: Douglas A. Menezes, Isabel D. Nunes, Ulrich Schiel

Abstract:

Most Virtual Learning Environments do not provide support mechanisms for the integrated planning, construction, and follow-up of an Instructional Design supported by Learning Analytics results. The present work presents an authoring tool responsible for constructing the structure of an Instructional Design (ID) without the data being altered during the execution of the course. The visual interface presents the critical situations in this ID, serving as a support tool for course follow-up and possible improvements, which can be made during its execution or in the planning of a new edition of the course. The model for the ID is based on High-Level Petri Nets, and the visualization forms are determined by the specific kind of data generated by an e-course: a population of students generating sequentially dependent data.

Keywords: educational data visualization, high-level petri nets, instructional design, learning analytics

Procedia PDF Downloads 218
330 Syndromic Surveillance Framework Using Tweets Data Analytics

Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden

Abstract:

Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance is becoming more and more popular, aided by open platforms for collecting data and the advantages of microblogging text and mobile geolocation features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweets data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised learning classification, and N-fold cross-validation confusion matrices are given as simulation results, with an overall system recall of 85.595% achieved.
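
One way such an LDA engine can be adapted for supervised classification (a hedged reading of the pipeline, not the authors' exact system) is to use LDA topic proportions over tweet text as features for a downstream classifier, as in the toy sketch below.

```python
# Hedged sketch: LDA topic features feeding a classifier that flags influenza-related tweets.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = ["fever and cough all week, staying home",
          "flu season hitting hard in abu dhabi",
          "great brunch in dubai today",
          "traffic on the way to al ain is terrible",
          "got a flu shot, feeling sore",
          "new cafe opened near the marina"]
labels = [1, 1, 0, 0, 1, 0]   # toy labels: 1 = influenza-related

pipeline = make_pipeline(CountVectorizer(stop_words="english"),
                         LatentDirichletAllocation(n_components=2, random_state=0),
                         LogisticRegression())
pipeline.fit(tweets, labels)
print(pipeline.predict(["everyone at work has a fever"]))
```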

Keywords: syndromic surveillance, tweets, machine learning, data mining, latent Dirichlet allocation (LDA), influenza

Procedia PDF Downloads 86
329 Automated Detection of Targets and Retrieve the Corresponding Analytics Using Augmented Reality

Authors: Suvarna Kumar Gogula, Sandhya Devi Gogula, P. Chanakya

Abstract:

Augmented reality is defined as overlaying digital or computer-generated information, such as images, audio, video, and 3D models, on the real-time environment. Augmented reality can be thought of as a blend between the completely synthetic and the completely real. It provides scope in a wide range of industries, such as manufacturing, retail, gaming, advertising, and tourism, and brings out new dimensions in the modern digital world. As it overlays content, it helps users enhance their knowledge by providing content blended with the real world. In this application, we integrated augmented reality with data analytics and with the cloud, so that the virtual content is generated on the basis of the data present in the database; we used marker-based augmented reality, where every marker is stored in the database with a corresponding unique ID. This application can be used in a wide range of industries for different business processes, but in this paper, we mainly focus on the marketing industry, helping customers gain knowledge about products in the market, particularly their prices, customer feedback, quality, and other benefits. The application also provides better market strategy information for marketing managers, who obtain data about stocks, sales, customer response to the product, etc. In this paper, we also include reports from the feedback obtained from different people after the demonstration, and finally, we present the future scope of augmented reality in different business processes through integration with new technologies such as cloud, big data, and artificial intelligence.

Keywords: augmented reality, data analytics, catch room, marketing and sales

Procedia PDF Downloads 197