Search results for: streaming analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 456

246 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world’s data is generated by the financial services industry, with global non-cash transactions estimated at 708.5 billion. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so; smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to deliver a secure, decentralised marketplace for (1) data providers to list their transactional data, (2) data consumers to find and access that data, and (3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to them. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component is built on a decentralised blockchain contract, with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling individuals to participate in and manage the personal data that they generate. A proof-of-concept was developed on the Ethereum blockchain in which an individual can securely manage access to their own personal data and to that individual’s identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and Open Banking.
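
A minimal, illustrative sketch of the consent-gated market mechanism described above, written as an in-memory Python toy rather than the authors' Ethereum contract: providers list data, data subjects grant or revoke consent, and consumers can only purchase access when both a listing and consent exist. All class, method and entity names are hypothetical.

```python
# Toy model of the marketplace logic; the real platform implements this as a
# decentralised blockchain contract with an append-only lineage record.
class DataMarketplace:
    def __init__(self):
        self.listings = {}     # listing_id -> (provider, subject, price)
        self.consents = set()  # (subject, listing_id) pairs
        self.ledger = []       # append-only record, mimicking chain lineage

    def list_data(self, listing_id, provider, subject, price):
        self.listings[listing_id] = (provider, subject, price)
        self.ledger.append(("LIST", listing_id, provider))

    def grant_consent(self, subject, listing_id):
        self.consents.add((subject, listing_id))
        self.ledger.append(("CONSENT", listing_id, subject))

    def revoke_consent(self, subject, listing_id):
        self.consents.discard((subject, listing_id))
        self.ledger.append(("REVOKE", listing_id, subject))

    def purchase(self, consumer, listing_id):
        provider, subject, price = self.listings[listing_id]
        if (subject, listing_id) not in self.consents:
            raise PermissionError("data subject has not consented")
        # in the real platform, payment would be split between the
        # provider and the data subject via the smart contract
        self.ledger.append(("SALE", listing_id, consumer, price))
        return f"access granted to {consumer} for {listing_id}"

market = DataMarketplace()
market.list_data("txn-2023-001", provider="BankA", subject="alice", price=10.0)
market.grant_consent("alice", "txn-2023-001")
print(market.purchase("fintechB", "txn-2023-001"))
```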

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 74
245 Exploring Public Opinions Toward the Use of Generative Artificial Intelligence Chatbot in Higher Education: An Insight from Topic Modelling and Sentiment Analysis

Authors: Samer Muthana Sarsam, Abdul Samad Shibghatullah, Chit Su Mon, Abd Aziz Alias, Hosam Al-Samarraie

Abstract:

Generative Artificial Intelligence chatbots (GAI chatbots) have emerged as promising tools in various domains, including higher education. However, their specific role within the educational context and the level of legal support for their implementation remain unclear. Therefore, this study aims to investigate the role of Bard, a newly developed GAI chatbot, in higher education. To achieve this objective, English tweets were collected from Twitter's free streaming Application Programming Interface (API). The Latent Dirichlet Allocation (LDA) algorithm was applied to extract latent topics from the collected tweets. User sentiments, including disgust, surprise, sadness, anger, fear, joy, anticipation, and trust, as well as positive and negative sentiments, were extracted using the NRC Affect Intensity Lexicon and SentiStrength tools. This study explored the benefits, challenges, and future implications of integrating GAI chatbots in higher education. The findings shed light on the potential power of such tools, exemplified by Bard, in enhancing the learning process and providing support to students throughout their educational journey.
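
An illustrative sketch of the topic-extraction step: LDA over a small set of tweets using scikit-learn. The authors used the NRC Affect Intensity Lexicon and SentiStrength for the sentiment stage; this sketch covers only the LDA stage, and the example tweets are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "Bard helped me outline my essay structure for class",
    "Worried that chatbots make plagiarism easier in universities",
    "Used Bard to get feedback on my thesis introduction",
    "Universities need clear policies on AI chatbot use",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(tweets)  # document-term count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# print the top words of each latent topic
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```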

Keywords: generative artificial intelligence chatbots, bard, higher education, topic modelling, sentiment analysis

Procedia PDF Downloads 84
244 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Analyzing large-scale recurrent event data poses many challenges, such as memory limitations and prohibitive computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual subset. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
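
A minimal sketch of the divide-and-conquer idea, using a simple estimator (a mean) in place of the paper's parametric frailty model: split the data, estimate on each subset, then combine the subset estimates with inverse-variance weights. The data generator and weighting rule here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.exponential(scale=2.0, size=1_000_000)  # stand-in for event data

K = 10
subsets = np.array_split(rng.permutation(data), K)

estimates, weights = [], []
for s in subsets:
    theta_k = s.mean()               # subset estimate of the scale parameter
    var_k = s.var(ddof=1) / len(s)   # estimated variance of theta_k
    estimates.append(theta_k)
    weights.append(1.0 / var_k)      # inverse-variance weight

estimates, weights = np.array(estimates), np.array(weights)
combined = np.sum(weights * estimates) / np.sum(weights)

print(f"full-data estimate : {data.mean():.6f}")
print(f"combined estimate  : {combined:.6f}")  # asymptotically equivalent
```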

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 167
243 Inferring Cognitive Skill in Concept Space

Authors: Rania A. Aboalela, Javed I. Khan

Abstract:

This research presents a learning assessment theory of Cognitive Skill in Concept Space (CS2) that measures assessed knowledge in terms of the cognitive skill levels of concepts. Cognitive skill levels refer to whether a student has acquired a concept at the level of understanding, applying, analyzing, etc. The theory comprises three constructions: a graph paradigm of a semantic/ontological scheme, the concept states of the theory, and the assessment analytics, which is the process of estimating the sets of concept states at a certain skill level. A concept state indicates whether a student has already learned, is ready to learn, or is not ready to learn a concept at a certain skill level. An experiment is conducted to validate the CS2 theory.
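
A toy sketch of the concept-state idea: concepts form a prerequisite graph, and a student is ready to learn a concept only when all of its prerequisites are already learned. The concept names and graph below are hypothetical; the paper's theory additionally distinguishes skill levels (understanding, applying, analyzing) per concept.

```python
# hypothetical prerequisite graph: concept -> list of prerequisite concepts
prerequisites = {
    "recursion": ["functions"],
    "functions": ["variables"],
    "variables": [],
}

def concept_state(concept, learned):
    """Return 'learned', 'ready to learn', or 'not ready to learn'."""
    if concept in learned:
        return "learned"
    if all(p in learned for p in prerequisites[concept]):
        return "ready to learn"
    return "not ready to learn"

learned = {"variables"}  # concepts the student has already acquired
for c in ["variables", "functions", "recursion"]:
    print(c, "->", concept_state(c, learned))
```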

Keywords: cognitive skill levels, concept states, concept space, knowledge assessment theory

Procedia PDF Downloads 324
242 Analyzing Consumer Preferences and Brand Differentiation in the Notebook Market via Social Media Insights and Expert Evaluations

Authors: Mohammadreza Bakhtiari, Mehrdad Maghsoudi, Hamidreza Bakhtiari

Abstract:

This study investigates consumer behavior in the notebook computer market by integrating social media sentiment analysis with expert evaluations. The rapid evolution of the notebook industry has intensified competition among manufacturers, necessitating a deeper understanding of consumer priorities. Social media platforms, particularly Twitter, have become valuable sources for capturing real-time user feedback. In this research, sentiment analysis was performed on Twitter data gathered in the last two years, focusing on seven major notebook brands. The PyABSA framework was utilized to extract sentiments associated with various notebook components, including performance, design, battery life, and price. Expert evaluations, conducted using fuzzy logic, were incorporated to assess the impact of these sentiments on purchase behavior. To provide actionable insights, the TOPSIS method was employed to prioritize notebook features based on a combination of consumer sentiments and expert opinions. The findings consistently highlight price, display quality, and core performance components, such as RAM and CPU, as top priorities across brands. However, lower-priority features, such as webcams and cooling fans, present opportunities for manufacturers to innovate and differentiate their products. The analysis also reveals subtle but significant brand-specific variations, offering targeted insights for marketing and product development strategies. For example, Lenovo's strong performance in display quality points to a competitive edge, while Microsoft's lower ranking in battery life indicates a potential area for R&D investment. This hybrid methodology demonstrates the value of combining big data analytics with expert evaluations, offering a comprehensive framework for understanding consumer behavior in the notebook market. The study emphasizes the importance of aligning product development and marketing strategies with evolving consumer preferences, ensuring competitiveness in a dynamic market. It also underscores the potential for innovation in seemingly less important features, providing companies with opportunities to create unique selling points. By bridging the gap between consumer expectations and product offerings, this research equips manufacturers with the tools needed to remain agile in responding to market trends and enhancing customer satisfaction.
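
A compact sketch of the TOPSIS step used to prioritize notebook features. The decision matrix, criteria and weights below are invented placeholders; in the study, these values come from PyABSA sentiment scores and fuzzy expert evaluations.

```python
import numpy as np

# rows: features, columns: criteria (e.g., sentiment score, expert score)
features = ["price", "display", "battery", "webcam"]
X = np.array([
    [0.90, 0.85],
    [0.80, 0.90],
    [0.60, 0.70],
    [0.30, 0.40],
])
w = np.array([0.6, 0.4])  # criteria weights (assumed)

R = X / np.linalg.norm(X, axis=0)           # vector normalization
V = R * w                                   # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # both criteria are benefits here

d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)         # relative closeness to the ideal

for f, c in sorted(zip(features, closeness), key=lambda t: -t[1]):
    print(f"{f:8s} {c:.3f}")
```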

Keywords: consumer behavior, customer preferences, laptop industry, notebook computers, social media analytics, TOPSIS

Procedia PDF Downloads 26
241 A Case Study on the Impact of Technology Readiness in a Department of Clinical Nurses

Authors: Julie Delany

Abstract:

To thrive in today’s digital climate, it is vital that organisations adopt new technology and prepare for rising digital trends. This proves more difficult in government, where, traditionally, people lack change readiness. While individuals may have a desire to work smarter, this does not necessarily mean embracing technology. This paper discusses the rollout of an application into a small department of highly experienced nurses. The goal was both to streamline the department's workflow and to provide a platform for gathering essential business metrics. The biggest challenges were adoption and motivating the nurses to change their routines and learn new computer skills. Two-thirds struggled with the change, and as a result, some jeopardised the validity of the business metrics. The paper concludes with lessons learned and recommendations for similar projects.

Keywords: change ready, information technology, end-user, iterative method, rollout plan, data analytics

Procedia PDF Downloads 145
240 Real Time Multi Person Action Recognition Using Pose Estimates

Authors: Aishrith Rao

Abstract:

Human activity recognition is an important aspect of video analytics, and many approaches have been proposed to enable action recognition. In this approach, a model identifies the actions of multiple people in a frame and classifies them accordingly. Some approaches use RNNs and 3D CNNs, which are computationally expensive and cannot be trained with the small datasets currently available. Here, multi-person action recognition is performed in order to understand the positions and actions of the people present in a video frame. The size of the video frame can be adjusted as a hyper-parameter, depending on the hardware resources available. OpenPose is used to calculate pose estimates, using a CNN to produce heat-maps, from which skeleton features (essentially joint features) are obtained. These features are then extracted, and a classification algorithm can be applied to classify the action.
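
A hedged sketch of the final classification stage: once OpenPose-style joint keypoints have been extracted per person, a standard classifier maps the flattened joint features to an action label. The data here is synthetic; real skeletons have roughly 18-25 joints with (x, y, confidence) per joint, and the abstract does not name a specific classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_joints = 600, 18
X = rng.random((n_samples, n_joints * 2))  # flattened (x, y) per joint
y = rng.integers(0, 3, n_samples)          # e.g., walk / sit / wave labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
# near-chance on random data; real pose features are far more separable
print(f"toy accuracy: {clf.score(X_te, y_te):.2f}")
```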

Keywords: human activity recognition, computer vision, pose estimates, convolutional neural networks

Procedia PDF Downloads 143
239 A Text Classification Approach Based on Natural Language Processing and Machine Learning Techniques

Authors: Rim Messaoudi, Nogaye-Gueye Gning, François Azelart

Abstract:

Automatic text classification applies natural language processing (NLP) and other AI-guided techniques to classify text automatically, in a faster and more accurate manner. This paper addresses the use of predictive maintenance to manage incident tickets within the company. It focuses on proposing a tool that processes and analyses the comments and notes written by administrators after resolving an incident ticket, with the goal of increasing the quality of these comments. The tool is based on NLP and machine learning techniques to realise textual analytics of the extracted data. The approach was tested on real data from the French National Railways (SNCF) and yielded high-quality results.
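
An illustrative sketch of an NLP text-classification pipeline of the kind described: TF-IDF features plus a linear classifier that flags low-quality resolution comments. The example comments and labels are invented, not SNCF data, and the paper does not commit to this exact feature set or classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "Replaced faulty relay in cabinet 3, tested circuit, closed ticket",
    "fixed",
    "Rebooted the signalling server and verified logs for 30 minutes",
    "done ok",
]
labels = ["good", "poor", "good", "poor"]  # comment-quality labels

# TF-IDF turns each comment into a sparse feature vector;
# logistic regression learns the quality boundary
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["restarted unit, no verification performed"]))
```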

Keywords: machine learning, text classification, NLP techniques, semantic representation

Procedia PDF Downloads 103
238 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks

Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel

Abstract:

The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without reducing fairness of allocation is a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem rigorously. As a result, we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic process, RR does not take radio conditions into account, which affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), proposes a solution to this issue. MORRA not only exploits the concept of an opportunistic scheduler but also takes other parameters into account in the allocation process: a courtesy coefficient (CC) and the buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required.
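
A speculative sketch of a MORRA-style allocation step. The abstract does not give the exact formulas, so the scoring rule below is an assumption: it mixes an opportunistic channel term (SNR) with a courtesy coefficient (CC) that grows the longer a user waits, plus the buffer occupancy (BO).

```python
def morra_pick(users):
    """Pick the user to receive the next subcarrier."""
    def score(u):
        # weights are illustrative, not from the paper
        return u["snr"] * (1.0 + u["cc"]) + 0.5 * u["bo"]
    return max(users, key=score)

users = [
    {"id": "A", "snr": 18.0, "cc": 0.1, "bo": 0.2},  # good channel, short wait
    {"id": "B", "snr": 9.0,  "cc": 0.9, "bo": 0.8},  # poor channel, starved
    {"id": "C", "snr": 14.0, "cc": 0.3, "bo": 0.5},
]

winner = morra_pick(users)
print("subcarrier assigned to", winner["id"])
# after each allocation, the winner's CC would reset and the others' CC grow,
# which is what keeps the scheme from collapsing into pure MaxSNR
```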

Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy

Procedia PDF Downloads 301
237 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format, which has also resulted in the generation of big data in structured and unstructured formats. In general, this data is textual and highly unstructured; to get insights and actionable content from it, we need to incorporate the concepts of text mining and natural language processing. Our study focuses on interestingness in data, through which interesting facts are generated for a knowledge base. The approach is to derive analytics from the text via the application of natural language processing. Using the semantic web Resource Description Framework (RDF), we generate triples from the given data and derive interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
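
A minimal sketch of the triple-generation step using the rdflib library. The namespace and facts below are invented; in the study, triples are derived from unstructured text via NLP before interesting patterns are mined.

```python
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)

# one extracted fact per (subject, predicate, object) triple
g.add((EX["Everest"], EX["locatedIn"], EX["Himalayas"]))
g.add((EX["Everest"], EX["hasHeightMetres"], Literal(8849)))

# serialize the knowledge-base fragment in Turtle syntax
print(g.serialize(format="turtle"))
```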

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 164
236 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases

Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar

Abstract:

The farming community in India, as in other parts of the world, is highly stressed due to factors such as increasing input costs (seeds, fertilizers, pesticides), droughts, and reduced revenue, in some cases leading to farmer suicides. The lack of an integrated farm advisory system in India adds to farmers' problems. Farmers need the right information during the early stages of a crop's lifecycle to prevent damage and loss of revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases, using images taken by farmers with their smartphones. The research work leads to building a smart assistant, using analytics and big data, which could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNN) has been successfully used to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers and dropouts (to avoid overfitting). Models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to modify the weights of parameters learnt from the ImageNet dataset and apply them to crop diseases, which reduces the number of epochs needed to learn. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift and blurring are used to improve accuracy on images taken from farms. Models built using a combination of these techniques are more robust for deployment in the real world. Our model is validated using the tomato crop; in India, tomato is affected by 10 different diseases, and our model achieves an accuracy of more than 95% in correctly classifying them. The main contribution of our research is to create a personal assistant for farmers for managing plant disease; although the model was validated using tomato, it can easily be extended to other crops. The advancement of computing technology and the availability of large datasets have made possible the success of deep learning applications in computer vision, natural language processing, image recognition, etc. With these robust models and huge smartphone penetration, the feasibility of implementation is high, resulting in timely advice to farmers, increased farmer income and reduced input costs.
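
A condensed sketch of the transfer-learning setup described: an ImageNet-pretrained backbone with a frozen feature extractor, a new classification head with dropout, and the augmentation operations mentioned (rotation, zoom, shift). The backbone choice (MobileNetV2), layer sizes and factors are illustrative assumptions; the paper does not publish its exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

num_classes = 10  # tomato diseases, per the abstract

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # transfer learning: keep ImageNet weights frozen

model = tf.keras.Sequential([
    layers.Input(shape=(224, 224, 3)),
    layers.RandomRotation(0.1),          # augmentation: rotation
    layers.RandomZoom(0.1),              # augmentation: zoom
    layers.RandomTranslation(0.1, 0.1),  # augmentation: shift
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),                 # dropout to curb overfitting
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```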

Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning

Procedia PDF Downloads 120
235 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have little technical data mining skill and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that lets future researchers focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy of enabling analysis through either the use of predefined questions or a guided data mining process, and shows how the developed questions and analyses can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 327
234 The Use of Emerging Technologies in Higher Education Institutions: A Case of Nelson Mandela University, South Africa

Authors: Ayanda P. Deliwe, Storm B. Watson

Abstract:

The COVID-19 pandemic has disrupted the established practices of higher education institutions (HEIs). Most higher education institutions worldwide had to shift from traditional face-to-face to online learning. The online environment and new online tools are disrupting the way in which higher education is presented, and the structures of higher education institutions have been impacted by rapid advancements in information and communication technologies. Emerging technologies should not be viewed in a negative light because, as opposed to the traditional curriculum that worked to create productive and efficient researchers, emerging technologies encourage creativity and innovation; using technology together with traditional means will therefore enhance teaching and learning. Emerging technologies in higher education not only change the experience of students, lecturers, and the content, but also influence the attraction and retention of students. Higher education institutions are under immense pressure because they are competing not only locally and nationally but, as emerging technologies eliminate border barriers, internationally as well: students can study in the country of their choice regardless of where they are in the world. Technology is finding its way into the lecture room day by day, and academics need to utilise the technology at their disposal if they want to get through to their students. Academics now compete for students' attention with social media platforms such as WhatsApp, Snapchat, Instagram, Facebook, and TikTok, which poses a significant challenge to higher education institutions. It is therefore critical to pay attention to emerging technologies to see how they can be incorporated into the classroom to improve educational quality while remaining relevant to the work industry. This study aims to understand how emerging technologies have been utilised at Nelson Mandela University in presenting teaching and learning activities since April 2020. The primary objective is to analyse how academics are incorporating emerging technologies in their teaching and learning activities. This objective was pursued by conducting a literature review clarifying and conceptualising the emerging technologies being utilised by higher education institutions and reviewing and analysing their use, and will be further investigated through an empirical analysis of the use of emerging technologies at Nelson Mandela University. The literature review revealed that emerging technology is impacting several key areas in higher education institutions, such as the attraction and retention of students, the enhancement of teaching and learning, increased global competition, the elimination of border barriers, and the highlighting of the digital divide. The review further identified learning management systems, open educational resources, learning analytics, and artificial intelligence as the most prevalent emerging technologies used in higher education institutions. These will be further analysed through an empirical analysis to identify how they are being utilised at Nelson Mandela University.

Keywords: artificial intelligence, emerging technologies, learning analytics, learner management systems, open educational resources

Procedia PDF Downloads 69
233 Enhanced Planar Pattern Tracking for an Outdoor Augmented Reality System

Authors: L. Yu, W. K. Li, S. K. Ong, A. Y. C. Nee

Abstract:

In this paper, a scalable augmented reality framework for handheld devices is presented. The framework is enabled by a server-client data communication structure, in which the search for tracking targets in a database of images is performed on the server side, while pixel-wise 3D tracking is performed on the client side, which in this case is a handheld mobile device. Image search on the server side adopts a residual-enhanced image descriptor representation that gives the framework its scalability. The tracking algorithm on the client side is based on a gravity-aligned feature descriptor, which takes advantage of a sensor-equipped mobile device, and an optimized intensity-based image alignment approach that ensures the accuracy of 3D tracking. Automatic content streaming is achieved by using a key-frame selection algorithm, client working-phase monitoring, and standardized rules for content communication between the server and client. A recognition accuracy test performed on a standard dataset shows that the method adopted in the presented framework outperforms the Bag-of-Words (BoW) method used in some previous systems. Experimental tests conducted on a set of video sequences indicated real-time performance of the tracking system, with a frame rate of 15-30 frames per second. The framework is shown to be functional in practical situations with a demonstration application on a campus walk-around.

Keywords: augmented reality framework, server-client model, vision-based tracking, image search

Procedia PDF Downloads 275
232 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations

Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso

Abstract:

Loss of containment is the primary hazard with which process safety management is concerned in the oil and gas industry. Escalation to more serious consequences begins with loss of containment: oil and gas release from leakage or spillage from primary containment, resulting in pool fire, jet fire and even explosion when it reacts with the various ignition sources present in operations. The heart of process safety management is therefore avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard in this case is an early detection system that alerts Operations to take action before a potential loss of containment. The value of such a detection system increases when it is applied to a long surface pipeline, which is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, accurately detecting loss of containment in a surface pipeline is difficult; the trade-off between cost-effectiveness and high accuracy has been the main issue when selecting a traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow and temperature (PVT) points along the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive, hence generally not a viable alternative from an economic standpoint. A conceptual approach combining mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results for predicting leakage in pipelines. Mathematical modeling is used to generate simulation data, and this data is used to train the leak detection and localization models. Mathematical models and simulation software have been shown to provide results comparable with experimental data at very high levels of accuracy. While a supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling can generate the required datasets, justifying the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling in the development of a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
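
A schematic sketch of the model-based leak-detection idea: labelled training data comes from simulations (faked here with noise rather than CFD), and a supervised classifier learns to flag leak signatures from pressure and flow readings. The feature names, class means and data generator are all assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000

# stand-in for CFD-generated (pressure_drop, flow_imbalance) features
normal = rng.normal([0.0, 0.0], [1.0, 1.0], size=(n, 2))
leak = rng.normal([2.5, 1.8], [1.0, 1.0], size=(n, 2))  # shifted signature
X = np.vstack([normal, leak])
y = np.array([0] * n + [1] * n)  # 0 = intact, 1 = loss of containment

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```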

Keywords: pipeline, leakage, detection, AI

Procedia PDF Downloads 193
231 Cleaning Performance of High-Frequency, High-Intensity 360 kHz Frequency Operating in Thickness Mode Transducers

Authors: R. Vetrimurugan, Terry Lim, M. J. Goodson, R. Nagarajan

Abstract:

This study investigates the cleaning performance of high-intensity 360 kHz frequency in the removal of nano-dimensional and sub-micron particles from various surfaces, the uniformity of the cleaning tank, and the run-to-run variation of the cleaning process. The uniformity of the cleaning tank was measured by two different methods: (1) a ppb™ meter and (2) the Liquid Particle Counting (LPC) technique. In the second method, aluminium metal spacer components were placed at various locations in the cleaning tank (centre, top left corner, bottom left corner, top right corner, bottom right corner) and the particles removed by the 360 kHz frequency were measured. The results indicate that the energy was distributed more uniformly throughout the entire cleaning vessel, even at the corners and edges of the tank, when megasonic sweeping technology was applied. The results also show that rinsing the parts with the 360 kHz frequency in the final rinse gives lower particle counts, and hence higher cleaning efficiency, than other frequencies. When megasonic sweeping technology is applied, each piezoelectric transducer operates at its optimum resonant frequency and generates a stronger acoustic cavitational force and a higher acoustic streaming velocity; these combined forces help to enhance particle removal and, at the same time, improve overall cleaning performance. A multiple-extraction study was also carried out for various frequencies to measure the cleaning potential and asymptote value.

Keywords: power distribution, megasonic sweeping, cavitation intensity, particle removal, laser particle counting, nano, submicron

Procedia PDF Downloads 418
230 Basketball Game-Related Statistics Discriminating Teams Competing in Basketball Africa League and Euroleague: Comparative Analysis

Authors: Ng'etich K. Stephen

Abstract:

Globally, analytics in basketball has advanced tremendously in the last decade. Organizations are leveraging the insights to improve team and player performance and, in the long run, to generate revenue. Due to the limited basketball game-related statistics available for African competitions, teams are unaware of how they compare with other continental basketball teams. The purpose of this study is to evaluate the regional differences in basketball game-related statistics between African teams that played in the newly formed Basketball Africa League and teams in the Euroleague. The Basketball Africa League, a competition created through a partnership between the NBA and FIBA, offers a good starting point since it provides valuable basketball metrics to analyze. This study uses multivariate linear discriminant analysis to identify the game-related statistics that discriminate between teams in the Euroleague and the Basketball Africa League.
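
An illustrative sketch of the discriminant-analysis step: given per-game statistics for teams from the two competitions, linear discriminant analysis finds the combination of statistics that best separates the leagues. All numbers below are fabricated for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

stats = ["assists", "turnovers", "3pt_made"]
# rows: team-season observations (fabricated)
X = np.array([
    [24.1, 14.2, 9.8], [22.8, 15.0, 8.9], [23.5, 14.8, 9.1],    # BAL teams
    [19.9, 12.1, 12.4], [20.5, 11.8, 13.0], [21.0, 12.5, 12.1], # Euroleague
])
y = ["BAL", "BAL", "BAL", "EUR", "EUR", "EUR"]

lda = LinearDiscriminantAnalysis().fit(X, y)
# the discriminant weights indicate which statistics separate the leagues
for name, coef in zip(stats, lda.coef_[0]):
    print(f"{name:10s} discriminant weight: {coef:+.2f}")
```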

Keywords: basketball africa league, basketball, euroleague, fiba, africa

Procedia PDF Downloads 103
229 AI in Health and Wellbeing: A Seven-Step Engineering Method

Authors: Denis Özdemir, Max Senges

Abstract:

There are many examples of AI-supported apps for better health and wellbeing. Generally, these applications help people to achieve their goals based on scientific research and input data, yet they do not always explain how those three are related, e.g., they make implicit assumptions about goals that hold for many but not for all. We present a seven-step method for designing health and wellbeing AIs, considering goal setting, measurable results, real-time indicators, analytics, visual representations, communication, and feedback. It can guide engineers in developing apps, recommendation algorithms, and interfaces that support humans in their decision-making without patronizing them. To illustrate the method, we create a recommender AI for tiny wellbeing habits and run a small case study, including a survey. From the results, we infer how people perceive the relationship between themselves and the AI and to what extent it helps them achieve their goals. We review our seven-step engineering method and suggest modifications for the next iteration.

Keywords: recommender systems, natural language processing, health apps, engineering methods

Procedia PDF Downloads 166
228 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design

Authors: Kenny Raharjo, Ramon Lawrence

Abstract:

Game designers have the challenging task of building games that engage players to spend their time and money on the game. There are an infinite number of game variations and design choices, and it is hard to systematically determine game design choices that will have positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure to use beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with highest play time metrics and can be a useful technique in a game designer's toolkit.
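
A self-contained epsilon-greedy bandit of the kind the abstract describes: each arm is a game-design variation, the reward is a play-time metric, and play gradually concentrates on the best variation. The reward distributions are invented for the demo, and epsilon-greedy is one common bandit policy among several the designers could use.

```python
import random

random.seed(1)
true_mean_playtime = [4.0, 5.5, 5.0]  # hidden quality of 3 game variants
counts = [0, 0, 0]
values = [0.0, 0.0, 0.0]  # running mean reward per arm
epsilon = 0.1

for t in range(5000):
    if random.random() < epsilon:
        arm = random.randrange(3)        # explore a random variant
    else:
        arm = values.index(max(values))  # exploit the current best variant
    reward = random.gauss(true_mean_playtime[arm], 1.0)
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

print("plays per variant:", counts)  # variant 1 should dominate
print("estimated values :", [round(v, 2) for v in values])
```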

Keywords: game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics

Procedia PDF Downloads 511
227 Psychological Impacts of Over-the-Top Services on Consumer Behaviors during the COVID-19 Pandemic

Authors: Hector Liu, Chih-Ming Tsai

Abstract:

Consumer behaviors in the subscription of over-the-top (OTT) media services have substantially changed because of the COVID-19 pandemic; hence, this study aims to determine the factors affecting subscription intentions. The increased usage of OTT media, particularly in the lockdowns during the COVID-19 pandemic, has intensified the competition between both global and local streaming providers. While studies have discussed antecedents accounting for this change, they have paid limited attention to the psychological factors that shape consumer behavior in using OTT services. Given the changes in consumers’ psychological states during the pandemic, this study seeks to fill the research gap by integrating the expectancy-value model to provide insights into the key gratifications that consumers seek and obtain and that have affected their subscription to OTT services. This study proposes a theoretical model and assesses this framework on data collected from 1,068 OTT service users in Taiwan. The results strengthen the literature by indicating a clear growth in the popularity and subscription of OTT services because of the COVID-19 lockdowns as well as factors such as perceived quality and satisfaction, which influence behavioral intentions for OTT services. Most crucially, however, OTT viewers who acquired a sense of belonging, a sense of being accompanied, and a sense of reduction in anxiety due to being quarantined and in lockdown show a higher tendency to continue their subscriptions to their OTT services of choice during the pandemic. With consumer behavior trends forever changed by the COVID-19 pandemic, the implications from this study provide OTT service platforms with an opportunity to capitalize on their current and potential customers’ changing desires, demands, and factors for a continued subscription.

Keywords: consumer behavior, COVID-19, expectancy-value model, OTT media services

Procedia PDF Downloads 121
226 User Experience and Impact of AI Features in AutoCAD

Authors: Sarah Alnafea, Basmah Alalsheikh, Hadab Alkathiri

Abstract:

For over 30 years, AutoCAD, a powerful CAD software package developed by Autodesk, has been essential for design in industries such as engineering, building, and architecture. With the emergence of advanced technology, AutoCAD has undergone a revolutionary change through the incorporation of artificial intelligence capabilities that have enhanced users' productivity, efficiency at work, and design quality. This paper investigates the role of AI in AutoCAD, especially in intelligent automation, generative design, automated design ideas, natural language processing, and predictive analytics. A survey among users was also conducted to assess the adoption of and satisfaction with AI features and to identify areas for improvement. The competitive standing of AutoCAD is further cross-checked against other AI-enabled CAD software packages, including SolidWorks, Fusion 360, and Rhino. The paper gives an overview of the current impacts of AI in AutoCAD, along with recommendations for the future road of AI development to meet users' requirements.

Keywords: artificial intelligence, natural language processing, intelligent automation, generative design

Procedia PDF Downloads 5
225 Optimization of Roster Construction in Sports

Authors: Elijah Cavan

Abstract:

In Major League Sports (MLB, NBA, NHL, NFL), it is the Front Office Staff (FOS) who make decisions about who plays for their respective team. The FOS bear the brunt of the responsibility for acquiring players through drafting, trades and free-agency signings, while typically contending with maximum roster salary constraints. The players themselves are volatile assets of these teams: their value fluctuates with age and performance. A simple comparison can be made when viewing players as assets, since the problem is similar to that of optimizing an investment portfolio: the goal is ultimately to maximize periodic returns while tolerating a fixed risk (degree of uncertainty/potential loss). Each franchise may value assets differently, and some may only tolerate lower risk levels; these are examples of factors that introduce additional constraints into the model. In this talk, we detail the mathematical formulation of this problem as a constrained optimization problem, which can be solved with classical machine learning methods but is also well posed as a problem to be solved on quantum computers.
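
A small simulated-annealing sketch of the roster problem as the talk frames it: maximize total player value subject to a salary cap. The player data, cap and cooling schedule are invented, and the risk-tolerance constraints mentioned above are omitted for brevity.

```python
import math
import random

random.seed(0)
# (value, salary) per candidate player; fabricated numbers
players = [(9, 12), (8, 10), (7, 7), (6, 5), (5, 4), (4, 3), (3, 2), (2, 1)]
CAP, SIZE = 25, 4  # salary cap and roster size

def score(roster):
    value = sum(players[i][0] for i in roster)
    salary = sum(players[i][1] for i in roster)
    return value if salary <= CAP else -1  # penalize infeasible rosters

current = random.sample(range(len(players)), SIZE)
best, T = list(current), 5.0
for step in range(20000):
    # neighbour move: swap one rostered player for one outside the roster
    cand = list(current)
    cand[random.randrange(SIZE)] = random.choice(
        [i for i in range(len(players)) if i not in current])
    delta = score(cand) - score(current)
    # accept improvements always; accept worse moves with Boltzmann probability
    if delta > 0 or random.random() < math.exp(delta / T):
        current = cand
        if score(current) > score(best):
            best = list(current)
    T = max(0.01, T * 0.9995)  # geometric cooling schedule

print("best roster:", sorted(best), "value:", score(best))
```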

Keywords: optimization, financial mathematics, sports analytics, simulated annealing

Procedia PDF Downloads 122
224 Comparison and Validation of a dsDNA Biomimetic Quality Control Reference for NGS-Based BRCA CNV Analysis versus MLPA

Authors: A. Delimitsou, C. Gouedard, E. Konstanta, A. Koletis, S. Patera, E. Manou, K. Spaho, S. Murray

Abstract:

Background: There remains a lack of international standard control reference materials for Next Generation Sequencing-based approaches or device calibration. We have designed and validated dsDNA biomimetic reference materials for such targeted approaches, incorporating proprietary motifs (patent pending) for device/test calibration. They enable internal single-sample calibration, alleviating sample comparisons against pooled historical population-based data assemblies or statistical modelling approaches. We have validated such an approach for BRCA copy number variation analytics using iQRS™-CNVSUITE versus Multiplex Ligation-dependent Probe Amplification (MLPA). Methods: Standard BRCA copy number variation analysis was compared between multiplex ligation-dependent probe amplification and next generation sequencing using a cohort of 198 breast/ovarian cancer patients. Next generation sequencing-based copy number variation analysis of samples spiked with iQRS™ dsDNA biomimetics was analysed using the proprietary CNVSUITE software. Multiplex ligation-dependent probe amplification analyses were performed on an ABI-3130 sequencer and analysed with Coffalyser software. Results: Concordance of BRCA copy number variation events between multiplex ligation-dependent probe amplification and CNVSUITE indicated an overall sensitivity of 99.88% and specificity of 100% for iQRS™-CNVSUITE. The negative predictive value of iQRS-CNVSUITE™ for BRCA was 100%, allowing for accurate exclusion of any event. The positive predictive value was 99.88%, with no discrepancy between multiplex ligation-dependent probe amplification and iQRS™-CNVSUITE. For device calibration purposes, precision was 100%; spiking of patient DNA demonstrated linearity to 1% (±2.5%) and range from 100 copies. Traditional training was supplemented by predefining the calibrator-to-sample cut-off (lock-down) for amplicon gain or loss, based upon a relative ratio threshold, following training of iQRS™-CNVSUITE using spiked iQRS™ calibrator and control mocks. BRCA copy number variation analysis using iQRS™-CNVSUITE was successfully validated and ISO15189 accredited, and now enters CE-IVD performance evaluation. Conclusions: The inclusion of a reference control competitor (an iQRS™ dsDNA mimetic) in next generation sequencing-based analysis offers a more robust, sample-independent approach to the assessment of copy number variation events compared to multiplex ligation-dependent probe amplification. The approach simplifies data analyses, improves independent sample data analyses, and allows for direct comparison to an internal reference control for sample-specific quantification. Our iQRS™ biomimetic reference materials allow for single-sample copy number variation analytics and further decentralisation of diagnostics to single-patient sample assessment.
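
A schematic sketch of ratio-threshold CNV calling against an internal reference, as the lock-down cut-offs above suggest: per-amplicon read depth is compared with the spiked-in calibrator depth, and a gain or loss is called outside fixed ratio bounds. The depths, amplicon names and thresholds are invented, not the validated iQRS™-CNVSUITE values.

```python
# fabricated per-amplicon read depths for a sample and the internal calibrator
sample_depth = {"BRCA1_ex2": 480, "BRCA1_ex11": 950, "BRCA2_ex13": 260}
calibrator_depth = {"BRCA1_ex2": 500, "BRCA1_ex11": 510, "BRCA2_ex13": 505}

GAIN, LOSS = 1.4, 0.6  # assumed lock-down ratio thresholds

for amplicon, depth in sample_depth.items():
    ratio = depth / calibrator_depth[amplicon]
    call = "gain" if ratio > GAIN else "loss" if ratio < LOSS else "normal"
    print(f"{amplicon:12s} ratio={ratio:.2f} -> {call}")
```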

Keywords: validation, diagnostics, oncology, copy number variation, reference material, calibration

Procedia PDF Downloads 66
223 Data-Driven Crop Advisory – A Use Case on Grapes

Authors: Shailaja Grover, Purvi Tiwari, Vigneshwaran S. R., U. Dinesh Kumar

Abstract:

In India, grapes are one of the most important horticultural crops. Grapes are most vulnerable to downy mildew, which is one of the most devastating diseases. In the absence of a precise weather-based advisory system, farmers spray pesticides on their crops extensively. There are two main problems with this. First, most of these sprays are panic sprays, which could have been avoided. Second, farmers use the more expensive "Preventive and Eradicate" chemicals rather than "Systemic, Curative and Anti-sporulate" chemicals. When these chemicals are used indiscriminately, they can enter the fruit and cause health problems such as cancer. This paper utilizes decision trees and predictive modeling techniques to provide grape farmers with customized advice on grape disease management. The model is expected to reduce the overall use of chemicals by approximately 50% and the cost by around 70%. Most of the grapes produced will have relatively low pesticide residue levels, i.e., below the permissible level.
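
A toy sketch of a weather-driven advisory rule learned with a decision tree: given temperature, humidity and leaf wetness, predict downy-mildew risk and hence whether a spray is warranted. The training rows, features and labels are fabricated; the paper's actual feature set is not published here.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# columns: temperature (C), relative humidity (%), leaf wetness (hours)
X = [
    [22, 95, 8], [24, 90, 6], [21, 98, 10],  # conducive conditions
    [30, 50, 0], [33, 40, 1], [28, 55, 0],   # dry, low risk
]
y = ["high_risk", "high_risk", "high_risk", "low_risk", "low_risk", "low_risk"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["temp", "humidity", "wetness"]))
print(tree.predict([[23, 92, 7]]))  # likely 'high_risk': advise timely spraying
```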

Keywords: analytics in agriculture, downy mildew, weather based advisory, decision tree, predictive modelling

Procedia PDF Downloads 74
222 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems

Authors: Z. Bouattou, R. Laurini, H. Belbachir

Abstract:

This paper describes a new approach for the automatic generation of visual summaries, combining cartographic visualization methods with real-time sensor data modeling. The concept of chorems seems an interesting candidate for visualizing real-time geographic database summaries. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories; however, time information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: values recorded at the same time by the available sensors provide a set of distributed observations over the study area, and spatial interpolation methods are used to compute concentration fields. From these fields, and by applying spatial data mining procedures on the fly, important patterns can be extracted as geographic rules. Those patterns are then visualized as chorems.
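
A compact inverse-distance-weighting (IDW) sketch of the interpolation step: sensor readings taken at the same instant are interpolated onto a grid point to build the concentration field from which chorems are later drawn. Sensor positions and values are invented, and IDW is one common choice; the paper does not commit to a specific interpolation method.

```python
import numpy as np

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([10.0, 14.0, 12.0, 20.0])  # simultaneous readings

def idw(point, power=2.0, eps=1e-12):
    """Interpolate a value at `point` from the sensor readings."""
    d = np.linalg.norm(sensors - point, axis=1)
    if d.min() < eps:              # query point sits exactly on a sensor
        return float(values[d.argmin()])
    w = 1.0 / d**power             # nearer sensors get larger weights
    return float(np.sum(w * values) / np.sum(w))

print(f"interpolated value at grid point (0.5, 0.5): {idw([0.5, 0.5]):.2f}")
```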

Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems

Procedia PDF Downloads 403
221 The Efficacy of Open Educational Resources in Students’ Performance and Engagement

Authors: Huda Al-Shuaily, E. M. Lacap

Abstract:

Higher education is one of the most essential foundations for the advancement and progress of a country, and it needs to be as accessible and comprehensive as possible. In this paper, we succeeded in expanding the accessibility and delivery of higher education using Open Educational Resources (OER): freely accessible, openly licensed documents and media for teaching and learning. This study uses a comparative design of students' academic performance in the course Introduction to Database and student engagement with the virtual learning environment (VLE). The study was done over two successive semesters, one without OER and the other using OER. We established that there was a significant increase in student engagement with the VLE in the latter semester compared to the former. Using the latter semester's data, we show that student engagement has a positive impact on students' academic performance; moreover, after clustering students by academic performance, the impact is higher for low-performing students. The results show that these engagement measures can potentially be used to predict students' learning styles with a high degree of precision.

Keywords: EDM, learning analytics, moodle, OER, student-engagement

Procedia PDF Downloads 339
220 Content Analysis of ‘Junk Food’ Content in Children’s TV Programmes: A Comparison of UK Broadcast TV and Video-On-Demand Services

Authors: Shreesh Sinha, Alexander B. Barker, Megan Parkin, Emma Wilson, Rachael L. Murray

Abstract:

Background and Objectives: Exposure to imagery of foods high in fat, sugar or salt (HFSS) is associated with HFSS consumption, and subsequently obesity, among young people. We report and compare the results of two content analyses: one of two popular terrestrial children's television channels in the UK and one of a selection of children's programmes available on video-on-demand (VOD) streaming services. Methods: Content analysis of three days' worth of programmes (including advertisements) on two popular children's television channels broadcast on UK television (CBeebies and Milkshake) as well as a sample of the 40 highest-rated children's programmes available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; 'Milkshake' had a significantly higher proportion of programmes/adverts containing HFSS content than 'CBeebies'. On the VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes) across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programmes containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children's television channels and children's programmes on VOD services. Since previous research has shown that HFSS content in the media affects HFSS consumption, children's programmes broadcast either on TV or on VOD services are likely to affect HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.

Keywords: public health, junk food, children's TV, HFSS

Procedia PDF Downloads 104
219 Exploring Artificial Intelligence as a Transformative Tool for Urban Management

Authors: R. R. Govind

Abstract:

In the digital age, artificial intelligence (AI) is having a significant impact on the rapid changes that cities are experiencing. This study explores the profound impact of AI on urban morphology, especially with regard to promoting friendly design choices. It addresses a significant research gap by examining the real-world effects of integrating AI into urban design and management. The main objective is to outline a framework for integrating AI to transform urban settings. The study employs an urban design framework to effectively navigate complicated urban environments, emphasize the need for urban management, and provide efficient planning and design strategies. Taking Gangtok's informal settlements as a focal point, the study employs AI methodologies such as machine learning, predictive analytics, and generative AI to tackle issues of 'urban informality'. The insights garnered not only offer valuable perspectives but also unveil AI's transformative potential in addressing contemporary urban challenges.

Keywords: urban design, artificial intelligence, urban challenges, machine learning, urban informality

Procedia PDF Downloads 61
218 A Study on the New Weapon Requirements Analytics Using Simulations and Big Data

Authors: Won Il Jung, Gene Lee, Luis Rabelo

Abstract:

Since many weapon systems are becoming more complex and diverse, various problems arise in terms of acquisition cost, time, and performance limitations. In fact, executing real-world experiments to obtain the Required Operational Characteristics (ROC) for a new weapon acquisition is costly, dangerous, and time-consuming, even though it enhances the fidelity of the experiment results. Also, most research to date has relied on a large number of assumptions, so a bias is present in the experiment results. Here, a new methodology is proposed to solve these problems without such a variety of assumptions. ROC for the new weapon system are developed through the new methodology, which analyzes big data generated by simulating various scenarios based on virtual and constructive models involving six degrees of freedom (6DoF). The new methodology enables us to identify unbiased ROC for new weapons by reducing assumptions and provides support for acquiring optimal weapon systems.

Keywords: big data, required operational characteristics (ROC), virtual and constructive models, weapon acquisition

Procedia PDF Downloads 290
217 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in many industrial processes, including multivariate statistical methods, representations in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data. The results showed that monitoring performance improved significantly in terms of the detection success rate for process faults.
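
A hedged sketch of kernel-based empirical monitoring: a kernel PCA model is fitted on normal operation data only, and new samples are flagged when their reconstruction error exceeds a limit derived from the normal data. The kernel choice, control limit and data are assumptions, not the paper's exact scheme (which also includes noise filtering).

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(3)
normal = rng.normal(0, 1, size=(300, 5))   # normal operation batches
faulty = rng.normal(2.5, 1, size=(10, 5))  # shifted, abnormal batches

kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(normal)

def spe(X):
    """Squared prediction (reconstruction) error in the input space."""
    recon = kpca.inverse_transform(kpca.transform(X))
    return np.sum((X - recon) ** 2, axis=1)

limit = np.percentile(spe(normal), 99)  # control limit from normal data only
alarms = spe(faulty) > limit
print(f"control limit: {limit:.3f}, faults detected: {alarms.sum()}/10")
```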

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 323