Search results for: streaming analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 456

306 Resource Allocation Scheme for IEEE 802.16 Networks

Authors: Elmabruk Laias

Abstract:

IEEE Standard 802.16 provides QoS (Quality of Service) for applications such as Voice over IP, video streaming, and high-bandwidth file transfer. In an IEEE 802.16 broadband wireless access system, a WiMAX TDD frame contains one downlink subframe and one uplink subframe. The capacity allocated to each subframe is a system parameter that should be determined based on the expected traffic conditions, so a proper resource allocation scheme for packet transmissions is imperatively needed. In this paper, we present a new resource allocation scheme, called additional bandwidth yielding (ABY), to improve the transmission efficiency of an IEEE 802.16-based network. Our proposed scheme can be adopted along with existing scheduling algorithms and the multi-priority scheme without any change. The experimental results show that ABY significantly improves packet queuing delay, especially for service flows of higher-priority classes.
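
The abstract does not spell out the ABY mechanics, but the general idea of yielding unused subframe capacity across priority classes can be sketched as follows (a minimal, hypothetical illustration, not the authors' implementation; the class names and baseline policy are invented):

```python
# Hypothetical sketch: allocate a fixed uplink subframe capacity across
# priority classes, then yield leftover capacity to still-backlogged
# classes in priority order, which shortens their queuing delay.

def allocate_with_yielding(demands_by_priority, subframe_capacity):
    """demands_by_priority: list of (class_id, requested_slots),
    ordered from highest to lowest priority."""
    grants = {}
    remaining = subframe_capacity
    # First pass: grant each class a baseline share, capped by its demand.
    baseline = subframe_capacity // max(len(demands_by_priority), 1)
    for class_id, demand in demands_by_priority:
        grant = min(demand, baseline, remaining)
        grants[class_id] = grant
        remaining -= grant
    # Second pass: yield unused capacity to backlogged classes,
    # highest priority first.
    for class_id, demand in demands_by_priority:
        if remaining == 0:
            break
        extra = min(demand - grants[class_id], remaining)
        grants[class_id] += extra
        remaining -= extra
    return grants

print(allocate_with_yielding([("UGS", 30), ("rtPS", 50), ("BE", 80)], 120))
```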

Keywords: IEEE 802.16, WiMAX, OFDMA, resource allocation, uplink-downlink mapping

Procedia PDF Downloads 476
305 Effect of Ultrasonic Vibration on the Dilution, Mechanical, and Metallurgical Properties in Cladding of 308 on Mild Steel

Authors: Sandeep Singh Sandhu, Karanvir Singh Ghuman, Parminder Singh Saini

Abstract:

The aim of the present investigation was to study the effect of ultrasonic vibration on the cladding of AISI 308 onto mild steel plates using shielded metal arc welding (SMAW). Ultrasonic vibrations were applied to the molten austenitic stainless steel during the welding process. Due to acoustically induced cavitation and streaming, the clad metal and the base metal mix thoroughly. It was revealed that cladding AISI 308 over mild steel under ultrasonic vibration results in uniform and finer grain structures. The effect of the vibration on dilution, mechanical properties, and metallography was also studied. It was found that welding performed with ultrasonic vibration showed less dilution, and the CVN (Charpy V-notch) value of the vibrated sample was also higher.

Keywords: surfacing, ultrasonic vibrations, mechanical properties, shielded metal arc welding

Procedia PDF Downloads 495
304 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence

Authors: Sogand Barghi

Abstract:

The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.
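
As a hedged illustration of the fraud-detection idea above (the paper names no specific algorithm), a minimal sketch using scikit-learn's IsolationForest on synthetic transaction data:

```python
# Illustrative sketch only: anomaly detection on transaction features with
# scikit-learn's IsolationForest; data and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic ledger: [amount, hour_of_day]; two injected outliers.
normal = np.column_stack([rng.normal(120, 30, 500), rng.integers(8, 18, 500)])
fraud = np.array([[5200, 3], [4800, 2]])
X = np.vstack([normal, fraud])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)          # -1 marks suspected anomalies
print(X[flags == -1])             # transactions flagged for review
```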

Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting

Procedia PDF Downloads 71
303 Indian Premier League (IPL) Score Prediction: Comparative Analysis of Machine Learning Models

Authors: Rohini Hariharan, Yazhini R, Bhamidipati Naga Shrikarti

Abstract:

In the realm of cricket, particularly within the context of the Indian Premier League (IPL), the ability to predict team scores accurately holds significant importance for both cricket enthusiasts and stakeholders alike. This paper presents a comprehensive study on IPL score prediction utilizing various machine learning algorithms, including Support Vector Machines (SVM), XGBoost, Multiple Regression, Linear Regression, K-nearest neighbors (KNN), and Random Forest. Through meticulous data preprocessing, feature engineering, and model selection, we aimed to develop a robust predictive framework capable of forecasting team scores with high precision. Our experimentation involved the analysis of historical IPL match data encompassing diverse match and player statistics. Leveraging this data, we employed state-of-the-art machine learning techniques to train and evaluate the performance of each model. Notably, Multiple Regression emerged as the top-performing algorithm, achieving an impressive accuracy of 77.19% and a precision of 54.05% (within a threshold of +/- 10 runs). This research contributes to the advancement of sports analytics by demonstrating the efficacy of machine learning in predicting IPL team scores. The findings underscore the potential of advanced predictive modeling techniques to provide valuable insights for cricket enthusiasts, team management, and betting agencies. Additionally, this study serves as a benchmark for future research endeavors aimed at enhancing the accuracy and interpretability of IPL score prediction models.
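
A minimal sketch of the evaluation idea described above, on synthetic stand-in features rather than the study's actual IPL dataset: fit a multiple linear regression and score "accuracy" as the share of predictions within +/- 10 runs:

```python
# Hypothetical features and data; the study's real feature set is not given.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.integers(30, 90, n),    # runs after the powerplay (assumed feature)
    rng.integers(0, 5, n),      # wickets lost
    rng.uniform(6, 11, n),      # current run rate
])
y = 1.6 * X[:, 0] - 8.0 * X[:, 1] + 7.0 * X[:, 2] + rng.normal(0, 12, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pred = LinearRegression().fit(X_tr, y_tr).predict(X_te)
within_10 = np.mean(np.abs(pred - y_te) <= 10)
print(f"share of predictions within +/-10 runs: {within_10:.2%}")
```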

Keywords: indian premier league (IPL), cricket, score prediction, machine learning, support vector machines (SVM), xgboost, multiple regression, linear regression, k-nearest neighbors (KNN), random forest, sports analytics

Procedia PDF Downloads 55
302 Challenges in E-Government: Conceptual Views and Solutions

Authors: Rasim Alguliev, Farhad Yusifov

Abstract:

Drawing on international experience, the conceptual and architectural principles of building electronic government (e-government) are researched and some suggestions are made. The assessment and monitoring of e-government formation processes, intellectual analysis of web resources, provision of information security, and problems of e-democracy are examined, and conceptual approaches are suggested. Taking into consideration the main principles of e-government theory, important research directions are specified.

Keywords: electronic government, public administration, information security, web-analytics, social networks, data mining

Procedia PDF Downloads 475
301 Beyond Personal Evidence: Using Learning Analytics and Student Feedback to Improve Learning Experiences

Authors: Shawndra Bowers, Allie Brandriet, Betsy Gilbertson

Abstract:

This paper will highlight how Auburn Online's instructional designers leveraged student and faculty data to update and improve online course design and instructional materials. When designing and revising online courses, it can be difficult for faculty to know which strategies are most likely to engage learners and improve educational outcomes in a specific discipline. It can also be difficult to identify which metrics are most useful for understanding and improving teaching, learning, and course design. At Auburn Online, the instructional designers use a suite of data on students' performance, participation, satisfaction, and engagement, as well as faculty perceptions, to inform sound learning and design principles that guide growth-mindset consultations with faculty. The consultations allow the instructional designer, along with the faculty member, to co-create an actionable course improvement plan. Auburn Online gathers learning analytics from a variety of sources that any instructor or instructional design team may have access to at their own institution. Participation and performance data, such as page views, assignment submissions, and aggregate grade distributions, are collected from the learning management system. Engagement data is pulled from the video hosting platform and includes unique viewers, views and downloads, minutes delivered, and the average duration each video is viewed. Student satisfaction is obtained through a short survey embedded at the end of each instructional module; this survey is included in each course every time it is taught. The survey data is then analyzed by an instructional designer for trends and pain points in order to identify areas, such as course content and instructional strategies, that can be modified to better support student learning. This analysis, along with the instructional designer's recommendations, is presented in a comprehensive report to instructors in an hour-long consultation where instructional designers collaborate with the faculty member on how and when to implement improvements. Auburn Online has developed a triage strategy of priority 1 and priority 2 level changes to be implemented in future course iterations. This data-informed decision-making process helps instructors focus on what will work best in their teaching environment while addressing which areas need additional attention. As a student-centered process, it has created improved learning environments for students and has been well received by faculty. It has also proven effective in addressing the need for improvement while avoiding the feeling that the faculty member's teaching is being personally attacked. The process that Auburn Online uses is laid out, along with the three-tier maintenance and revision guide that will be used over a three-year implementation plan. This information can help others determine which components of the maintenance and revision plan they want to utilize, as well as guide them in creating a similar approach. The data will be used to analyze, revise, and improve courses by providing recommendations and models of good practice, determining and disseminating best practices that demonstrate an impact on student success.
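
As an illustration of how such metrics might be combined (the figures and field names below are invented, not Auburn Online's data), a minimal sketch that merges LMS, video, and survey metrics into one per-module view an instructional designer could scan for pain points:

```python
# Hypothetical per-module report: flag modules with low satisfaction
# and low average video viewing time for closer review.
import pandas as pd

lms = pd.DataFrame({"module": [1, 2, 3],
                    "page_views": [840, 610, 300],
                    "submissions": [58, 55, 41]})
video = pd.DataFrame({"module": [1, 2, 3],
                      "avg_view_duration_min": [7.2, 4.1, 2.5]})
survey = pd.DataFrame({"module": [1, 2, 3],
                       "satisfaction_1to5": [4.4, 3.9, 2.8]})

report = lms.merge(video, on="module").merge(survey, on="module")
report["flag"] = ((report["satisfaction_1to5"] < 3.5)
                  & (report["avg_view_duration_min"] < 4))
print(report)
```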

Keywords: data-driven, improvement, online courses, faculty development, analytics, course design

Procedia PDF Downloads 62
300 Enhancing Operational Efficiency and Patient Care at Johns Hopkins Aramco Healthcare through a Business Intelligence Framework

Authors: Muneera Mohammed Al-Dossary, Fatimah Mohammed Al-Dossary, Mashael Al-Shahrani, Amal Al-Tammemi

Abstract:

Johns Hopkins Aramco Healthcare (JAHA), a joint venture between Saudi Aramco and Johns Hopkins Medicine, delivers comprehensive healthcare services to a diverse patient population. Despite achieving high patient satisfaction rates and surpassing several operational targets, JAHA faces challenges such as appointment delays and resource inefficiencies. These issues highlight the need for an advanced, integrated approach to operational management. This paper proposes a Business Intelligence (BI) framework to address these challenges, leveraging tools such as Epic electronic health records and Tableau dashboards. The framework focuses on data integration, real-time monitoring, and predictive analytics to streamline operations and enhance decision-making. Key outcomes include reduced wait times (e.g., a 23% reduction in specialty clinic wait times) and improved operating room efficiency (from 95.83% to 98% completion rates). These advancements align with JAHA’s strategic objectives of optimizing resource utilization and delivering superior patient care. The findings underscore the transformative potential of BI in healthcare, enabling a shift from reactive to proactive operations management. The success of this implementation lays the foundation for future innovations, including machine learning models for more precise demand forecasting and resource allocation.

Keywords: business intelligence, operational efficiency, healthcare management, predictive analytics, patient care improvement, data integration, real-time monitoring, resource optimization, Johns Hopkins Aramco Healthcare, electronic health records, Tableau dashboards, predictive modeling, efficiency metrics, resource utilization, patient satisfaction

Procedia PDF Downloads 8
299 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe

Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis

Abstract:

The increasing availability of information about earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, LiDAR) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge the large-scale height maps that are typically freely available at worldwide level with very specific high-resolution datasets. On the other hand, the third dimension improves the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called "Terrain Builder". This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
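
The paper does not publish the Terrain Builder's internals, but the core selection step it implies — picking the finest-resolution DEM covering each tile, so local surveys override the worldwide base layer — can be sketched as follows (hypothetical sources and logic):

```python
# Assumed tile-source selection, not the authors' tool: for each terrain
# tile, pick the finest-resolution DEM whose footprint covers the tile.
from dataclasses import dataclass

@dataclass
class DemSource:
    name: str
    resolution_m: float                 # cell size; smaller = finer
    bbox: tuple                         # (min_lon, min_lat, max_lon, max_lat)

def covers(bbox, tile):
    return (bbox[0] <= tile[0] and bbox[1] <= tile[1]
            and bbox[2] >= tile[2] and bbox[3] >= tile[3])

def pick_source(sources, tile_bbox):
    candidates = [s for s in sources if covers(s.bbox, tile_bbox)]
    return min(candidates, key=lambda s: s.resolution_m)

sources = [
    DemSource("srtm-world", 90.0, (-180, -90, 180, 90)),
    DemSource("lidar-local", 1.0, (10.4, 45.7, 11.9, 46.5)),  # hypothetical
]
print(pick_source(sources, (11.0, 46.0, 11.1, 46.1)).name)  # lidar-local
```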

Keywords: terrain builder, WebGL, virtual globe, CesiumJS, tiled map service (TMS), height-map, regular grid, geovisual analytics, DTM

Procedia PDF Downloads 427
298 Trip Reduction in Turbo Machinery

Authors: Pranay Mathur, Carlo Michelassi, Simi Karatha, Gilda Pedoto

Abstract:

Industrial plant uptime is of topmost importance for reliable, profitable, and sustainable operation. Trips and failed starts have a major impact on plant reliability, and all plant operators focus their efforts on minimising them. The performance of these CTQs is measured with two metrics: MTBT (mean time between trips) and SR (starting reliability). These metrics help identify the top failure modes and the units that need more effort to improve plant reliability. The Baker Hughes trip reduction program is structured to reduce these unwanted trips through:
1. Real-time machine operational parameters available remotely, capturing the signature of a malfunction including the related boundary conditions.
2. A real-time, analytics-based alerting system available remotely.
3. Remote access to trip logs and alarms from the control system to identify the cause of events.
4. Continuous support to field engineers by remotely connecting them with subject matter experts.
5. Live tracking of key CTQs.
6. Benchmarking against the fleet.
7. Breaking down the cause of failure to component level.
8. Investigating top contributors and identifying design and operational root causes.
9. Implementing corrective and preventive actions.
10. Assessing the effectiveness of implemented solutions using reliability growth models (see the sketch after this list).
11. Developing analytics for predictive maintenance.
With this approach, the Baker Hughes team is able to support customers in achieving their reliability key performance indicators for monitored units, with huge cost savings for plant operators. This presentation explains the approach while providing successful case studies, in particular where 12 LNG and pipeline operators with about 140 gas compression line-ups have adopted these techniques, significantly reducing the number of trips and improving MTBT.
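
As a hedged illustration of item 10 above, a minimal sketch that fits a Weibull distribution to synthetic times-between-trips and reports the implied MTBT (not Baker Hughes' actual models):

```python
# Illustrative only: Weibull fit on synthetic trip-interval data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
times_between_trips = rng.weibull(1.4, 60) * 900.0   # hours, synthetic

# Fix the location at zero so only shape and scale are estimated.
shape, loc, scale = stats.weibull_min.fit(times_between_trips, floc=0)
mtbt = stats.weibull_min.mean(shape, loc=loc, scale=scale)
print(f"fitted shape={shape:.2f}, scale={scale:.0f} h, MTBT={mtbt:.0f} h")
```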

Keywords: reliability, availability, sustainability, digital infrastructure, Weibull, effectiveness, automation, trips, failed starts

Procedia PDF Downloads 76
297 A Case Study: Social Network Analysis of Construction Design Teams

Authors: Elif D. Oguz Erkal, David Krackhardt, Erica Cochran-Hameen

Abstract:

Even though social network analysis (SNA) is an abundantly studied concept in many organizations and industries, a clear SNA approach to project teams has not yet been adopted by the construction industry. The main challenges for performing SNA in construction, and the apparent reasons for this gap, are the unique and complex structure of each construction project, the comparatively high circulation of project team members/contributing parties, and the variety of problems specific to each project. Additionally, stakeholders from a variety of professional backgrounds collaborate in a high-stress environment fueled by time and cost constraints. Within this case study on Project RE, a design-build project performed at the Urban Design Build Studio of Carnegie Mellon University, social network analysis of the project design team is performed with the main goal of applying social network theory to construction project environments. The research objective is to determine a correlation between the network of how individuals relate to each other, individuals' perceptions of their own professional strengths and weaknesses, the communication patterns within the team, and the group dynamics. Data is collected through a survey performed over four monthly rounds, detailed follow-up interviews, and constant observation to assess the natural alteration of the network over time. The data collected is processed by means of network analytics in the light of the qualitative data gathered through observations and individual interviews. This paper presents the full ethnography of this construction design team of fourteen architecture students, based on an elaborate social network data analysis over time. This study is expected to serve as an initial step towards refined, targeted, and large-scale social network data collection in construction projects in order to deduce the impacts of social networks on project performance and suggest better collaboration structures for construction project teams henceforth.
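
A minimal sketch of the kind of network analytics described above, using a hypothetical edge list and survey scores rather than the Project RE data:

```python
# Build one survey round's communication network and place centrality
# next to self-reported strength scores, mirroring the correlation
# question posed above. All values are invented.
import networkx as nx

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
G = nx.Graph(edges)

centrality = nx.betweenness_centrality(G)
self_reported_strength = {"A": 4, "B": 3, "C": 5, "D": 2, "E": 1}  # survey

for person in sorted(G):
    print(person, round(centrality[person], 2), self_reported_strength[person])
```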

Keywords: construction design teams, construction project management, social network analysis, team collaboration, network analytics

Procedia PDF Downloads 202
296 Game Structure and Spatio-Temporal Action Detection in Soccer Using Graphs and 3D Convolutional Networks

Authors: Jérémie Ochin

Abstract:

Soccer analytics are built on two data sources: the frame-by-frame position of each player on the pitch and the sequence of events, such as ball drive, pass, cross, shot, throw-in, etc. With more than 2000 ball-events per soccer game, their precise and exhaustive annotation, based on a monocular video stream such as a TV broadcast, remains a tedious and costly manual task. State-of-the-art methods for spatio-temporal action detection from a monocular video stream, often based on 3D convolutional neural networks, are close to reaching levels of performance in mean Average Precision (mAP) compatible with the automation of such a task. Nevertheless, to meet the expectation of exhaustiveness in the context of data analytics, such methods must be applied in a regime of high recall and low precision, using low confidence-score thresholds. This setting unavoidably leads to the detection of false positives that are the product of the well-documented overconfidence behaviour of neural networks and, in this case, their limited access to contextual information and understanding of the game: their predictions are highly unstructured. Based on the assumption that professional soccer players' behaviour, pose, positions, and velocity are highly interrelated and locally driven by the player performing a ball-action, it is hypothesised that adding information about surrounding players' appearance, positions, and velocity to the prediction methods can improve their metrics. Several methods are compared for building a proper representation of the game surrounding a player, from handcrafted features of the local graph, based on domain knowledge, to the use of Graph Neural Networks trained in an end-to-end fashion with existing state-of-the-art 3D convolutional neural networks. It is shown that the inclusion of information about surrounding players helps reach higher metrics.
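
A minimal sketch of the handcrafted local-graph representation discussed above, under the assumption that the context features are the relative positions and velocities of the k nearest surrounding players (the paper's exact feature set is not given):

```python
# For the player performing the action, gather relative position and
# velocity of the k nearest surrounding players as a context vector.
import numpy as np

def local_context_features(positions, velocities, actor_idx, k=4):
    """positions, velocities: (n_players, 2) arrays for one frame."""
    rel_pos = positions - positions[actor_idx]
    dists = np.linalg.norm(rel_pos, axis=1)
    # Nearest neighbours, excluding the actor itself (distance 0).
    neighbours = np.argsort(dists)[1:k + 1]
    rel_vel = velocities[neighbours] - velocities[actor_idx]
    return np.concatenate([rel_pos[neighbours].ravel(), rel_vel.ravel()])

pos = np.random.rand(22, 2) * [105, 68]   # pitch coordinates in metres
vel = np.random.randn(22, 2)
print(local_context_features(pos, vel, actor_idx=7).shape)  # (16,)
```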

Keywords: fine-grained action recognition, human action recognition, convolutional neural networks, graph neural networks, spatio-temporal action recognition

Procedia PDF Downloads 29
295 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of the personal data of natural persons, placing emphasis on General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as 'personal data' or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of combining Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects' rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, sometimes producing ambiguous outcomes that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects' rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of 'consent' as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 129
294 The Translational Fandom of Marvel Cinematic Universe in the Outlier of Chinese Television Culture

Authors: Xiao Yao

Abstract:

The escalating tech innovation in new media culture is liberating audiences from passive consumption to more productive and critical engagement with the legacy and streaming television media. However, how fan translation is furthering the reception and interpretation of global screen stories remains the outlier of television studies. This paper will showcase the fan-based cross-cultural engagement with the Marvel Cinematic Universe (MCU) in China. This is to highlight: 1) the ways marginal audiences (Chinese MCU fans) seek to sync with the recent telecinematic expansion of MCU; 2) the forensic and interpretative works done by Chinese MCU fans who persistently seek to amplify the pleasure of MCU content in their media contexts; 3) the crucial but largely unacknowledged cultural value generated by Chinese MCU fandom in the outlier of contemporary Chinese TV culture. Taken together, this study aims to further explore the notion of “translational fandom” and integrate its theorisation into the present research in television culture.

Keywords: Chinese MCU fans, cross-cultural engagement, Loki, television media, translational fandom

Procedia PDF Downloads 129
293 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability

Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli

Abstract:

The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and their related processing phases carry with them a large volume of data that is often not used to inform the final customer. Some of this data, if fittingly identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, buying models have changed: customers are attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability, and the need for food information. The Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the large number of actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies engaging in a traceability path, starting from business model analysis and the related business processes, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by the different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful to increase intra-company efficiency and competitiveness in the market. All the information recovered can be exposed through IT solutions and mobile applications, making it accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.

Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability

Procedia PDF Downloads 147
292 Cognitive Footprints: Analytical and Predictive Paradigm for Digital Learning

Authors: Marina Vicario, Amadeo Argüelles, Pilar Gómez, Carlos Hernández

Abstract:

In this paper, the Computer Research Network of the National Polytechnic Institute of Mexico proposes a paradigmatic model for the inference of cognitive patterns in digital learning systems. This model leads to a metadata architecture useful for analysis and prediction in online learning systems, especially in MOOC architectures. The model is in the design phase and is expected to be tested through an institutional course project that is going to be developed for the MOOC.

Keywords: cognitive footprints, learning analytics, predictive learning, digital learning, educational computing, educational informatics

Procedia PDF Downloads 478
291 Developing a Cloud Intelligence-Based Energy Management Architecture Facilitated with Embedded Edge Analytics for Energy Conservation in Demand-Side Management

Authors: Yu-Hsiu Lin, Wen-Chun Lin, Yen-Chang Cheng, Chia-Ju Yeh, Yu-Chuan Chen, Tai-You Li

Abstract:

Demand-Side Management (DSM) has the potential to reduce the electricity costs and carbon emissions associated with electricity use in modern society. A home Energy Management System (EMS), commonly used by residential consumers in the downstream sector of a smart grid to monitor, control, and optimize the energy efficiency of domestic appliances, is a system of computer-aided functionalities serving as an energy audit for residential DSM. Implementing fault detection and classification for the domestic appliances monitored, controlled, and optimized is one of the most important steps towards preventive maintenance, such as preventative maintenance of residential air conditioning and heating in residential/industrial DSM. In this study, a cloud intelligence-based green EMS built on an Internet of Things (IoT) technology stack for residential DSM is developed. In the EMS, Arduino MEGA Ethernet communication-based smart sockets, which include a Real-Time Clock chip that keeps track of the current time for timestamps via the Network Time Protocol, are designed and implemented to take readings of load phenomena reflected in the sensed voltage and current signals. Also, a Network-Attached Storage providing data access to a heterogeneous group of IoT clients via Hypertext Transfer Protocol (HTTP) methods is configured as the data store for parsed sensor readings. Lastly, a desktop computer with a WAMP software bundle (the Microsoft Windows operating system, Apache HTTP Server, the MySQL relational database management system, and the PHP programming language) serves as a data science analytics engine for a dynamic web app/RESTful web service of the residential DSM, providing globally accessible Artificial Intelligence (AI)/Computational Intelligence. An abstract computing machine, the Java Virtual Machine, enables the desktop computer to run Java programs, and a mash-up of Java, the R language, and Python is well suited and configured for the AI in this study. Having the ability to send real-time push notifications to IoT clients, the desktop computer implements Google-maintained Firebase Cloud Messaging to engage IoT clients across Android/iOS devices and provide a mobile notification service for residential/industrial DSM. In this study, in order to realize edge intelligence, whereby edge devices avoid network latency and the dependence on Internet connectivity for Internet of Services, support secure access to data stores, and provide immediate analytical and real-time actionable insights at the edge of the network, we upgrade the designed and implemented smart sockets to embedded AI Arduino ones (called embedded AIduino). To realize edge analytics with the proposed embedded AIduino, an Arduino Ethernet shield (WIZnet W5100) with a micro SD card connector is used. The SD library is included for reading parsed data from, and writing parsed data to, an SD card. An Artificial Neural Network library for the Arduino MEGA, ArduinoANN, is imported and used for the locally-embedded AI implementation. The embedded AIduino in this study can be developed further for applications in manufacturing-industry energy management and sustainable energy management, where, in sustainable energy management, rotating machinery diagnostics work to identify energy loss from gross misalignment and unbalance of rotating machines in power plants, as an example.

Keywords: demand-side management, edge intelligence, energy management system, fault detection and classification

Procedia PDF Downloads 251
290 Optimization of a High-Growth Investment Portfolio for the South African Market Using Predictive Analytics

Authors: Mia Françoise

Abstract:

This report aims to develop a strategy that assists short-term investors in benefiting from the current economic climate in South Africa by utilizing technical analysis techniques and predictive analytics. As part of this research, value investing and technical analysis principles are combined to maximize returns for South African investors while controlling volatility. As an emerging market, South Africa offers many opportunities for high growth in sectors where more developed countries cannot grow at the same rate. Investing in South African companies with significant growth potential can be extremely rewarding. Although the risk involved is greater in countries with less developed markets and infrastructure, there is also more room for growth. According to recent research, the offshore market is expected to outperform the local market over the long term; however, short-term investments in the local market will likely be more profitable, as the Johannesburg Stock Exchange is predicted to outperform the S&P 500 over the short term. Instabilities in the economy contribute to increased market volatility, which can benefit investors if appropriately utilized. Price prediction and portfolio optimization comprise the two primary components of this methodology. As part of this process, statistics and other predictive modeling techniques are used to predict the future performance of stocks listed on the Johannesburg Stock Exchange. Following the predictive data analysis, Modern Portfolio Theory, based on Markowitz's mean-variance theorem, is applied to optimize the allocation of assets within an investment portfolio. By combining different assets within an investment portfolio, this optimization method produces a portfolio with an optimal ratio of expected risk to expected return. By combining price prediction and portfolio optimization, this methodology aims to provide short-term investors with a stock portfolio that offers the best risk-to-return profile for stocks listed on the JSE.
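
A minimal sketch of the Markowitz mean-variance step referenced above, on synthetic returns rather than actual JSE data: find long-only weights minimising portfolio variance subject to a target expected return:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
returns = rng.normal(0.001, 0.02, size=(250, 5))     # daily, 5 stocks
mu, cov = returns.mean(axis=0), np.cov(returns.T)
target = mu.mean()                                   # required return

def variance(w):
    return w @ cov @ w

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1},        # fully invested
    {"type": "eq", "fun": lambda w: w @ mu - target},    # hit target return
]
result = minimize(variance, np.full(5, 0.2), bounds=[(0, 1)] * 5,
                  constraints=constraints)
print(result.x.round(3))    # minimum-variance long-only weights
```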

Keywords: financial stocks, optimized asset allocation, prediction modelling, South Africa

Procedia PDF Downloads 99
289 Participatory Culture and Value Perception Amongst the Korean and Chinese Drama International Fandom

Authors: Patricia P. M. C. Lourenco, Javier Bringué Sala, Anaisa D. A. de Sena

Abstract:

Almost everyone in Dramaland knows the names of the big Korean stars who grace their computer screens, scrolling through social media and video streaming platforms that make Korean dramas and lifestyle accessible at a click. A surface culture instilled with notions of belonging has redefined the meaning of friendship and challenged deep inner values. Not everyone, however, knows Chinese dramas or their stars, a consequence of Dramaland's focus on Korean dramas and on promoting the Korean experience. Despite parity in production quality, star power, scripts, and compelling visual settings, Chinese dramas have been playing catch-up to their famous counterparts. While they might hold strong competitive soft power for international drama fans, the soft power of Korean dramas is imbued with substantial societal values that they want to share with others. Those values are portrayed in an artistic way that connects with audiences who experience loneliness in the non-virtual world, contrary to the way Chinese dramas are perceived.

Keywords: Chinese dramas, fandom, Korean dramas, participatory culture, value perception, soft power, surface culture

Procedia PDF Downloads 170
288 A 5G Architecture Based on Dynamic Vehicular Clustering Enhancing VoD Services Over Vehicular Ad hoc Networks

Authors: Lamaa Sellami, Bechir Alaya

Abstract:

Nowadays, video-on-demand (VoD) applications are becoming one of the main trends driving vehicular network usage. In this paper, considering the unpredictable vehicle density, the unexpected acceleration or deceleration of the different cars in the vehicular traffic load, and the limited radio range of the employed communication scheme, we introduce the "Dynamic Vehicular Clustering" (DVC) algorithm as a new scheme for video streaming systems over VANETs. The proposed algorithm takes advantage of the concept of small cells and the introduction of wireless backhauls, inspired by the features and performance of the Long Term Evolution (LTE)-Advanced network. The proposed clustering algorithm considers multiple characteristics, such as the vehicle's position and acceleration, to reduce latency and packet loss. Each cluster is therefore counted as a small cell containing vehicular nodes and an access point that is elected according to particular specifications.
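
The abstract does not give the DVC rules in full; as an illustrative sketch, vehicles can be grouped by position and speed with a density-based clusterer (the features, scaling, and parameters below are assumptions):

```python
# Hypothetical position-and-kinematics clustering: group vehicles whose
# positions and speeds are close enough to share one small cell.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# Each row: [x_position_m, speed_m_s]; scaling keeps vehicles moving at
# similar velocity more likely to form a stable cluster.
vehicles = np.column_stack([rng.uniform(0, 2000, 40), rng.uniform(15, 35, 40)])
features = vehicles / [300.0, 5.0]   # ~radio range and speed tolerance

labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(features)
for cluster_id in set(labels):       # label -1 marks unclustered vehicles
    members = np.where(labels == cluster_id)[0]
    print("cluster", cluster_id, "->", len(members), "vehicles")
```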

Keywords: video-on-demand, vehicular ad-hoc network, mobility, vehicular traffic load, small cell, wireless backhaul, LTE-advanced, latency, packet loss

Procedia PDF Downloads 142
287 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs

Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili

Abstract:

OBJECTIVE/SCOPE: Caliper logging data provides critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data of eleven wells, with a total of more than 80,000 data points, were obtained and imported into a data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk density, and azimuthal density. The data of the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which were relatively high compared to the correlation coefficients of caliper data with other parameters. Other parameters, such as porosity, exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data are also presented to provide further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance and correlation of different LWD measurements with wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling using LWD data.
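
A minimal sketch of the correlation step, with a synthetic stand-in for the 80,000-point dataset (the hypothetical caliper here responds mainly to density, echoing the observation above):

```python
# Compute Pearson coefficients between each LWD curve and the measured
# caliper, then rank the candidate predictors. Data is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "bulk_density": rng.normal(2.4, 0.1, n),
    "gamma_ray": rng.normal(75, 20, n),
    "porosity": rng.normal(0.15, 0.05, n),
})
# Hypothetical caliper driven mainly by density.
df["caliper_max"] = 12 - 2.5 * (df["bulk_density"] - 2.4) + rng.normal(0, 0.2, n)

corr = df.corr(method="pearson")["caliper_max"].drop("caliper_max")
print(corr.abs().sort_values(ascending=False))
```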

Keywords: LWD measurements, caliper log, correlations, analysis

Procedia PDF Downloads 122
286 Kuwait Environmental Remediation Program: Waste Management Data Analytics for Planning and Optimization of Waste Collection

Authors: Aisha Al-Baroud

Abstract:

The United Nations Compensation Commission (UNCC), the Kuwait National Focal Point (KNFP), and the Kuwait Oil Company (KOC) cooperated in a joint project to undertake comprehensive and collaborative efforts to remediate 26 million m3 of crude-oil-contaminated soil that resulted from the Gulf War in 1990/1991. These efforts are referred to as the Kuwait Environmental Remediation Program (KERP). KOC has developed a Total Remediation Solution (TRS) for KERP to guide the remediation projects; it comprises alternative remedial solutions with treatment techniques, includes limited landfills for the disposal of non-treatable soil materials, and relies on treating certain ranges of Total Petroleum Hydrocarbon (TPH) contamination with the most appropriate remediation techniques. The KERP remediation projects will be implemented within KOC's oilfields in North and South East Kuwait. The objective of this remediation project is to clear land for field development and treat all the oil-contaminated features (dry oil lakes, wet oil lakes, and oil-contaminated piles) through the TRS plan to optimize the treatment processes and minimize the volume of contaminated materials to be placed into landfills. The treatment strategy comprises Excavation and Transportation (E&T) of oil-contaminated soils from contaminated land to remote treatment areas and the use of appropriate remediation technologies, or a combination of treatment technologies, to achieve the remediation target criteria (RTC). KOC has awarded five mega projects to achieve this and is currently in the execution phase. As part of the company's commitment to the environment and in fulfillment of the mandatory HSSEMS procedures, all remediation contractors need to report waste generation data from the various project activities on a monthly basis. Data on waste generation is collected in order to implement cost-efficient and sustainable waste management operations. Data analytics approaches can be built on top of this data to produce more detailed and timely waste generation information as the basis of waste management and collection. The results obtained highlight the potential of advanced data analytics approaches in producing more detailed waste generation information for planning and optimization of waste collection and recycling.

Keywords: waste, technologies, KERP, data, soil

Procedia PDF Downloads 113
285 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage

Authors: Andrew Laming, John Hattie, Mark Wilson

Abstract:

Nine Queensland independent high schools provided deidentified student-matched ATAR and NAPLAN data for all 1,217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean=27) and any ATAR graduates without NAPLAN data (mean=20). Based on Index of Community Socio-Educational Advantage (ICSEA) predictions, all schools had larger than predicted proportions of their students graduating with ATARs. An additional 173 students (14%) did not release their ATARs to their school, requiring this data to be inferred by schools. Gain was established by first converting each student's strongest NAPLAN domain to a statewide percentile, then subtracting this result from the final ATAR. The resulting 'percentile shift' was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R2=0.58). RESULTS: School mean NAPLAN scores fitted ICSEA closely (R2=0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN-decile students gaining, rising to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (or 6.2 percentiles prior to correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA-1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). Mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests that consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relationship revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R2 was 0.33. DISCUSSION: Standardised data like NAPLAN and ATAR offer educators a simple, no-cost progression metric to analyse performance in conjunction with their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive to laypeople. Findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, 'crashed' ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither value-add nor a growth percentile. In the absence of exit NAPLAN testing, this metric is unable to discriminate academic gain from legitimate ATAR-maximising strategies. But by controlling for ICSEA, ATAR proportion variation, and student mobility, it uncovers progression-to-ATAR metrics that are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data is available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects.
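
A minimal sketch of the percentile-shift metric described above, on synthetic data (the statewide reference distribution and cohort figures are invented):

```python
# Convert each student's strongest NAPLAN domain score to a statewide
# percentile and subtract it from the final ATAR rank.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
cohort = pd.DataFrame({
    "best_naplan": rng.normal(600, 60, 100),   # strongest-domain score
    "atar": np.clip(rng.normal(80, 10, 100), 30, 99.95),
})
# Assumed statewide reference distribution for the same domain.
state_scores = rng.normal(560, 70, 50_000)

cohort["naplan_pct"] = [
    (state_scores < s).mean() * 100 for s in cohort["best_naplan"]
]
cohort["gain"] = cohort["atar"] - cohort["naplan_pct"]
print(f"mean shift: {cohort['gain'].mean():+.1f} percentiles, "
      f"share gaining: {(cohort['gain'] > 0).mean():.0%}")
```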

Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean

Procedia PDF Downloads 68
284 An Adaptive Virtual Desktop Service in Cloud Computing Platform

Authors: Shuen-Tai Wang, Hsi-Ya Chang

Abstract:

Cloud computing has become more and more mature over the last few years, and consequently the demand for better cloud services is increasing rapidly. One of the research topics for improving cloud services is desktop computing in virtualized environments. This paper aims at the development of an adaptive virtual desktop service on a cloud computing platform, based on our previous research on virtualization technology. We implement cloud virtual desktop and application software streaming technology that make it possible to provide Virtual Desktop as a Service (VDaaS). Given the development of remote desktop virtualization, the user's desktop can shift from the traditional PC environment to a cloud-enabled environment, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for online cloud service. Users no longer need to bear the burden of platform maintenance, which drastically reduces the overall cost of hardware and software licenses. Moreover, this flexible remote desktop service represents the next significant step towards the mobile workplace, letting users access their desktop environments from virtually anywhere.

Keywords: cloud computing, virtualization, virtual desktop, VDaaS

Procedia PDF Downloads 286
283 Technological Approach in Question Formation for Assessment of Interviewees

Authors: S. Shujan, A. T. Rupasinghe, N. L. Gunawardena

Abstract:

Numerous studies have determined that there is a direct correlation between a successful interviewee and that person's nonverbal behavioral patterns during the interview. In this study, we focus on forming interview questions in such a way that they create an opportunity to assess the interviewee through their answers using nonverbal behavioral cues. Of all the nonverbal behavioral factors we have identified, this study gives priority to 'facial expression variations', with the assistance of a facial expression analytics tool. This research proposes a novel approach to question formation for the assessment of interviewees in the software industry.

Keywords: assessments, hirability, interviews, non-verbal behaviour patterns, question formation

Procedia PDF Downloads 319
282 Wealth Creation and its Externalities: Evaluating Economic Growth and Corporate Social Responsibility

Authors: Zhikang Rong

Abstract:

The 4th industrial revolution has introduced technologies like interconnectivity, machine learning, and real-time big data analytics that improve operations and business efficiency. This paper examines how these advancements have led to a concentration of wealth, specifically among the top 1%, and investigates whether this wealth provides value to society. Through analyzing impacts on employment, productivity, supply-demand dynamics, and potential externalities, it is shown that successful businesspeople, by enhancing productivity and creating jobs, contribute positively to long-term economic growth. Additionally, externalities such as environmental degradation are managed by social entrepreneurship and government policies.

Keywords: wealth creation, employment, productivity, social entrepreneurship

Procedia PDF Downloads 33
281 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make widespread network access available. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 312
280 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are examined in various interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and point to potential research areas.

Keywords: trust, data mining, CRISP-DM, stakeholder management

Procedia PDF Downloads 94
279 The Analyzer: Clustering Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human Computer Interaction

Authors: Dona Shaini Abhilasha Nanayakkara, Kurugamage Jude Pravinda Gregory Perera

Abstract:

E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on enabling businesses to customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points from users' natural interactions without imposing additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful distinction of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced, personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty, and ultimately drive sales.
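
The pipeline implied by the keywords (standardisation, dimensionality reduction, clustering) can be sketched as follows; the five analytics here are hypothetical stand-ins for the features TheAnalyzer actually collects:

```python
# Standardise five behavioural analytics, reduce to two components,
# and cluster users into behavioural groups. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
# Rows: users; columns: five assumed analytics (e.g. session length,
# pages per visit, cart adds, searches, return frequency).
X = rng.gamma(shape=2.0, scale=1.5, size=(500, 5))

pipeline = make_pipeline(StandardScaler(), PCA(n_components=2),
                         KMeans(n_clusters=4, n_init=10, random_state=0))
labels = pipeline.fit_predict(X)
print(np.bincount(labels))   # users per behavioural group
```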

Keywords: data clustering, data standardization, dimensionality reduction, human computer interaction, user profiling

Procedia PDF Downloads 75
278 A Survey on Requirements and Challenges of Internet Protocol Television Service over Software Defined Networking

Authors: Esmeralda Hysenbelliu

Abstract:

Over the last years, the demand for high-bandwidth services, such as live TV (IPTV service) and on-demand video streaming, has steadily and rapidly increased. It has been predicted that video traffic (IPTV, VoD, and web TV) will account for more than 90% of the global Internet Protocol traffic crossing the globe in 2016. Consequently, the requirements and challenges that service providers face today in supporting users' requests for entertainment video across the various IPTV services through virtualization over Software Defined Networks (SDN) deserve the highest level of attention. What is required is to deliver optimized live and on-demand services such as IPTV with low cost and good quality while strictly fulfilling the essential requirements of clients and ISPs (Internet Service Providers) at the same time. The aim of this study is to present an overview of the important requirements and challenges of IPTV service, together with two network trends for solving these challenges through virtualization (SDN and Network Function Virtualization). This paper provides an overview of research published in the last five years.

Keywords: challenges, IPTV service, requirements, software defined networking (SDN)

Procedia PDF Downloads 272
277 Coupling of Two Discretization Schemes for the Lattice Boltzmann Equation

Authors: Tobias Horstmann, Thomas Le Garrec, Daniel-Ciprian Mincu, Emmanuel Lévêque

Abstract:

Despite the efficiency and low dissipation of the stream-collide formulation of the Lattice Boltzmann (LB) algorithm, which is nowadays implemented in many commercial LBM solvers, there are certain situations, e.g. mesh transition, in which a classical finite-volume or finite-difference formulation of the LB algorithm still bears advantages. In this paper, we present an algorithm that combines the node-based streaming of the distribution functions with a second-order finite-volume discretization of the advection term of the BGK-LB equation on a uniform D2Q9 lattice. It is shown that such a coupling is possible for a multi-domain approach as long as the overlap, or buffer zone, between two domains spans at least 2Δx. This also implies that a direct coupling (without a buffer zone) of a stream-collide and a finite-volume LB algorithm on a single grid is not stable. The critical parameter in the coupling is the CFL number equal to 1 that is imposed by the stream-collide algorithm. Nevertheless, an explicit filtering step on the finite-volume domain can stabilize the solution. In a further investigation, we demonstrate how such a coupling can be used for mesh transition, resulting in an intrinsic conservation of mass over the interface.
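
For orientation, a minimal D2Q9 stream-collide sketch (BGK collision, periodic boundaries) illustrating the node-based streaming that the paper couples with a finite-volume discretization; parameters are illustrative only, not the paper's setup:

```python
import numpy as np

c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])      # lattice velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)                # lattice weights
tau, nx, ny = 0.8, 64, 64                               # relaxation, grid

def equilibrium(rho, u):
    cu = np.einsum("qd,xyd->xyq", c, u)
    usq = np.einsum("xyd,xyd->xy", u, u)[..., None]
    return rho[..., None] * w * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny, 2)))
for _ in range(100):
    rho = f.sum(axis=-1)                                 # density moment
    u = np.einsum("xyq,qd->xyd", f, c) / rho[..., None]  # velocity moment
    f += (equilibrium(rho, u) - f) / tau                 # BGK collide
    for q in range(9):                                   # node-based stream
        f[:, :, q] = np.roll(f[:, :, q], shift=c[q], axis=(0, 1))
print(rho.mean())   # mass is conserved: stays at 1.0
```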

Keywords: algorithm coupling, finite volume formulation, grid refinement, Lattice Boltzmann method

Procedia PDF Downloads 380