Search results for: well data integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26773

24343 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET

Authors: Tyler T. Procko, Steve Collins

Abstract:

New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now involve comparatively unnecessary steps which compromise system performance. This work posits that the established ORM (Object-Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: Simplicity, Speed and Security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a white box, whereas traditional methods are black boxes. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
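
To make the idea concrete (this is a language-agnostic illustration of the approach, not the CODA library or its API), the sketch below asks SQL Server to serialize query results itself via FOR JSON PATH and consumes them through pyodbc in Python, so no per-column mapping code is needed; the connection string, table, and column names are hypothetical.

```python
import json
import pyodbc  # assumes a SQL Server ODBC driver is installed

# Hypothetical connection string and schema; adjust for your environment.
CONN_STR = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
            "DATABASE=Shop;Trusted_Connection=yes;")

def fetch_orders_as_json(customer_id: int) -> list:
    """Let the database produce JSON itself (FOR JSON PATH, SQL Server 2016+),
    avoiding any object-relational mapping layer on the client side."""
    query = (
        "SELECT OrderId, OrderDate, Total "
        "FROM dbo.Orders WHERE CustomerId = ? "
        "FOR JSON PATH"
    )
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(query, customer_id).fetchall()
    # Long FOR JSON output may be split over several rows; concatenate before parsing.
    document = "".join(row[0] for row in rows if row[0])
    return json.loads(document) if document else []

if __name__ == "__main__":
    print(fetch_orders_as_json(42))
```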

Keywords: API data access, database, JSON, .NET core, SQL server

Procedia PDF Downloads 66
24342 Blockchain for IoT Security and Privacy in Healthcare Sector

Authors: Umair Shafique, Hafiz Usman Zia, Fiaz Majeed, Samina Naz, Javeria Ahmed, Maleeha Zainab

Abstract:

The Internet of Things (IoT) has become a hot topic over the last couple of years. This innovative technology has shown promising progress in various areas, and the world has witnessed exponential growth in multiple application domains. Researchers are working to investigate its aptitudes and harness its true potential. At the same time, however, IoT networks open up a new aspect of vulnerability and physical threats to data integrity, privacy, and confidentiality. This is due to centralized control, a data-silo approach to handling information, and a lack of standardization in IoT networks. Blockchain is a new technology that involves creating secure distributed ledgers to store and communicate data. Some of its benefits include resiliency, integrity, anonymity, decentralization, and autonomous control. The potential for blockchain technology to provide the key to managing and controlling IoT has created a new wave of excitement around the idea of putting data back into the hands of end-users. In this manuscript, we propose a model that combines blockchain and IoT networks to address potential security and privacy issues in the healthcare domain. We then describe various application areas, challenges, and future directions in the healthcare sector where blockchain platforms merge with IoT networks.

Keywords: IoT, blockchain, cryptocurrency, healthcare, consensus, data

Procedia PDF Downloads 180
24341 Enhancing Academic and Social Skills of Elementary School Students with Autism Spectrum Disorder by an Intensive and Comprehensive Teaching Program

Authors: Piyawan Srisuruk, Janya Boonmeeprasert, Romwarin Gamlunglert, Benjamaporn Choikhruea, Ornjira Jaraepram, Jarin Boonsuchat, Sakdadech Singkibud, Kusalaporn Chaiudomsom, Chanatiporn Chonprai, Pornchanaka Tana, Suchat Paholpak

Abstract:

Objective: To develop an intensive and comprehensive program (ICP) for the inclusive class teacher (ICPICT) to teach elementary students (ES) with ASD in order to enhance the students’ academic and social skills (ASS), and to study the effect of the teaching program. Methods: The purposive sample included 15 Khon Kaen inclusive class teachers and their 15 elementary students. All the students were diagnosed by a child and adolescent psychiatrist with DSM-5 level 1 ASD. The study tools included: 1) an ICP to teach teachers about ASD, a teaching method to enhance academic and social skills for ES with ASD, and an assessment tool to assess the teachers’ knowledge before and after the ICP; 2) an ICPICT to teach ES with ASD to enhance their ASS, delivered in 10 sessions of 3 hours each, with its own teaching structure and teaching media including pictures, storytelling, songs, and plays. The authors taught and demonstrated to the participating teachers how to teach with the ICPICT until the participants could display the correct teaching method, after which the teachers taught the ICPICT at school by themselves; 3) an assessment tool to assess the students’ ASS before and after the completion of the study. The ICP to teach the teachers, the ICPICT, and the relevant assessment tools were developed by the authors and adjusted until three experts in curricula for teaching children with ASD agreed by consensus that they were appropriate for the research. The data were analyzed by descriptive and analytic statistics via SPSS version 26. Results: After the training, the teachers’ mean score of knowledge of ASD and of how to teach ES with ASD on ASS increased, though not with statistical significance (p = 0.13). Teaching ES with ASD with the ICPICT increased the mean scores of the students’ skills in learning and expressing social emotions, relationships with friends, transitioning, and academic function by 3.33, 2.27, 2.94, and 3.00 points, respectively (full scores were 18, 12, 15 and 12; paired t-test p = 0.007, 0.013, 0.028 and 0.003, respectively). Conclusion: A program teaching academic and social skills simultaneously in an intensive and comprehensive structure could enhance both the academic and social skills of elementary students with ASD.

Keywords: academic and social skills, students with autism, intensive and comprehensive teaching program

Procedia PDF Downloads 64
24340 Science of Social Work: Recognizing Its Existence as a Scientific Discipline by a Method Triangulation

Authors: Sandra Mendes

Abstract:

Over time, Social Work has encountered varied demands in its field of action, drawing on frameworks of knowledge and praxis. Over the years, we have observed a transformation of society and, consequently, of the public with whom social work practitioners deal. Both training and the profession have had to adapt and readapt their ways of doing, linking theories to action, while action in turn gives rise to new theories. The theoretical questioning of this subject draws on classical authors from the social sciences and contemporary authors of Social Work. In fact, both emphasize, in the design of social work, a function of integration and social cohesion, creating a culture of action and theory and attributing to its method a relevant function, which shall promote social change in various dimensions of both individual and collective life, as well as scientific knowledge. On the other hand, it is assumed that Social Work, through its professionalism and through the academy, is now closer to distinguishing itself from the other Social Sciences as an autonomous scientific field, while being at the center of power struggles. This paper seeks to fill the gap in the social work literature about the study of the scientific field of this area of knowledge.

Keywords: field theory, knowledge, science, social work

Procedia PDF Downloads 355
24339 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning

Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan

Abstract:

We propose to record Activities of Daily Living (ADLs) of elderly people using a vision-based system so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with help from non-elderly subjects in laboratory environments, and the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets for the application, we recorded ADLs for the elderly with data collected from a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we chose to collect data in the room of an elderly person. Specifically, we installed Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the elderly person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms and propose the use of a transfer learning algorithm for HAR. We compared the performance in terms of accuracy and training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes, such as evidence-based diagnosis and treatment.
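
A minimal sketch of the transfer-learning idea (not the HARELCARE framework itself), assuming Keras/TensorFlow and a hypothetical folder of labelled activity frames: a network pre-trained on ImageNet is reused as a frozen feature extractor, and only a small classification head is trained on the 12 morning-activity classes.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_ACTIVITIES = 12  # morning activities identified in the study

# Hypothetical dataset directory of labelled frames (one sub-folder per activity).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "adl_frames/train", image_size=(224, 224), batch_size=32)

# Pre-trained backbone reused as a frozen feature extractor (transfer learning).
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
backbone.trainable = False

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),            # MobileNetV2 expects inputs in [-1, 1]
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_ACTIVITIES, activation="softmax"),   # new task-specific head
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```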

Keywords: daily activity recognition, healthcare, IoT sensors, transfer learning

Procedia PDF Downloads 132
24338 Research on Greenway System Planning of Mountainous City: A Case Study of Chengkou County, Chongqing

Authors: Youping Huang, Yang Liu

Abstract:

Mountainous cities have unique landscape relationship, topography and urban spatial pattern different from plain cities, which put forward different requirements for greenway system planning strategy. Taking the greenway planning of Chengkou County in Chongqing as an example, this paper discusses the greenway system planning strategy of mountainous cities based on urban and rural green space, urban landscape resources, human resources and other factors. Through multi-angle maintenance of landscape pattern, multi-objective integration of urban resources, multi-level construction of greenway network, and multi-interactive development control, the sustainable development of mountain city landscape resources is realized, the new urban ecology is constructed, and the quality of life of urban and rural residents is improved.

Keywords: greenway planning, mountain city, landscape pattern, cultural resources, chongqing

Procedia PDF Downloads 102
24337 Proposal for Knowledge-Based Virtual Community System (KBVCS) for Enhancing Knowledge Sharing in Mechatronics System Diagnostic and Repair

Authors: Adetoba B. Tiwalola, Adedeji W. Oyediran, Yekini N. Asafe, Akinwole A. Kikelomo

Abstract:

Mechatronics is the synergistic integration of mechanical engineering with electronics and intelligent computer control in the design and manufacturing of industrial products and processes. An automobile (a wheeled motor vehicle that carries its own engine or motor and is used for transporting passengers) is a mechatronic system that serves as a major means of transportation around the world. Virtually every community has a need for automobiles, which makes issues related to automobile diagnostics and repair relevant to all communities. Given the diversity of skills in diagnosing automobile faults, of approaches to solving problems, and of innovation in the automobile industry, it is appropriate to say that automobile repair and diagnostics would be better enhanced if communities had the opportunity to share knowledge and ideas globally. This paper discusses the desirable elements of the automobile as a mechatronic system and presents a conceptual framework of a virtual community model for knowledge sharing among automobile users.

Keywords: automobile, automobile users, knowledge sharing, mechatronics system, virtual community

Procedia PDF Downloads 440
24336 Design and Implementation of Security Middleware for Data Warehouse Signature Framework

Authors: Mayada Al Meghari

Abstract:

Recently, grid middleware has provided large-scale integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using the middleware in our DWS framework is to achieve high performance through parallel computing. This middleware is developed on the Alchemi.Net framework to increase security among the network nodes through an authentication and group-key distribution model. This model achieves key security and prevents intermediary attacks in the middleware. This paper presents the flow process structures of the middleware design. In addition, the paper describes the implementation of security for the DWS middleware enhanced with the authentication and group-key distribution model. Finally, from the analysis of other middleware approaches, the developed middleware of the DWS framework offers the most complete coverage of the security issues considered.
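
As a generic illustration of group-key distribution (not the specific model built on Alchemi.Net in this work), the Python sketch below has a coordinator generate one symmetric group key and wrap it individually for each authenticated node using that node's pre-shared key; the library choice and all key names are assumptions.

```python
from cryptography.fernet import Fernet

# Hypothetical pre-shared keys established when each node authenticates to the middleware.
node_keys = {"node-A": Fernet.generate_key(), "node-B": Fernet.generate_key()}

def distribute_group_key(node_keys: dict) -> tuple:
    """Generate one symmetric group key and wrap it separately for every
    authenticated node, so only group members can recover it."""
    group_key = Fernet.generate_key()
    wrapped = {node: Fernet(key).encrypt(group_key) for node, key in node_keys.items()}
    return group_key, wrapped

group_key, wrapped_keys = distribute_group_key(node_keys)

# Each node unwraps the group key with its own pre-shared key.
recovered = Fernet(node_keys["node-A"]).decrypt(wrapped_keys["node-A"])
assert recovered == group_key
```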

Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance

Procedia PDF Downloads 119
24335 Contactless Heart Rate Measurement System Based on FMCW Radar and LSTM for Automotive Applications

Authors: Asma Omri, Iheb Sifaoui, Sofiane Sayahi, Hichem Besbes

Abstract:

Future vehicle systems demand advanced capabilities, notably in-cabin life detection and driver monitoring systems, with a particular emphasis on drowsiness detection. To meet these requirements, several techniques employ artificial intelligence methods based on real-time vital sign measurements. In parallel, Frequency-Modulated Continuous-Wave (FMCW) radar technology has garnered considerable attention in the domains of healthcare and biomedical engineering for non-invasive vital sign monitoring. FMCW radar offers a multitude of advantages, including its non-intrusive nature, continuous monitoring capacity, and its ability to penetrate through clothing. In this paper, we propose a system utilizing the AWR6843AOP radar from Texas Instruments (TI) to extract precise vital sign information. The radar allows us to estimate Ballistocardiogram (BCG) signals, which capture the mechanical movements of the body, particularly the ballistic forces generated by heartbeats and respiration. These signals are rich sources of information about the cardiac cycle, rendering them suitable for heart rate estimation. The process begins with real-time subject positioning, followed by clutter removal, computation of Doppler phase differences, and the use of various filtering methods to accurately capture subtle physiological movements. To address the challenges associated with FMCW radar-based vital sign monitoring, including motion artifacts due to subjects' movement or radar micro-vibrations, Long Short-Term Memory (LSTM) networks are implemented. LSTM's adaptability to different heart rate patterns and ability to handle real-time data make it suitable for continuous monitoring applications. Several crucial steps were taken, including feature extraction (involving amplitude, time intervals, and signal morphology), sequence modeling, heart rate estimation through the analysis of detected cardiac cycles and their temporal relationships, and performance evaluation using metrics such as Root Mean Square Error (RMSE) and correlation with reference heart rate measurements. For dataset construction and LSTM training, a comprehensive data collection system was established, integrating the AWR6843AOP radar, a heart rate belt, and a smart watch for ground truth measurements. Rigorous synchronization of these devices ensured data accuracy. Twenty participants engaged in various scenarios, encompassing indoor and real-world conditions within a moving vehicle equipped with the radar system. Static and dynamic subject conditions were considered. The heart rate estimation through LSTM outperforms traditional signal processing techniques that rely on filtering, Fast Fourier Transform (FFT), and thresholding. It delivers an average accuracy of approximately 91% with an RMSE of 1.01 beats per minute (bpm). In conclusion, this paper underscores the promising potential of FMCW radar technology integrated with artificial intelligence algorithms in the context of automotive applications. This innovation not only enhances road safety but also paves the way for its integration into the automotive ecosystem to improve driver well-being and overall vehicular safety.
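
A minimal Keras sketch of the sequence-modelling stage described above, assuming the radar pipeline has already produced fixed-length windows of BCG-derived features; the layer sizes, feature dimension, and placeholder data are illustrative, not the authors' exact configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, N_FEATURES = 200, 3  # e.g. amplitude, inter-beat interval, morphology score (assumed)

# Placeholder data: windows of radar-derived features and reference heart rates (bpm).
x_train = np.random.rand(500, SEQ_LEN, N_FEATURES).astype("float32")
y_train = np.random.uniform(55, 110, size=(500, 1)).astype("float32")

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, N_FEATURES)),
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    layers.Dense(16, activation="relu"),
    layers.Dense(1),  # regressed heart rate in bpm
])

model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError(name="rmse")])
model.fit(x_train, y_train, epochs=20, batch_size=32, validation_split=0.2)
```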

Keywords: ballistocardiogram, FMCW Radar, vital sign monitoring, LSTM

Procedia PDF Downloads 72
24334 Sentiment Classification of Documents

Authors: Swarnadip Ghosh

Abstract:

Sentiment analysis is the process of detecting the contextual polarity of text. In other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when vast amounts of information are stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic as well as medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that, in the classification, we have not used the independence assumption, which is made by many procedures such as Naive Bayes. This makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use the empirical distribution for estimation, but developed a method based on the degree of close clustering of the data points. We have applied our algorithm to a movie review data set obtained from IMDb and obtained satisfactory results.
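
For orientation only, a baseline document-polarity pipeline on IMDb-style review text is sketched below with scikit-learn; note that this is a conventional classifier, not the clustering-based estimator proposed in this abstract, and the toy reviews are placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Toy data: raw review texts with polarity labels (1 = positive, 0 = negative).
reviews = [
    "a gripping, beautifully acted film",
    "one of the best movies I have seen this year",
    "dull plot and wooden dialogue",
    "a complete waste of two hours",
]
labels = [1, 1, 0, 0]

pipeline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # document features
    LogisticRegression(max_iter=1000),     # polarity classifier
)

# Cross-validated accuracy, echoing the abstract's use of cross validation.
scores = cross_val_score(pipeline, reviews, labels, cv=2)
print("mean accuracy:", scores.mean())
```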

Keywords: sentiment, Run's Test, cross validation, higher dimensional pmf estimation

Procedia PDF Downloads 402
24333 Corporate Governance and Bank Performance: A Study of Selected Deposit Money Banks in Nigeria

Authors: Ayodele Ajayi, John Ajayi

Abstract:

This paper investigates the effect of corporate governance with a view to determining the relationship between board size and bank performance. Data for the study were obtained from the audited financial statements of five sampled banks listed on the Nigerian Stock Exchange. A panel data technique was adopted, and the analysis was carried out with the use of multiple regression and pooled ordinary least squares. Results from the study show that the larger the board size, the greater the profit, implying that corporate governance is positively correlated with bank performance.

Keywords: corporate governance, banks performance, board size, pooled data

Procedia PDF Downloads 360
24332 Empowering a New Frontier in Heart Disease Detection: Unleashing Quantum Machine Learning

Authors: Sadia Nasrin Tisha, Mushfika Sharmin Rahman, Javier Orduz

Abstract:

Machine learning is applied in a variety of fields throughout the world. The healthcare sector has benefited enormously from it. One of the most effective approaches for predicting human heart disease is to use machine learning applications to classify data and predict the outcome as a classification. However, with the rapid advancement of quantum technology, quantum computing has emerged as a potential game-changer for many applications. Quantum algorithms have the potential to execute substantially faster than their classical equivalents, which can lead to significant improvements in computational performance and efficiency. In this study, we applied quantum machine learning concepts to predict coronary heart disease from text data. We ran three experiments with three different feature sets. The data set consisted of 100 data points. We pursue a comparative analysis of the classical and quantum approaches, highlighting the potential benefits of quantum machine learning for predicting heart disease.
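
A minimal quantum-kernel classification sketch in the spirit of the QSVM keyword, assuming the qiskit-machine-learning package (its QSVC, ZZFeatureMap, and FidelityQuantumKernel APIs); the toy data is a stand-in for the heart-disease features, not the study's dataset.

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Toy stand-in for heart-disease features (e.g. age, cholesterol, max heart rate), scaled to [0, 1].
rng = np.random.default_rng(0)
X = rng.random((40, 3))
y = (X.sum(axis=1) > 1.5).astype(int)   # placeholder label rule

feature_map = ZZFeatureMap(feature_dimension=3, reps=2)   # encodes features into a quantum state
kernel = FidelityQuantumKernel(feature_map=feature_map)   # kernel value = state-overlap (fidelity)
qsvc = QSVC(quantum_kernel=kernel)                        # SVM trained on the quantum kernel

qsvc.fit(X[:30], y[:30])
print("test accuracy:", qsvc.score(X[30:], y[30:]))
```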

Keywords: quantum machine learning, SVM, QSVM, matrix product state

Procedia PDF Downloads 94
24331 Integration of Fuzzy Logic in the Representation of Knowledge: Application in the Building Domain

Authors: Hafida Bouarfa, Mohamed Abed

Abstract:

The main object of our work is the development and validation of a system called Fuzzy Vulnerability. Fuzzy Vulnerability uses a fuzzy representation in order to tolerate imprecision in the description of constructions. In the second phase, we evaluate the similarity between the vulnerability of a new construction and those of the whole set of historical cases. This similarity is evaluated on two levels: 1) individual similarity, based on fuzzy aggregation techniques; 2) global similarity, which uses regular increasing monotone (RIM) linguistic quantifiers to combine the various individual similarities between two constructions. The third phase of the Fuzzy Vulnerability process consists in using the vulnerabilities of historical constructions closely similar to the current construction to deduce its estimated vulnerability. We validated our system using 50 cases. We evaluated the performance of Fuzzy Vulnerability on the basis of two criteria: the precision of the estimates and the tolerance of imprecision along the estimation process. The comparison was made with estimates produced by laborious and time-consuming models. The results are satisfactory.
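
A small numeric sketch of how a RIM linguistic quantifier can drive an ordered weighted averaging (OWA) combination of individual similarities, as in the global-similarity step; the quantifier exponent and similarity values below are illustrative, not taken from the system.

```python
import numpy as np

def rim_quantifier(r: np.ndarray, alpha: float = 2.0) -> np.ndarray:
    """Regular increasing monotone quantifier Q(r) = r**alpha ('most'-like for alpha > 1)."""
    return np.clip(r, 0.0, 1.0) ** alpha

def owa_with_quantifier(similarities, alpha: float = 2.0) -> float:
    """Combine individual similarities into a global similarity via OWA weights
    w_i = Q(i/n) - Q((i-1)/n), applied to the similarities sorted in descending order."""
    s = np.sort(np.asarray(similarities, dtype=float))[::-1]
    n = len(s)
    i = np.arange(1, n + 1)
    weights = rim_quantifier(i / n, alpha) - rim_quantifier((i - 1) / n, alpha)
    return float(np.dot(weights, s))

# Individual fuzzy similarities between a new construction and one historical case.
print(owa_with_quantifier([0.9, 0.75, 0.6, 0.4]))
```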

Keywords: case based reasoning, fuzzy logic, fuzzy case based reasoning, seismic vulnerability

Procedia PDF Downloads 292
24330 Blockchain’s Feasibility in Military Data Networks

Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam

Abstract:

Communication security is of particular interest to military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing dataflow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design’s ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.

Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing

Procedia PDF Downloads 138
24329 Verification & Validation of MapReduce Program Model for Parallel K-Medoid Algorithm on Hadoop Cluster

Authors: Trapti Sharma, Devesh Kumar Srivastava

Abstract:

This paper is an analysis study whose aim is to verify and validate the MapReduce solution model for the parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model which enables the processing of huge amounts of data in parallel on a large number of machines. It is especially well suited to constant or moderately changing sets of data, since the setup cost of a job is usually high. MapReduce has gradually become the framework of choice for “big data”. The MapReduce model allows for the systematic and rapid processing of large-scale data on a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications, such as wordcount, grep, terasort, and the parallel K-Medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
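
As a point of reference for the wordcount application mentioned above, a minimal MapReduce job is sketched below using the Python mrjob library (an assumption made for illustration; the study itself runs native Hadoop jobs). It can be launched locally or against a Hadoop cluster with the `-r hadoop` runner.

```python
from mrjob.job import MRJob

class WordCount(MRJob):
    """Classic MapReduce word count: map emits (word, 1), reduce sums the counts."""

    def mapper(self, _, line):
        for word in line.split():
            yield word.lower(), 1

    def reducer(self, word, counts):
        yield word, sum(counts)

if __name__ == "__main__":
    WordCount.run()  # e.g. python wordcount.py -r hadoop hdfs:///input/*.txt
```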

Keywords: hadoop, mapreduce, k-medoid, validation, verification

Procedia PDF Downloads 369
24328 An Improved K-Means Algorithm for Gene Expression Data Clustering

Authors: Billel Kenidra, Mohamed Benmohammed

Abstract:

Data mining techniques used in the field of clustering are a subject of active research and assist in biological pattern recognition and the extraction of new knowledge from raw data. Clustering means the act of partitioning an unlabeled dataset into groups of similar objects. Each group, called a cluster, consists of objects that are similar to one another and dissimilar to objects of other groups. Several clustering methods are based on partitional clustering. This category attempts to directly decompose the dataset into a set of disjoint clusters, leading to an integer number of clusters that optimizes a given criterion function. The criterion function may emphasize a local or a global structure of the data, and its optimization is an iterative relocation procedure. The K-Means algorithm is one of the most widely used partitional clustering techniques. Since K-Means is extremely sensitive to the initial choice of centers, and a poor choice of centers may lead to a local optimum that is quite inferior to the global optimum, we propose a strategy for initializing the K-Means centers. The improved K-Means algorithm is compared with the original K-Means, and the results show that the efficiency is significantly improved.
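
To make the sensitivity to initialization concrete, the following scikit-learn sketch compares purely random center initialization with the k-means++ seeding heuristic on synthetic data; it illustrates the general issue, not the specific initialization strategy proposed in the paper.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic stand-in for a gene-expression matrix (samples x expression features).
X, _ = make_blobs(n_samples=300, centers=5, n_features=20, random_state=0)

for init in ("random", "k-means++"):
    # n_init=1 exposes how much a single initialization matters.
    km = KMeans(n_clusters=5, init=init, n_init=1, random_state=0).fit(X)
    # Lower inertia (within-cluster sum of squares) means a better local optimum.
    print(f"{init:10s} inertia = {km.inertia_:.1f}")
```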

Keywords: microarray data mining, biological pattern recognition, partitional clustering, k-means algorithm, centroid initialization

Procedia PDF Downloads 190
24327 "Revolutionizing Geographic Data: CADmapper's Automated Precision in CAD Drawing Transformation"

Authors: Toleen Alaqqad, Kadi Alshabramiy, Suad Zaafarany, Basma Musallam

Abstract:

CADmapper is a significant software tool for transforming geographic data into realistic CAD drawings. It speeds up and simplifies the conversion process by automating it. This allows architects, urban planners, engineers, and geographic information system (GIS) experts to concentrate solely on the imaginative and scientific parts of their projects. While the future incorporation of AI has the potential for further improvements, CADmapper's current capabilities make it an indispensable asset in the business. It covers a combination of 2D and 3D city and urban area models. The user can select a specific square section of the map to view, and the fee is based on the dimensions of the area being viewed. The procedure is straightforward: you choose the area you want, pick whether or not to include topography and 3D architectural data (if available), and then select the design program or CAD format in which to publish the document. More than 200 broad town plans are available for free in DXF format, and if you wish to specify a bespoke area, it is free up to 1 km².

Keywords: cadmapper, geographic data, 2d and 3d data conversion, automated cad drawing, urban planning software

Procedia PDF Downloads 68
24326 Analysis on the Development and Evolution of China’s Territorial Spatial Planning

Authors: He YuanYan

Abstract:

In recent years, China has implemented the reform of territorial spatial planning. As an important public policy, territorial spatial planning plays a vital role in the construction and development of cities. Territorial spatial planning throughout the country is in full swing, but there are still many disputes from all walks of life. The content, scope, and specific implementation process of territorial spatial planning are also ambiguous, leading to multi-plan-integration problems such as unclear authority, unclear responsibilities, and poor planning results during implementation. Therefore, it is necessary to sort out the development and evolution of domestic and foreign territorial spatial planning, clarify the problems and their cruxes in the current situation of China's territorial spatial planning, and sort out the obstacles to and countermeasures for the implementation of this policy, so as to deepen the understanding of the connotation of territorial spatial planning. It is of great practical significance for planners to correctly understand and clarify the specific contents and methods of territorial spatial planning and to smoothly promote its implementation at all levels.

Keywords: territorial spatial planning, public policy, land space, overall planning

Procedia PDF Downloads 131
24325 Low-Noise Amplifier Design for Improvement of Communication Range for Wake-Up Receiver Based Wireless Sensor Network Application

Authors: Ilef Ketata, Mohamed Khalil Baazaoui, Robert Fromm, Ahmad Fakhfakh, Faouzi Derbel

Abstract:

The integration of wireless communication, e.g., in real- or quasi-real-time applications, is related to many challenges such as energy consumption, communication range, latency, quality of service, and reliability. To minimize latency without increasing energy consumption, wake-up receiver (WuRx) nodes have been introduced in recent works. Low-noise amplifiers (LNAs) are introduced to improve WuRx sensitivity but increase the supply current severely. Different WuRx approaches exist with always-on, power-gated, or duty-cycled receiver designs. This paper presents a comparative study for improving the communication range and decreasing the energy consumption of wireless sensor nodes.

Keywords: wireless sensor network, wake-up receiver, duty-cycled, low-noise amplifier, envelope detector, range study

Procedia PDF Downloads 112
24324 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models

Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur

Abstract:

In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages the Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. This sensor data is then transmitted to a central MQTT server, ensuring reliable and low-latency communication even in bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data is processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. The implementation achieves 99% accuracy.
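
A minimal sketch of the sensor-to-server MQTT leg described above, assuming the paho-mqtt Python client (1.x-style constructor); the broker address, topic name, and JSON payload fields are illustrative placeholders, not the system's actual configuration.

```python
import json
import random
import time
import paho.mqtt.client as mqtt

BROKER, PORT = "broker.example.org", 1883   # hypothetical MQTT broker
TOPIC = "farm/field-3/telemetry"            # hypothetical per-field topic

client = mqtt.Client()                      # paho-mqtt 1.x-style constructor assumed
client.connect(BROKER, PORT, keepalive=60)
client.loop_start()                         # background network loop

while True:
    reading = {                             # stand-in for real sensor values
        "temperature_c": round(random.uniform(18, 35), 1),
        "humidity_pct": round(random.uniform(30, 90), 1),
        "soil_moisture_pct": round(random.uniform(10, 60), 1),
    }
    # QoS 1 gives at-least-once delivery, a common choice on lossy rural links.
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(10)
```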

Keywords: IoT, MQTT protocol, machine learning, sensor, publish, subscriber, agriculture, humidity

Procedia PDF Downloads 68
24323 Web Development in Information Technology with Javascript, Machine Learning and Artificial Intelligence

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

Online developers now have the tools necessary to create online apps that are not only reliable but also highly interactive, thanks to the introduction of JavaScript frameworks and APIs. The objective is to give a broad overview of the recent advances in the area. The fusion of machine learning (ML) and artificial intelligence (AI) has expanded the possibilities for web development. Modern websites now include chatbots, clever recommendation systems, and customization algorithms built in. In the rapidly evolving landscape of modern websites, it has become increasingly apparent that user engagement and personalization are key factors for success. To meet these demands, websites now incorporate a range of innovative technologies. One such technology is chatbots, which provide users with instant assistance and support, enhancing their overall browsing experience. These intelligent bots are capable of understanding natural language and can answer frequently asked questions, offer product recommendations, and even help with troubleshooting. Moreover, clever recommendation systems have emerged as a powerful tool on modern websites. By analyzing user behavior, preferences, and historical data, these systems can intelligently suggest relevant products, articles, or services tailored to each user's unique interests. This not only saves users valuable time but also increases the chances of conversions and customer satisfaction. Additionally, customization algorithms have revolutionized the way websites interact with users. By leveraging user preferences, browsing history, and demographic information, these algorithms can dynamically adjust the website's layout, content, and functionalities to suit individual user needs. This level of personalization enhances user engagement, boosts conversion rates, and ultimately leads to a more satisfying online experience. In summary, the integration of chatbots, clever recommendation systems, and customization algorithms into modern websites is transforming the way users interact with online platforms. These advanced technologies not only streamline user experiences but also contribute to increased customer satisfaction, improved conversions, and overall website success.

Keywords: Javascript, machine learning, artificial intelligence, web development

Procedia PDF Downloads 80
24322 Extracting Opinions from Big Data of Indonesian Customer Reviews Using Hadoop MapReduce

Authors: Veronica S. Moertini, Vinsensius Kevin, Gede Karya

Abstract:

Customer reviews have been collected by many kinds of e-commerce websites selling products, services, hotel rooms, tickets and so on. Each website collects its own customer reviews. The reviews can be crawled, collected from those websites and stored as big data. Text analysis techniques can be used to analyze that data to produce summarized information, such as customer opinions. These opinions can then be published by independent service provider websites and used to help customers in choosing the most suitable products or services. As the opinions are analyzed from big data of reviews originating from many websites, it is expected that the results are more trusted and accurate. Indonesian customers write reviews in the Indonesian language, which comes with its own structures and uniqueness. We found that most of the reviews are expressed in “daily language”, which is informal, does not follow correct grammar, and has many abbreviations and slang or non-formal words. Hadoop is an emerging platform aimed at storing and analyzing big data in distributed systems. A Hadoop cluster consists of master and slave nodes/computers operated in a network. Hadoop comes with a distributed file system (HDFS) and the MapReduce framework for supporting parallel computation. However, MapReduce has a weakness (i.e., it is inefficient) for iterative computations; specifically, the cost of reading/writing data (I/O cost) is high. Given this fact, we conclude that the MapReduce function is best adapted for “one-pass” computation. In this research, we develop an efficient technique for extracting or mining opinions from big data of Indonesian reviews, which is based on MapReduce with one-pass computation. In designing the algorithm, we avoid iterative computation and instead adopt a “look-up table” technique. The stages of the proposed technique are: (1) crawling the review data from websites; (2) cleaning and finding root words from the raw reviews; (3) computing the frequency of the meaningful opinion words; (4) analyzing customer sentiments towards defined objects. The experiments for evaluating the performance of the technique were conducted on a Hadoop cluster with 14 slave nodes. The results show that the proposed technique (stages 2 to 4) discovers useful opinions, is capable of processing big data efficiently, and is scalable.
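
A minimal mrjob-style sketch of the one-pass, look-up-table idea (stages 2–3): the root-word table is loaded once per mapper rather than recomputed iteratively, so each review line is read exactly once; the table file, its contents, and the library choice are assumptions for illustration only.

```python
import json
from mrjob.job import MRJob

class OpinionWordCount(MRJob):
    """One-pass job: map each token to its root word via a pre-built look-up
    table, then count how often each meaningful opinion word occurs."""

    def mapper_init(self):
        # Hypothetical look-up table: surface/slang form -> root opinion word.
        with open("root_words.json", encoding="utf-8") as fh:
            self.root_table = json.load(fh)   # e.g. {"bgs": "bagus", "mantul": "mantap"}

    def mapper(self, _, review_line):
        for token in review_line.lower().split():
            root = self.root_table.get(token)
            if root is not None:              # keep only meaningful opinion words
                yield root, 1

    def reducer(self, root_word, counts):
        yield root_word, sum(counts)

if __name__ == "__main__":
    OpinionWordCount.run()
```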

Keywords: big data analysis, Hadoop MapReduce, analyzing text data, mining Indonesian reviews

Procedia PDF Downloads 201
24321 Assessment of the CSR of Telecom Operators in Cote d’Ivoire

Authors: Odile Amoncou, Djedje-Kossu Zahui

Abstract:

The integration of a Corporate Social Responsibility (CSR) approach within a company appears nowadays as a fundamental response to the different problems that threaten our planet. The abusive exploitation of natural resources, social inequalities, discrimination and poverty are some examples. Thus, faced with these global problems, each company must include in its operating system measures or actions with the aim not only of achieving the Sustainable Development Goals (SDGs) but also of improving its performance and its brand internationally. The objective of this article is to assess the implementation of CSR by telecommunication companies. It is our belief that, given its high energy consumption and proximity to society, the telecom sector must pay particular attention to environmental and social issues. Our study examines the CSR of three Ivorian telecom operators, namely ORANGE CI, MOOV Africa and MTN, by applying a series of performance indicators related to CSR management. We hope that our study will raise awareness about sustainability issues among other Ivorian companies, and sub-Saharan African companies in general, in order to encourage CEOs to make the CSR concept a top priority.

Keywords: CSR, telecom, SDGs, cote d’Ivoire

Procedia PDF Downloads 80
24320 Lanthanide-Mediated Aggregation of Glutathione-Capped Gold Nanoclusters Exhibiting Strong Luminescence and Fluorescence Turn-on for Sensing Alkaline Phosphatase

Authors: Jyun-Guo You, Wei-Lung Tseng

Abstract:

Herein, this study presents a synthetic route for producing highly luminescent AuNCs based on the integration of two concepts: thiol-induced luminescence enhancement of ligand-insufficient GSH-AuNCs and Ce3+-induced aggregation of GSH-AuNCs. The synthesis of GSH-AuNCs was conducted by modifying a previously reported procedure. To produce more Au(I)-GSH complexes on the surface of the ligand-insufficient GSH-AuNCs, extra GSH is added to attach onto the AuNC surface. The formed ligand-sufficient GSH-AuNCs (LS-GSH-AuNCs) emit relatively strong luminescence. The luminescence of the LS-GSH-AuNCs is further enhanced by the coordination of the two carboxylic groups of GSH (pKa1 = 2 and pKa2 = 3.5) with lanthanide ions, which induces the self-assembly of the LS-GSH-AuNCs. As a result, the quantum yield of the self-assembled LS-GSH-AuNCs (SA-AuNCs) was improved to 13%. Interestingly, the SA-AuNCs were disassembled into LS-GSH-AuNCs in the presence of adenosine triphosphate (ATP) because of the formation of ATP-lanthanide ion complexes. Our assay was employed to detect alkaline phosphatase (ALP) activity over the range of 0.1−10 U/mL with a limit of detection (LOD) of 0.03 U/mL.

Keywords: self-assembly, lanthanide ion, adenosine triphosphate, alkaline phosphatase

Procedia PDF Downloads 170
24319 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm: k-means for clustering numeric datasets and k-modes for categorical datasets. A main problem encountered in data mining applications is clustering categorical datasets, which are highly prevalent. One way to achieve the clustering process on categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of the k-modes. In this paper, we propose an approach that transforms the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
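
A compact pandas/scikit-learn sketch of the encoding idea: each categorical value is replaced by the relative frequency of that modality within its attribute, after which the standard k-means algorithm is applied; the toy data frame is illustrative, not a dataset from the paper.

```python
import pandas as pd
from sklearn.cluster import KMeans

# Toy categorical dataset (illustrative modalities only).
df = pd.DataFrame({
    "color": ["red", "red", "blue", "green", "blue", "red"],
    "shape": ["circle", "square", "square", "circle", "circle", "square"],
})

# Replace every modality by its relative frequency within the attribute.
encoded = df.apply(lambda col: col.map(col.value_counts(normalize=True)))

# Standard k-means now applies directly to the numeric encoding.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(encoded)
print(encoded.assign(cluster=labels))
```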

Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means

Procedia PDF Downloads 259
24318 Early-Warning Lights Classification Management System for Industrial Parks in Taiwan

Authors: Yu-Min Chang, Kuo-Sheng Tsai, Hung-Te Tsai, Chia-Hsin Li

Abstract:

This paper presents the early-warning lights classification management system for industrial parks promoted by the Taiwan Environmental Protection Administration (EPA) since 2011, including the definition of each early-warning light, the objectives, the action program, and the accomplishments. All of the 151 industrial parks in Taiwan were classified into four early-warning lights, namely red, orange, yellow and green, for carrying out the respective pollution management according to the monitoring data of soil and groundwater quality, regulatory compliance, and regulatory listing as a control site or remediation site. The Taiwan EPA set up a priority list of industrial parks with high pollution potential and investigated their soil and groundwater quality based on the results of the light classification and pollution potential assessment. In 2011-2013, 44 industrial parks were selected and different investigations were carried out, such as the establishment of early-warning groundwater well networks and pollution investigation/verification for the red- and orange-light industrial parks, and the environmental background survey for the yellow-light industrial parks. Among them, 22 industrial parks were newly or repeatedly confirmed to have pollutant concentrations exceeding the soil or groundwater pollution control standards. Thus, further investigation, groundwater use restriction, listing as a pollution control site or remediation site, and pollutant isolation measures were implemented by the local environmental protection and industry competent authorities, and the early-warning lights of those industrial parks were proposed to be adjusted up to orange or red. Up to the present, a preliminary positive effect of the soil and groundwater quality management system for industrial parks has been noticed in several aspects, such as environmental background information collection, early warning of pollution risk, pollution investigation and control, information integration and application, and inter-agency collaboration. Finally, the work and goal of self-initiated quality management of industrial parks will be carried out on the basis of inter-agency collaboration, through the classified early-warning and management lights system as well as the regular announcement of the status of each industrial park.

Keywords: industrial park, soil and groundwater quality management, early-warning lights classification, SOP for reporting and treatment of monitored abnormal events

Procedia PDF Downloads 326
24317 Lock in, Lock Out: A Double Lens Analysis of Local Media Paywall Strategies and User Response

Authors: Mona Solvoll, Ragnhild Kr. Olsen

Abstract:

Background and significance of the study: Newspapers are going through radical changes, with increased competition, eroding readerships and declining advertising resulting in plummeting overall revenues. This has led to a quest for new business models focusing on monetizing content. This research paper investigates both how local online newspapers have introduced user payment and how the audience has received these changes. Given the role of local media in keeping their communities informed and those in power accountable, and their potential impact on civic engagement and cultural integration in local communities, the business model innovations of local media deserve far more research interest. Empirically, the findings are interesting for local journalists, local media managers as well as local advertisers. Basic methodologies: The study is based on interviews with commercial leaders in 20 Norwegian local newspapers, in addition to national survey data from 1600 respondents among local media users. The interviews were conducted in the second half of 2015, while the survey was conducted in September 2016. Theoretically, the study draws on the business model framework. Findings: The analysis indicates that paywalls aim more at reducing digital cannibalisation of print revenue than at creating new digital income. The newspapers are mostly concerned with retaining “old” print subscribers and transforming them into digital subscribers. However, this strategy may come at a high price for newspapers if their defensive print strategy drives away younger digital readership and hampers their recruitment potential for new audiences, as some previous studies have indicated. Analysis of young readers' news habits indicates that attracting the younger audience to traditional local news providers is particularly challenging and that they are more prone to seek alternative news sources than the older audience is. Conclusion: The paywall strategy applied by the local newspapers may be well suited to stabilising print subscription figures and facilitating more tailored and better services for already existing customers, but far less suited to attracting new ones. The paywall is a short-sighted strategy which drives away younger readers and paves the road for substitute offerings, particularly Facebook.

Keywords: business model, newspapers, paywall, user payment

Procedia PDF Downloads 277
24316 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two types, namely the reflective model and the formative model. Before carrying out further tests on SEM, there are assumptions that must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric and semiparametric approaches. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data, which serve as the basis for generating simulation data. Simulation data were generated with various sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of relationship studied were linear and quadratic, with one and two knot points and various levels of error variance (EV = 0.5, 1, 5). Three levels of closeness of relationship were used in the analysis of the measurement model: low (0.1-0.3), medium (0.4-0.6) and high (0.7-0.9). The best model is obtained for the linear form of the relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model is obtained, namely that the higher the closeness of the relationship, the better the resulting model. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
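
For readers unfamiliar with the truncated-spline terminology, the sketch below builds the linear truncated-spline basis used in such semiparametric components, f(x) = b0 + b1*x + sum_k c_k*(x - K_k)+, and fits it by least squares; the knot location and data are illustrative, not the study's settings.

```python
import numpy as np

def truncated_linear_basis(x: np.ndarray, knots) -> np.ndarray:
    """Design matrix [1, x, (x - K1)+, (x - K2)+, ...] for a linear truncated spline."""
    cols = [np.ones_like(x), x]
    cols += [np.maximum(x - k, 0.0) for k in knots]   # (x - K)+ truncated terms
    return np.column_stack(cols)

# Illustrative data with a change of slope around x = 0.5 (one knot point).
rng = np.random.default_rng(1)
x = np.sort(rng.random(100))
y = 1.0 + 2.0 * x + 3.0 * np.maximum(x - 0.5, 0.0) + rng.normal(0, 0.1, size=x.size)

X = truncated_linear_basis(x, knots=[0.5])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # least-squares fit of the spline coefficients
print("estimated coefficients:", np.round(coef, 2))
```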

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 40
24315 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S), implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to provide users with information about the past, present and future climate of the earth. It allows easy and free access to climate data and indicators, presenting an important asset for scientists and stakeholders on the path towards a more sustainable future. The C3S Evaluation and Quality Control (EQC) function assesses the quality of the CDS by undertaking a comprehensive user requirement assessment to measure user satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified regarding the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function of the CDS will help C3S to make the service more robust: integrating validated data that follows high-quality standards while remaining user-friendly. This function will be developed in close collaboration with the users of the service. Through their feedback, suggestions, and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews or workshops, as well as different feedback mechanisms like surveys or helpdesk services at the CDS. The results provided by the users will be categorized as a function of CDS products so that their specific interests will be monitored and linked to the right product. Through this procedure, we will identify the requirements and criteria for data and products in order to build the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 146
24314 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review

Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring

Abstract:

An electric water heater (EWH) is a powerful appliance that uses electricity in residential, commercial, and industrial settings, and the ability to control such appliances properly will result in cost savings and the prevention of blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of method, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected based on specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, the mixed method is used much less than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is about equal.

Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data

Procedia PDF Downloads 82