Search results for: data utilization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25829

25199 Oil Logistics for Refining to Northern Europe

Authors: Vladimir Klepikov

Abstract:

To develop programs for supplying crude oil to North European refineries, it is necessary to take into account the refineries' location, crude refining capacity, and the transport infrastructure capacity. Among the countries of the region, we include those having a marine boundary along the North Sea and the Baltic Sea (from France in the west to Finland in the east). The paper presents the geographic allocation of the refineries and contains an evaluation of the refineries' capacities for the region under review. The sustainable operation of refineries in the region is determined by the transportation system's capacity to supply crude oil to them. An assessment of the capacity for transporting crude oil to the refineries is conducted. The research covers the period 2005-2015 and uses the quantitative analysis method. The countries are classified by the refineries' aggregate capacities and the crude oil output on their territory. The crude oil output capacities in the region in the period under review are determined, and the capacities of the region's transportation system to supply regionally produced crude oil to the refineries are revealed. The analysis suggests that imported raw materials are the main source of oil for the refineries in the region. The main sources of crude oil supplies to North European refineries are reviewed. The change in the refineries' capacities in the group of countries and in each particular country, as well as the utilization of the refineries' capacities in the region in the period under review, was studied. The data suggest that the bulk of crude oil is supplied by marine and pipeline transport. The paper contains an assessment of the share of pipeline transport in the overall crude oil cargo flow. The refineries' production rate for the groups of countries under review and for each particular country was also studied. Our study revealed a trend towards increased crude oil refining at the refineries of the region and reduced crude oil output. If this trend persists in the near future, the cargo flow of imported crude oil and the utilization of the North European logistics infrastructure may increase. According to the study, the existing transport infrastructure in the region is able to handle the increasing imported crude oil flow.

Keywords: European region, infrastructure, oil terminal capacity, pipeline capacity, tanker draft

Procedia PDF Downloads 160
25198 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models

Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi

Abstract:

This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and an IoT sensory network. Our approach addresses the environmental factors essential for preserving a plant’s well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients such as nitrogen, phosphorus, and potassium. Central to our methodology is the use of computer vision, particularly a night vision camera. The captured image data is compared against a reference database containing different health statuses. This comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By combining the sensor readings with this AI-based decision-making approach, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.
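The abstract does not specify the network architecture or the matching procedure; as a hedged illustration of the comparison step only, the sketch below uses a pretrained ResNet-18 from torchvision purely as a feature extractor and matches a captured image against a small reference database of health statuses by cosine similarity. The status names, image sizes, and choice of backbone are assumptions for illustration, not the authors' model.

```python
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained CNN used purely as a feature extractor (torchvision >= 0.13 API).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()                     # drop the classification head
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(img: Image.Image) -> torch.Tensor:
    """Return an L2-normalized 512-d feature vector for one plant image."""
    x = preprocess(img.convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return torch.nn.functional.normalize(backbone(x), dim=1).squeeze(0)

def random_image(seed):
    """Stand-in for a real photo; replace with Image.open(<camera frame>)."""
    rng = np.random.default_rng(seed)
    return Image.fromarray(rng.integers(0, 256, (240, 320, 3), dtype=np.uint8))

# Hypothetical reference database: one representative image per health status.
reference = {name: embed(random_image(i))
             for i, name in enumerate(["healthy", "water_stressed", "nutrient_poor"])}

def score_health(img):
    """Match a captured image to the closest reference status by cosine similarity."""
    q = embed(img)
    return max(reference.items(), key=lambda kv: float(q @ kv[1]))[0]

print("estimated plant status:", score_health(random_image(99)))
```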

Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control

Procedia PDF Downloads 28
25197 The Batch Method Approach for Adsorption Mechanism Processes of Some Selected Heavy Metal Ions and Methylene Blue by Using Chemically Modified Luffa Cylindrica

Authors: Akanimo Emene, Mark D. Ogden, Robert Edyvean

Abstract:

Adsorption is a low-cost, efficient, and economically viable wastewater treatment process. However, this treatment process has not been fully applied because the adsorption system is complex and not fully understood. Optimizing the process requires choosing a suitable adsorbent and further studying the experimental parameters that influence the design of the adsorption system. A chemically modified adsorbent, Luffa cylindrica, was used to adsorb heavy metal ions and an organic pollutant, methylene blue, from aqueous environmental solutions under varying experimental conditions. The experimental factors studied were adsorption time, initial metal ion or organic pollutant concentration, ionic strength, and solution pH. The experimental data were analyzed with kinetic and isotherm models. The antagonistic effect between methylene blue and some heavy metal ions was recorded. An understanding of the use of this treated Luffa cylindrica for the removal of these toxic substances will establish and improve the commercial application of the adsorption process in the treatment of contaminated waters.

Keywords: adsorption, heavy metal ions, Luffa cylindrica, wastewater treatment

Procedia PDF Downloads 182
25196 Data Access, AI Intensity, and Scale Advantages

Authors: Chuping Lo

Abstract:

This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries, which inherently have greater data resources, tend to have higher incomes than smaller countries, and the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.

Keywords: digital intensity, digital divide, international trade, economies of scale

Procedia PDF Downloads 55
25195 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in an image for data embedding after encryption. In this paper, we propose a novel method that reserves space in the image, surrounded by a boundary, before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency by a factor of ten compared to the other processes discussed.
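As a hedged illustration of the embedding step only (not the authors' full reserve-room-before-encryption scheme), the sketch below writes a payload into the least significant bits of a reserved image region and recovers it losslessly; the region layout and payload are hypothetical.

```python
import numpy as np

def embed_bits(image, bits, region):
    """Embed a bit string into the least significant bits of a reserved region.

    image  : 2-D uint8 array (grayscale cover image)
    bits   : iterable of 0/1 values to hide
    region : (row_slice, col_slice) marking the space reserved before encryption
    """
    stego = image.copy()
    block = stego[region].ravel()
    if len(bits) > block.size:
        raise ValueError("reserved region too small for payload")
    for i, b in enumerate(bits):
        block[i] = (block[i] & 0xFE) | b          # overwrite the LSB
    stego[region] = block.reshape(stego[region].shape)
    return stego

def extract_bits(stego, n_bits, region):
    """Recover the first n_bits hidden in the reserved region."""
    block = stego[region].ravel()
    return [int(block[i] & 1) for i in range(n_bits)]

# Toy demonstration on random data.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
payload = [1, 0, 1, 1, 0, 0, 1, 0]
region = (slice(0, 8), slice(0, 8))               # top-left 8x8 block reserved
stego = embed_bits(cover, payload, region)
assert extract_bits(stego, len(payload), region) == payload
```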

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 403
25194 Identity Verification Using k-NN Classifiers and Autistic Genetic Data

Authors: Fuad M. Alkoot

Abstract:

DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We use gene data that were initially collected for autism detection and examine whether, and how accurately, these data can be used for identification applications. Our main goal is to find out whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and the classification rate remains close to optimal as the noise standard deviation increases up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
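The actual gene data and preprocessing are not described in the abstract; the sketch below reproduces the evaluation idea on synthetic feature vectors: a 1-NN classifier is trained on a few reference samples per subject, and probe samples are corrupted with zero-mean Gaussian noise of increasing standard deviation. All sizes and distributions are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in for preprocessed genetic feature vectors:
# 50 subjects, 3 reference samples each, 200 features per sample.
n_subjects, n_refs, n_features = 50, 3, 200
templates = rng.normal(0.0, 5.0, size=(n_subjects, n_features))
X_train = np.repeat(templates, n_refs, axis=0)
y_train = np.repeat(np.arange(n_subjects), n_refs)

knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# Corrupt the probe (test) samples with zero-mean Gaussian noise of
# increasing standard deviation and record the identification rate.
for sigma in (0.5, 1.0, 2.0, 3.0, 5.0):
    X_test = templates + rng.normal(0.0, sigma, size=templates.shape)
    rate = knn.score(X_test, np.arange(n_subjects))
    print(f"noise sigma = {sigma}: identification rate = {rate:.2f}")
```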

Keywords: biometrics, genetic data, identity verification, k nearest neighbor

Procedia PDF Downloads 243
25193 A Review on Intelligent Systems for Geoscience

Authors: R. Palson Kennedy, P. Kiran Sai

Abstract:

This article introduces machine learning (ML) researchers to the hurdles that geoscience problems present, as well as the opportunities for improvement in both ML and the geosciences. To meet this need, it presents a review from the data life cycle perspective. Numerous facets of the geosciences present unique difficulties for the study of intelligent systems. Geoscience data are notoriously difficult to analyze since they are frequently unpredictable, intermittent, sparse, multi-resolution, and multi-scale. The first half addresses data science’s essential concepts and theoretical underpinnings, while the second section covers key themes and shared experiences from current publications focused on each stage of the data life cycle. Finally, themes such as open science, smart data, and team science are considered.

Keywords: data science, intelligent system, machine learning, big data, data life cycle, recent development, geoscience

Procedia PDF Downloads 125
25192 Investigations on Utilization of Chrome Sludge, Chemical Industry Waste, in Cement Manufacturing and Its Effect on Clinker Mineralogy

Authors: Suresh Vanguri, Suresh Palla, Prasad G., Ramaswamy V., Kalyani K. V., Chaturvedi S. K., Mohapatra B. N., Sunder Rao TBVN

Abstract:

The utilization of industrial waste materials and by-products in the cement industry helps in the conservation of natural resources besides avoiding the problems arising due to waste dumping. The use of non-carbonate materials as raw mix components in clinker manufacturing is identified as one of the key areas to reduce greenhouse gas (GHG) emissions. Chrome sludge is a waste material generated from the manufacturing process of sodium dichromate. This paper aims to present studies on the use of chrome sludge in clinker manufacturing and its impact on the development of clinker mineral phases and on the cement properties. Chrome sludge was found to contain substantial amounts of CaO, Fe2O3 and Al2O3 and therefore was used to replace some conventional sources of alumina and iron in the raw mix. Different mixes were prepared by varying the chrome sludge content from 0 to 5%, and the mixes were evaluated for burnability. Laboratory-prepared clinker samples were evaluated for qualitative and quantitative mineralogy using X-ray Diffraction (XRD) studies. Optical microscopy was employed to study the distribution of clinker phases, their granulometry and mineralogy. Since chrome sludge also contains considerable amounts of chromium, studies were conducted on the leachability of heavy elements in the chrome sludge as well as in the resultant cement samples. Estimation of heavy elements, including chromium, was carried out using ICP-OES. Further, the state of chromium valence, Cr (III) & Cr (VI), was studied using conventional chemical analysis methods coupled with UV-VIS spectroscopy. Assimilation of chromium in the clinker phases was investigated using SEM-EDXA studies. Bulk cement was prepared from the clinker to study the effect of chromium sludge on the cement properties such as setting time, soundness, and strength development, against the control cement. Studies indicated that chrome sludge can be successfully utilized, and its content needs to be optimized based on raw material characteristics.

Keywords: chrome sludge, leaching, mineralogy, non-carbonate materials

Procedia PDF Downloads 203
25191 Waste Identification Diagrams Effectiveness: A Case Study in the Manaus Industrial Pole

Authors: José Dinis-Carvalho, Levi Guimarães, Celina Leão, Rui Sousa, Rosa Eliza Vieira, Larissa Thomaz, Kelliane Guerreiro

Abstract:

This research paper investigates the efficacy of waste identification diagrams (WIDs) as a tool for waste reduction and management within the Manaus Industrial Pole. The study focuses on assessing the practical application and effectiveness of WIDs in identifying, categorizing, and mitigating various forms of waste generated across industrial processes. Employing a mixed-methods approach, including a qualitative questionnaire applied to 5 companies and quantitative data analysis with SPSS statistical software, the research evaluates the implementation and impact of WIDs on waste reduction practices in select industries within the Manaus Industrial Pole. The findings contribute to understanding the utility of WIDs as a proactive strategy for waste management, offering insights into their potential for fostering sustainable practices and promoting environmental stewardship in industrial settings. The study also discusses challenges, best practices, and recommendations for optimizing the utilization of WIDs in industrial waste management, thereby addressing the broader implications for sustainable industrial development.

Keywords: waste identification diagram, value stream mapping, overall equipment effectiveness, lean manufacturing

Procedia PDF Downloads 36
25190 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, can increase data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which helps clarify roles and responsibilities and improves data quality. Additionally, data mesh can contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by providing better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated with experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 122
25189 Construction Strategy of Urban Public Space in Driverless Era

Authors: Yang Ye, Hongfei Qiu, Yaqi Li

Abstract:

The planning and construction of traditional cities are car-oriented, which leads to problems of insufficient, fragmented, and inefficiently used urban public space. With the development of driverless technology, the urban structure will change from the traditional single-core grid structure to a multi-core model. In terms of traffic organization, as land currently devoted to traffic facilities is released, public space will become more continuous and more integrated with traffic space. In the context of driverless technology, the reconstruction of urban public space is characterized by modularization and high efficiency, and its planning and layout follow a structure of points (service facilities), lines (smart lines), and surfaces (activity centers). The public space of driverless urban roads will provide diversified urban public facilities and services. An intensive urban layout allows commercial public space to serve central activities in the interior (the building atrium) and style display on the exterior (the building periphery). In addition to its recreational function, urban green space can also use underground parking space for efficient dispatching of shared cars. The roads inside residential communities will be integrated into the urban landscape, providing conditions for community public activity space that changes over time and improving the efficiency of space utilization. The intervention of driverless technology will change the thinking behind traditional urban construction and make it human-oriented. As a result, urban public space will be richer, more connected, and more efficient, and urban spatial justice will be improved. By summarizing frontier research, this paper discusses the impact of driverless technology on cities, especially on urban public space, which helps landscape architects cope with the future development and changes of the industry and provides a reference for related research and practice.

Keywords: driverless, urban public space, construction strategy, urban design

Procedia PDF Downloads 100
25188 Bio-Nano Mask: Antivirus and Antimicrobial Mouth Mask Coating with Nano-TiO2 and Anthocyanin Utilization as an Effective Solution of High ARI Patients in Riau

Authors: Annisa Ulfah Pristya, Andi Setiawan

Abstract:

Indonesia ranks sixth in the world in the total number of Acute Respiratory Infection (ARI) patients, and Riau is one of the provinces with the highest number of people with respiratory infections in Indonesia, reaching 37 thousand people. Masks are commonly used as a preventive measure. Unfortunately, commercial mouth masks work for a maximum of only 4 hours, and their pores are too large to filter out microorganisms and viruses carried by infectious droplet nuclei of 1-5 μm. On the other hand, Indonesia is rich in titanium dioxide (TiO2) and in the anthocyanin pigment of purple sweet potato. Therefore, we propose the Bio-nano mask, an antimicrobial and antiviral mouth mask with a Nano-TiO2 coating and purple sweet potato anthocyanins, as an effective solution to the high number of ARI patients in Riau. The mask has the advantages that infectious droplets cannot attach to its surface, that it is self-cleaning, and that it carries anthocyanin biosensors giving a visual response easily understood by the general public: the mask changes color from blue/purple to pink when acid levels increase. The acid level is an indicator of the accumulation of microorganisms in the mouth and surrounding areas. The Bio-nano mask making process begins with preparation (design, Nano-TiO2 liquid preparation, and anthocyanin biosensor manufacture), followed by spraying the Nano-TiO2 onto the outer surface of the colored spunbond layer, laminating the anthocyanin biosensor film onto the meltblown surface, assembling the Bio-nano mask, and packaging it. The Bio-nano mask effectively blocks pathogenic microorganisms and infectious droplets and has a built-in indicator of accumulated microorganisms whose color change is easily observed even by ordinary people.

Keywords: anthocyanins, ARI, nano-TiO2 liquid, self cleaning

Procedia PDF Downloads 554
25187 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze exponentially growing big data with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop technology. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of actual data. Experimental results showed that our RHadoop system becomes much faster as the number of data nodes increases. We also compared the performance of our RHadoop implementation with the lm function and with the biglm package running on bigmemory. The results showed that our RHadoop implementation was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the size of the data grows.
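RHadoop itself runs R map-reduce jobs on a Hadoop cluster; as a language-agnostic illustration of why multiple regression parallelizes well, the hedged sketch below (in Python, on simulated data) computes the per-partition sufficient statistics X'X and X'y in a map step and solves the normal equations after a reduce step. The chunk sizes and coefficients are made up for the example.

```python
import numpy as np

def partial_stats(chunk):
    """Map step: per-partition sufficient statistics for linear regression."""
    X, y = chunk
    X1 = np.column_stack([np.ones(len(X)), X])   # add intercept column
    return X1.T @ X1, X1.T @ y

def combine(stats):
    """Reduce step: sum the partial statistics and solve the normal equations."""
    stats = list(stats)
    XtX = sum(s[0] for s in stats)
    Xty = sum(s[1] for s in stats)
    return np.linalg.solve(XtX, Xty)

# Simulate a large dataset split into chunks, as HDFS blocks would be.
rng = np.random.default_rng(1)
true_beta = np.array([2.0, -1.0, 0.5, 3.0])
chunks = []
for _ in range(8):                                # 8 "data nodes"
    X = rng.normal(size=(10_000, 3))
    y = true_beta[0] + X @ true_beta[1:] + rng.normal(scale=0.1, size=10_000)
    chunks.append((X, y))

beta_hat = combine(map(partial_stats, chunks))    # map, then reduce
print("estimated coefficients:", np.round(beta_hat, 3))
```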

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 424
25186 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
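The abstract does not spell out how a feature is mapped to multiple labels; under the assumption that this means reusing the same inputs with conflicting, permuted label assignments across tasks, a minimal hedged sketch is:

```python
import numpy as np

def make_mutex_tasks(X, y, n_tasks, n_classes, rng):
    """Generate tasks whose label assignments contradict the original mapping.

    Each task reuses the same inputs but remaps every original class to a
    different, randomly permuted label, so one feature vector corresponds to
    multiple labels across tasks and memorizing a single mapping cannot help.
    """
    tasks = []
    for _ in range(n_tasks):
        perm = rng.permutation(n_classes)
        while np.any(perm == np.arange(n_classes)):   # forbid fixed points
            perm = rng.permutation(n_classes)
        tasks.append((X, perm[y]))
    return tasks

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))                         # toy feature vectors
y = rng.integers(0, 4, size=100)                       # 4 original classes
mutex_tasks = make_mutex_tasks(X, y, n_tasks=5, n_classes=4, rng=rng)
for i, (_, y_task) in enumerate(mutex_tasks):
    print(f"task {i}: label of sample 0 = {y_task[0]} (original {y[0]})")
```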

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 82
25185 Reliability Enhancement by Parameter Design in Ferrite Magnet Process

Authors: Won Jung, Wan Emri

Abstract:

Ferrite magnets are widely used in many automotive components, such as motors and alternators. Magnets used inside these components must be of good quality to ensure a high level of performance. The purpose of this study is to design the input parameters that optimize the ferrite magnet production process so as to ensure the quality and reliability of the manufactured products. Design of Experiments (DOE) and Statistical Process Control (SPC) are used as mutual supplements to optimize the process. DOE and SPC are quality tools used in industry to monitor and improve the condition of the manufacturing process; they are practically used to keep the process on target and within the limits of natural variation. A mixed Taguchi method is utilized for optimization as part of the DOE analysis, and SPC with proportion data is applied to assess the output parameters and determine the optimal operating conditions. An example case involving the monitoring and optimization of the ferrite magnet process is presented to demonstrate the effectiveness of this approach. Through the use of these tools, reliable magnets can be produced by following the step-by-step procedures of the proposed framework. One of the main contributions of this study was the production of crack-free magnets by applying the proposed parameter design.
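The exact charts used are not given in the abstract; as one hedged illustration of "SPC with proportion data", the sketch below computes the centre line and 3-sigma limits of a p-chart over hypothetical daily counts of cracked magnets per inspected lot.

```python
import numpy as np

def p_chart_limits(defectives, sample_sizes):
    """Compute centre line and 3-sigma control limits for a p (proportion) chart."""
    defectives = np.asarray(defectives, dtype=float)
    sample_sizes = np.asarray(sample_sizes, dtype=float)
    p_bar = defectives.sum() / sample_sizes.sum()          # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / sample_sizes)    # per-sample sigma
    ucl = np.minimum(p_bar + 3 * sigma, 1.0)
    lcl = np.maximum(p_bar - 3 * sigma, 0.0)
    return p_bar, lcl, ucl

# Hypothetical daily inspection of sintered magnets: cracked units per lot.
lot_sizes = [200, 180, 220, 200, 210, 190, 205]
cracked   = [  6,   4,   9,   5,   7,   3,  12]

p_bar, lcl, ucl = p_chart_limits(cracked, lot_sizes)
p = np.array(cracked) / np.array(lot_sizes)
for day, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "OUT OF CONTROL" if (pi < lo or pi > hi) else "in control"
    print(f"day {day}: p = {pi:.3f}, limits = [{lo:.3f}, {hi:.3f}] -> {flag}")
print(f"centre line p-bar = {p_bar:.3f}")
```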

Keywords: ferrite magnet, crack, reliability, process optimization, Taguchi method

Procedia PDF Downloads 508
25184 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in a wireless sensor network. One of the important tasks in data aggregation is the positioning of the aggregator points. A great deal of work has been done on data aggregation, but the efficient positioning of the aggregator points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network and propose an algorithm to select the aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.
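The authors' selection algorithm is not detailed in the abstract; a common baseline for this placement problem, shown as a hedged sketch below, is to cluster the sensor coordinates and treat the cluster centroids as candidate aggregator positions. The field size, node count, and number of aggregators are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Hypothetical sensor field: 300 nodes scattered over a 100 m x 100 m area.
sensors = rng.uniform(0, 100, size=(300, 2))

# Place k aggregation points at the centroids of sensor clusters, so each
# sensor forwards data to its nearest aggregator over a short hop distance.
k = 6
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(sensors)
aggregators = kmeans.cluster_centers_

# Average sensor-to-aggregator distance as a simple cost proxy.
assignments = kmeans.labels_
dists = np.linalg.norm(sensors - aggregators[assignments], axis=1)
print("aggregator positions:\n", np.round(aggregators, 1))
print(f"mean hop distance to the assigned aggregator: {dists.mean():.1f} m")
```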

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 146
25183 Designing a Thermal Management System for Lithium Ion Battery Packs in Electric Vehicles

Authors: Ekin Esen, Mohammad Alipour, Riza Kizilel

Abstract:

Rechargeable lithium-ion batteries have been replacing lead-acid batteries for the last decade due to their outstanding properties such as high energy density, long shelf life, and almost no memory effect. Besides these, being very light compared to lead acid batteries has gained them their dominant place in the portable electronics market, and they are now the leading candidate for electric vehicles (EVs) and hybrid electric vehicles (HEVs). However, their performance strongly depends on temperature, and this causes some inconveniences for their utilization in extreme temperatures. Since weather conditions vary across the globe, this situation limits their utilization for EVs and HEVs and makes a thermal management system obligatory for the battery units. The objective of this study is to understand thermal characteristics of Li-ion battery modules for various operation conditions and design a thermal management system to enhance battery performance in EVs and HEVs. In the first part of our study, we investigated thermal behavior of commercially available pouch type 20Ah LiFePO₄ (LFP) cells under various conditions. Main parameters were chosen as ambient temperature and discharge current rate. Each cell was charged and discharged at temperatures of 0°C, 10°C, 20°C, 30°C, 40°C, and 50°C. The current rate of charging process was 1C while it was 1C, 2C, 3C, 4C, and 5C for discharge process. Temperatures of 7 different points on the cells were measured throughout charging and discharging with N-type thermocouples, and a detailed temperature profile was obtained. In the second part of our study, we connected 4 cells in series by clinching and prepared 4S1P battery modules similar to ones in EVs and HEVs. Three reference points were determined according to the findings of the first part of the study, and a thermocouple is placed on each reference point on the cells composing the 4S1P battery modules. In the end, temperatures of 6 points in the module and 3 points on the top surface were measured and changes in the surface temperatures were recorded for different discharge rates (0.2C, 0.5C, 0.7C, and 1C) at various ambient temperatures (0°C – 50°C). Afterwards, aluminum plates with channels were placed between the cells in the 4S1P battery modules, and temperatures were controlled with airflow. Airflow was provided with a regular compressor, and the effect of flow rate on cell temperature was analyzed. Diameters of the channels were in mm range, and shapes of the channels were determined in order to make the cell temperatures uniform. Results showed that the designed thermal management system could help keeping the cell temperatures in the modules uniform throughout charge and discharge processes. Other than temperature uniformity, the system was also beneficial to keep cell temperature close to the optimum working temperature of Li-ion batteries. It is known that keeping the temperature at an optimum degree and maintaining uniform temperature throughout utilization can help obtaining maximum power from the cells in battery modules for a longer time. Furthermore, it will increase safety by decreasing the risk of thermal runaways. Therefore, the current study is believed to be beneficial for wider use of Li batteries for battery modules of EVs and HEVs globally.

Keywords: lithium ion batteries, thermal management system, electric vehicles, hybrid electric vehicles

Procedia PDF Downloads 151
25182 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from the classical spatial econometric approaches to the recent developments in spatial econometrics for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to look for new possible directions in the processing of count data in a spatial hierarchical Bayesian econometric context.

Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data

Procedia PDF Downloads 580
25181 Enhancing Students' Utilization of Written Corrective Feedback through Teacher-Student Writing Conferences: A Case Study in English Writing Instruction

Authors: Tsao Jui-Jung

Abstract:

Previous research findings have shown that most students do not fully utilize the written corrective feedback provided by teachers (Stone, 2014). This common phenomenon results in the ineffective utilization of teachers' written corrective feedback. As Ellis (2010) points out, the effectiveness of written corrective feedback depends on the level of student engagement with it. Therefore, it is crucial to understand how students utilize the written corrective feedback from their teachers. Previous studies have confirmed the positive impact of teacher-student writing conferences on students' engagement in the writing process and their writing abilities (Hum, 2021; Nosratinia & Nikpanjeh, 2019; Wong, 1996; Yeh, 2016, 2019). However, due to practical constraints such as time limitations, this instructional activity is not fully utilized in writing classrooms (Alfalagg, 2020). Therefore, to address this research gap, the purpose of this study was to explore several aspects of teacher-student writing conferences, including the frequency of meaning negotiation (i.e., comprehension checks, confirmation checks, and clarification checks) and teacher scaffolding techniques (i.e., feedback, prompts, guidance, explanations, and demonstrations) in teacher-student writing conferences, examining students’ self-assessment of their writing strengths and weaknesses in post-conference journals and their experiences with teacher-student writing conferences (i.e., interaction styles, communication levels, how teachers addressed errors, and overall perspectives on the conferences), and gathering insights from their responses to open-ended questions in the final stage of the study (i.e., their preferences and reasons for different written corrective feedback techniques used by teachers and their perspectives and suggestions on teacher-student writing conferences). Data collection methods included transcripts of audio recordings of teacher-student writing conferences, students’ post-conference journals, and open-ended questionnaires. The participants of this study were sophomore students enrolled in an English writing course for a duration of one school year. Key research findings are as follows: Firstly, in terms of meaning negotiation, students attempted to clearly understand the corrective feedback provided by the teacher-researcher twice as often as the teacher-researcher attempted to clearly understand the students' writing content. Secondly, the most commonly used scaffolding technique in the conferences was prompting (indirect feedback). Thirdly, the majority of participants believed that teacher-student writing conferences had a positive impact on their writing abilities. Fourthly, most students preferred direct feedback from the teacher-researcher, as it directly pointed out their errors and saved them time in revision. However, some students still preferred indirect feedback, as they believed it encouraged them to think and self-correct. Based on the research findings, this study proposes effective teaching recommendations for English writing instruction aimed at optimizing teaching strategies and enhancing students' writing abilities.

Keywords: written corrective feedback, student engagement, teacher-student writing conferences, action research

Procedia PDF Downloads 61
25180 A NoSQL Based Approach for Real-Time Managing of Robotics's Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, from which new data management solutions have emerged: NoSQL databases. They span several areas, such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. Nowadays, these database management systems are increasingly adopted. These systems store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into a usable form, because with ordinary types of robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated by experimental studies and a running example used as a use case.
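As a hedged illustration of how a document store can hold heterogeneous robot telemetry and return the latest reading quickly, the sketch below uses MongoDB through pymongo; the database name, collection, field names, and connection URI are assumptions for the example, not the system described in the paper.

```python
from datetime import datetime, timezone
from pymongo import MongoClient, ASCENDING

# Connect to a local MongoDB instance (adjust the URI for your deployment).
client = MongoClient("mongodb://localhost:27017")
telemetry = client["robotics"]["telemetry"]

# Schema-less documents: each robot can report whatever sensors it carries.
telemetry.insert_one({
    "robot_id": "arm-01",
    "ts": datetime.now(timezone.utc),
    "joint_angles": [0.12, 1.57, -0.33],
    "gripper": {"closed": True, "force_n": 4.2},
})
telemetry.insert_one({
    "robot_id": "agv-07",
    "ts": datetime.now(timezone.utc),
    "pose": {"x": 3.4, "y": 1.1, "theta": 0.78},
    "battery_pct": 86,
})

# Index on (robot_id, ts) so the latest reading per robot is found quickly.
telemetry.create_index([("robot_id", ASCENDING), ("ts", ASCENDING)])

latest = telemetry.find_one({"robot_id": "agv-07"}, sort=[("ts", -1)])
print("latest AGV reading:", latest)
```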

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 337
25179 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide information about different aspects of the data, so linking multi-source data is essential to improve clustering performance. However, in practice, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data. Ensemble learning is a versatile machine learning model in which learning techniques can work in parallel on big data. Clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble method, called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, the clustering algorithms, and the base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
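FOMOCE's fuzzy, multi-objective machinery and its dark-knowledge extraction rules are not specified in the abstract; as a hedged sketch of the generic clustering-ensemble idea it builds on, the example below runs several k-means base clusterings, accumulates a co-association matrix, and derives a consensus partition by average-linkage clustering of the co-association distances.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Toy data standing in for one (already aligned) multi-source feature matrix.
X, _ = make_blobs(n_samples=200, centers=4, cluster_std=1.2, random_state=0)
n = len(X)

# Base clusterings: k-means with different k and different random seeds.
co_assoc = np.zeros((n, n))
runs = [(k, seed) for k in (3, 4, 5) for seed in range(5)]
for k, seed in runs:
    labels = KMeans(n_clusters=k, n_init=5, random_state=seed).fit_predict(X)
    co_assoc += (labels[:, None] == labels[None, :]).astype(float)
co_assoc /= len(runs)                # fraction of runs that co-cluster points i and j

# Consensus clustering: average-linkage on the co-association distance.
dist = 1.0 - co_assoc
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus = fcluster(Z, t=4, criterion="maxclust")
print("consensus cluster sizes:", np.bincount(consensus)[1:])
```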

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 173
25178 Implementation of Enterprise Asset Management (E-AM) System at Oman Electricity Transmission Company

Authors: Omran Al Balushi, Haitham Al Rawahi

Abstract:

Enterprise Asset Management (eAM) has been implemented across the Generation, Transmission, and Distribution subsidiaries of the Nama Group. As part of the Nama Group, Oman Electricity Transmission Company (OETC) was the first company to implement this system. It was very important for OETC to implement and maintain such a system to achieve its business objectives and to support effective operations, as well as the delivery of its asset management strategy. eAM addresses the comprehensive asset maintenance requirements of OETC. OETC needs to optimize capacity and increase utilization while lowering unit production costs, and eAM enables OETC to adopt this strategy. Implementation of eAM has improved operational performance through preventive and scheduled maintenance, and it has also increased safety. Full implementation of eAM will further enable OETC to create an optimal asset management strategy that increases revenue and decreases cost by effectively monitoring operational data such as maintenance history and operating conditions. The CMMS (Computerised Maintenance Management System) is the main software and the backbone of the eAM system. It is used to establish proper information and data flows related to maintenance activities and thereby improve working practices. Implementation of the eAM system was one of the factors that helped OETC achieve ISO 55001 certification in the fourth quarter of 2016. Full implementation of the eAM system will also result in strong integration between the CMMS and the Geographical Information System (GIS) application and will help OETC build a reliable maintenance strategy for all asset classes in its transmission network. In this paper, we share our experience and knowledge of implementing such a system and describe how it supported OETC's management in making decisions. We also highlight the challenges and difficulties encountered during the implementation of eAM, and we list some features and advantages of eAM in asset management, preventive maintenance, and maintenance cost management.

Keywords: CMMS, maintenance management, asset management, maintenance strategy

Procedia PDF Downloads 127
25177 Modeling Activity Pattern Using XGBoost for Mining Smart Card Data

Authors: Eui-Jin Kim, Hasik Lee, Su-Jin Park, Dong-Kyu Kim

Abstract:

Smart-card data are expected to provide information on activity patterns as an alternative to conventional person trip surveys. The focus of this study is to propose a method for training on person trip surveys to supplement smart-card data, which do not contain the purpose of each trip. To train on the survey data, we selected only features that are also available from smart-card data, such as spatiotemporal information on the trip and geographic information system (GIS) data near the stations. XGBoost, a state-of-the-art tree-based ensemble classifier, was used to train on data from multiple sources. This classifier uses a more regularized model formalization to control over-fitting and shows very fast execution with good performance. The validation results showed that the proposed method efficiently estimated the trip purpose. The GIS data of the station and the duration of stay at the destination were significant features in modeling trip purpose.
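The actual feature set and survey labels are not published in the abstract; the hedged sketch below shows the shape of such a model on a hypothetical merged table, where spatiotemporal smart-card features and GIS counts near the destination station are used to predict a surveyed trip purpose with XGBoost. All column names and values are made up for illustration.

```python
import numpy as np
import pandas as pd
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical merged table: smart-card features plus the surveyed trip purpose.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "boarding_hour":     rng.integers(5, 24, n),          # spatiotemporal features
    "trip_duration_min": rng.integers(5, 90, n),
    "stay_duration_min": rng.integers(10, 600, n),         # time spent at destination
    "dest_offices_500m": rng.integers(0, 50, n),           # GIS features near the station
    "dest_shops_500m":   rng.integers(0, 80, n),
    "dest_schools_500m": rng.integers(0, 10, n),
    "purpose":           rng.integers(0, 4, n),            # 0=home 1=work 2=school 3=leisure
})

X = df.drop(columns="purpose")
y = df["purpose"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

model = XGBClassifier(
    n_estimators=300, max_depth=6, learning_rate=0.1,
    subsample=0.8, colsample_bytree=0.8,
)
model.fit(X_tr, y_tr)
print("validation accuracy:", accuracy_score(y_te, model.predict(X_te)))
print("feature importances:", dict(zip(X.columns, model.feature_importances_.round(3))))
```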

Keywords: activity pattern, data fusion, smart-card, XGBoost

Procedia PDF Downloads 231
25176 The Role of Artificial Intelligence Algorithms in Psychiatry: Advancing Diagnosis and Treatment

Authors: Netanel Stern

Abstract:

Artificial intelligence (AI) algorithms have emerged as powerful tools in the field of psychiatry, offering new possibilities for enhancing diagnosis and treatment outcomes. This article explores the utilization of AI algorithms in psychiatry, highlighting their potential to revolutionize patient care. Various AI algorithms, including machine learning, natural language processing (NLP), reinforcement learning, clustering, and Bayesian networks, are discussed in detail. Moreover, ethical considerations and future directions for research and implementation are addressed.

Keywords: AI, software engineering, psychiatry, neuroimaging

Procedia PDF Downloads 92
25175 Efficiency Enhancement in Solar Panel

Authors: R. S. Arun Raj

Abstract:

In today's climate of growing energy needs and increasing environmental concerns, alternatives to non-renewable and polluting fossil fuels have to be investigated. One such alternative is solar energy: the Sun provides every hour as much energy as mankind consumes in one year. This paper explains solar panel design and the new models and methodologies that can be implemented for better utilization of solar energy. The innovative idea revolves around minimizing the losses in the solar panel that occur as heat. Payback calculations for the implementation of solar panels are also presented.

Keywords: on-grid and off-grid systems, pyro-electric effect, pay-back calculations, solar panel

Procedia PDF Downloads 580
25174 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to solve memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to an exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.

Keywords: mutex task generation, data augmentation, meta-learning, text classification

Procedia PDF Downloads 124
25173 "Gurza Incinerator": Biomass Incinerator Powered by Empty Bunch of Palm Oil Fruits as Electrical Biomass Base Development

Authors: Andi Ismanto

Abstract:

Indonesia is the largest palm oil producer in the world, and palm oil cultivation expanded rapidly between 2000 and 2011. Based on preliminary figures from the Directorate General of Plantation, the palm oil area in Indonesia by 2011 was about 8.91 million hectares, and in 2011 the production of crude palm oil (CPO) reached 22.51 million tons. On the other hand, increasing palm oil production has an environmental impact: Empty Bunch of Palm Oil (EBPO) waste rose to 20 million tons in 2009. EBPO waste is currently used only as an organic fertilizer for plants, but this is not a good solution, because EBPO used as organic compost has a high content of carbon and hydrogen compounds. EBPO waste has potential as a fuel for gasification because it decomposes in a short time, making the process more time-efficient. Urban waste has already been utilized in incinerators as a source of electrical energy for households; however, the waste burning process in an incinerator usually relies on diesel fuel and kerosene, which is less effective and not environmentally friendly, considering that the incineration process runs continuously. Considering that biomass is a renewable source of energy and that the world's energy system must switch from energy based on fossil resources to energy based on renewable resources, we propose the "Gurza Incinerator", a powerful biomass incinerator fueled by Empty Bunch of Palm Oil (EBPO) waste, as a basis for developing biomass-based electricity and a renewable technology for the future. The device uses EBPO waste as the combustion source to burn garbage inside the incinerator hopper. The EBPO waste is processed by means of gasification, a process that produces gases which can be used as fuel for electrical power. Hopefully, this technology can become a renewable energy source of the future and also a starting point for the development of biomass-based electricity.

Keywords: incinerator, biomass, empty bunch palm oil, electrical energy

Procedia PDF Downloads 463
25172 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming

Authors: Milind Chaudhari, Suhail Balasinor

Abstract:

Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate. As a result, there is a real chance that world food production will drop by 40% in the next two decades. Vertical farming can help sustain food production by leveraging big data and cloud computing to ensure plants are grown naturally, providing the optimum nutrients and sunlight based on the analysis of millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations are controlling the indoor environment by leveraging big data to enhance food quantity and quality.

Keywords: big data, IoT, vertical farming, indoor farming

Procedia PDF Downloads 163
25171 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation

Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan

Abstract:

Nowadays, the demand for devices capable of real-time video transmission is ever-increasing, and high-resolution videos have made efficient video compression techniques an essential component for capturing and transmitting video data. Motion estimation has a critical role in encoding raw video; hence, various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representations facilitate the computation of matching criteria and thus provide a small hardware footprint. In this paper, a hardware implementation of a two-bit-transformation-based low-complexity motion estimation method using a local binary pattern approach is proposed. Image frames are represented in two-bit depth instead of full depth by making use of the local binary pattern as a binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the difference between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
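As a hedged, software-level sketch of the general idea (not the proposed hardware architecture), the example below binarizes each block into two bit-planes with a simplified rule, uses the number of non-matching bits (XOR plus popcount) as the matching criterion, and runs a full search over a small window; the binarization rule, block size, and search range are assumptions for illustration.

```python
import numpy as np

def two_bit_transform(block):
    """Represent a block with two bit-planes (simplified, hedged variant:
    plane 1 compares to the block mean, plane 2 flags large deviations)."""
    mu, sigma = block.mean(), block.std()
    return block > mu, np.abs(block - mu) > sigma

def matching_cost(ref_planes, cand_planes):
    """Number of non-matching bits across both planes (XOR + popcount)."""
    return int(np.count_nonzero(ref_planes[0] ^ cand_planes[0]) +
               np.count_nonzero(ref_planes[1] ^ cand_planes[1]))

def full_search(cur_block, ref_frame, top, left, search=7):
    """Exhaustive search of a +/- search window using the binary matching cost."""
    n = cur_block.shape[0]
    cur_planes = two_bit_transform(cur_block)
    best = (None, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + n > ref_frame.shape[0] or x + n > ref_frame.shape[1]:
                continue
            cost = matching_cost(cur_planes, two_bit_transform(ref_frame[y:y+n, x:x+n]))
            if cost < best[1]:
                best = ((dy, dx), cost)
    return best

# Toy frames: the current frame is the reference frame circularly shifted by (2, -3).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
cur = np.roll(ref, shift=(2, -3), axis=(0, 1))
mv, cost = full_search(cur[16:32, 16:32], ref, top=16, left=16)
print("estimated motion vector (dy, dx):", mv, "matching cost:", cost)
```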

Keywords: binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform

Procedia PDF Downloads 294
25170 Data Challenges Facing Implementation of Road Safety Management Systems in Egypt

Authors: A. Anis, W. Bekheet, A. El Hakim

Abstract:

Implementing a Road Safety Management System (SMS) in a crowded developing country such as Egypt is a necessity. Establishing a sustainable SMS requires a comprehensive, reliable data system covering all information pertinent to road crashes. In this paper, the data available in Egypt are surveyed and validated for use in an SMS, some missing data are provided, and the data that remain unavailable in Egypt are identified, looking forward to the contribution of the scientific community, the authorities, and the public in solving the problem of missing or unreliable crash data. The data required for implementing an SMS in Egypt are divided into three categories. The first is data that are available, such as fatality and injury rates, which this research shows may be inconsistent and unreliable. The second is data that are not available but may be estimated; an example of estimating vehicle cost is given in this research. The third is data that are not available and must be measured case by case, such as the functional and geometric properties of a facility. Some inquiries are posed to the scientific community, such as how to improve the links among road safety stakeholders in order to obtain a consistent, unbiased, and reliable data system.

Keywords: road safety management system, road crash, road fatality, road injury

Procedia PDF Downloads 116