Search results for: consumer data right
22620 Adult Language Learning in the Institute of Technology Sector in the Republic of Ireland
Authors: Una Carthy
Abstract:
A recent study of third-level institutions in Ireland reveals that both age and aptitude barriers can be overcome by teaching methodologies that motivate second language learners. This PhD investigation gathered quantitative and qualitative data from 14 Institutes of Technology over a three-year period, from 2011 to 2014. The fundamental research question was to establish the impact of institutional language policy on attitudes towards language learning. However, other related issues around second language acquisition arose in the course of the investigation. Data were collected from both lecturers and students, allowing interesting points of comparison to emerge from both datasets. Negative perceptions among lecturers regarding language provision were often associated with the view that language learning belongs to primary and secondary level and has no place in third-level education. This perception was offset by substantial data showing positive attitudes towards adult language learning. Lenneberg’s Critical Age Theory postulated that the optimum age for learning a second language is before puberty. More recently, scholars have challenged this theory, revealing in their studies that mature learners can and do succeed at learning languages. With regard to aptitude, a preoccupation among lecturers with poor literacy skills among students emerged and was often associated with resistance to second language acquisition. This was offset by a preponderance of qualitative data from students highlighting the crucial role that teaching approaches play in the learning process. Interestingly, the data collected regarding learning disabilities reveal that, given appropriate learning environments, individuals can be motivated to acquire second languages, and indeed succeed at learning them. These findings are in keeping with other recent studies of attitudes towards second language learning among students with learning disabilities.
Both sets of findings reinforce the case for language policies in the Institutes of Technology (IoTs). Supportive and positive learning environments can be created in third-level institutions to motivate adult learners, thereby overcoming perceived obstacles relating to age and aptitude.
Keywords: age, aptitude, second language acquisition, teaching methodologies
Procedia PDF Downloads 123
22619 Cloud Monitoring and Performance Optimization Ensuring High Availability
Authors: Inayat Ur Rehman, Georgia Sakellari
Abstract:
Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.
Keywords: cloud computing, cloud monitoring, performance optimization, high availability, scalability, resource allocation, load balancing, auto-scaling, data security, data privacy
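The auto-scaling strategy the abstract mentions can be reduced to a simple threshold rule over a monitored metric. The sketch below is a minimal illustrative policy; the thresholds and replica limits are assumptions for this example, not values from the paper:

```python
def autoscale(replicas, cpu_avg, scale_out_at=0.75, scale_in_at=0.30,
              min_replicas=1, max_replicas=10):
    """Return the new replica count for one monitoring interval.

    Scales out when average CPU utilization is high, scales in when it
    is low, and clamps the result to [min_replicas, max_replicas].
    """
    if cpu_avg > scale_out_at:
        replicas += 1
    elif cpu_avg < scale_in_at:
        replicas -= 1
    return max(min_replicas, min(max_replicas, replicas))
```

In practice such a rule would be fed by the monitoring pipeline's averaged readings and damped with a cooldown period to avoid oscillating between scale-out and scale-in.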
Procedia PDF Downloads 60
22618 The Use of Artificial Intelligence to Curb Corruption in Brazil
Authors: Camila Penido Gomes
Abstract:
Over the past decade, an emerging body of research has been pointing to artificial intelligence's great potential to improve the use of open data, increase transparency, and curb corruption in the public sector. Nonetheless, studies on this subject are scant and usually lack evidence to validate the effectiveness of AI-based technologies in addressing corruption, especially in developing countries. Aiming to fill this void in the literature, this paper sets out to examine how AI has been deployed by civil society to improve the use of open data and prevent congresspeople from misusing public resources in Brazil. Building on current debates and carrying out a systematic literature review and extensive document analyses, this research reveals that AI should not be deployed as a silver bullet to fight corruption. Instead, this technology is more powerful when adopted by a multidisciplinary team as a civic tool in conjunction with other strategies. This study makes considerable contributions, bringing to the forefront of the discussion a more accurate understanding of the factors that play a decisive role in the successful implementation of AI-based technologies in anti-corruption efforts.
Keywords: artificial intelligence, civil society organization, corruption, open data, transparency
Procedia PDF Downloads 205
22617 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes every sample important for learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves classification performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
Procedia PDF Downloads 161
22616 Relational Attention Shift on Images Using Bu-Td Architecture and Sequential Structure Revealing
Authors: Alona Faktor
Abstract:
In this work, we present an NN-based computational model that can perform attention shifts according to high-level instructions. The instruction specifies the type of attentional shift using an explicit geometrical relation. The instruction can also be of a cognitive nature, specifying more complex human-human, human-object, or object-object interactions. Applying this approach sequentially allows obtaining a structural description of an image. A novel dataset of interacting humans and objects is constructed using a computer graphics engine. Using this data, we perform systematic research on relational segmentation shifts.
Keywords: cognitive science, attention, deep learning, generalization
Procedia PDF Downloads 198
22615 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture
Authors: Sajjad Akbar, Rabia Bashir
Abstract:
With the growth in the number of users, Internet usage has evolved. Due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, trust, etc. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture follows a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient, as it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach and introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, there is much criticism of it, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help with appropriate data management in ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas the agent-based approach from Web Content Mining is selected to find the most appropriate data.
Keywords: agent based web content mining, content centric networking, information centric networking
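The name-based, receiver-oriented retrieval that CCN introduces can be illustrated with a toy in-network cache: a consumer expresses interest in a content name, and any node holding that named object can satisfy the request, regardless of host location. The `ContentStore` class and its `interest` method below are hypothetical names used only for this sketch:

```python
class ContentStore:
    """Toy CCN content store: data is addressed by name, not by host."""

    def __init__(self, origin_fetch):
        self.cache = {}                   # content name -> content object
        self.origin_fetch = origin_fetch  # fallback toward the producer (assumed)

    def interest(self, name):
        """Serve an Interest for a named content object."""
        if name in self.cache:            # cache hit: this node satisfies it
            return self.cache[name]
        data = self.origin_fetch(name)    # miss: forward toward the producer
        self.cache[name] = data           # cache on the return path
        return data
```

Repeated requests for the same name never travel to the producer again, which is the in-network caching benefit that makes relevance-aware cache management (the role WCM plays here) important.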
Procedia PDF Downloads 475
22614 Control Mechanisms for Sprayer Used in Turkey
Authors: Huseyin Duran, Yesim Benal Oztekin, Kazim Kubilay Vursavus, Ilker Huseyin Celen
Abstract:
There are two main approaches to the manufacture, marketing and usage of plant protection machinery in Turkey. The first approach, called the ‘Product Safety Approach’, can be summarized as the minimum health and safety requirements that plant protection equipment and machinery products must meet for consumers. The second approach comprises the practices related to the Plant Protection Equipment and Machinery Directive. The product safety approach covers the plant protection machinery product groups within the framework of a new approach directive, the Machinery Safety Directive (2006/42/EC). The new directive has been in practice in Turkey since 03.03.2009, parallel to the revision of the corresponding EU Directive (published in the Official Gazette dated 03.03.2009 and numbered 27158). A ‘Pesticide Application Machines’ paragraph has been added to the 2006/42/EC Machinery Safety Directive, which in particular reveals the importance of primary health care and product safety, explaining the safety requirements for machines used in the application of plant protection products. The Ministry of Science, Industry and Technology is the organization authorized in Turkey for the publication and implementation of this regulation. There is also a special regulation on the manufacture and sale of plant protection machinery, carried out by the Ministry of Food, Agriculture and Livestock, General Directorate of Food and Control. This regulation, prepared on the basis of Law No. 5996 on Veterinary Services, Plant Health, Food and Feed, is the ‘Regulation on Plant Protection Equipment and Machinery’ (published on 02.04.2011 with number 27893 in the Official Gazette). The purposes of this regulation are to ensure healthy and reliable crop production and to support the preparation, implementation and dissemination of integrated pest management programs and projects for the development of pest control methods that are friendly to human health and the environment.
This second regulation covers the approval, manufacturing and licensing of plant protection equipment and machinery; the duties and responsibilities of dealers; and the principles and procedures related to market supply and control. There are currently no inspection procedures for plant protection machinery in use in Turkey. In this study, the content and application principles of all regulatory approaches currently used in Turkey are summarized.
Keywords: plant protection equipment and machinery, product safety, market surveillance, inspection procedures
Procedia PDF Downloads 259
22613 One-Class Classification Approach Using Fukunaga-Koontz Transform and Selective Multiple Kernel Learning
Authors: Abdullah Bal
Abstract:
This paper presents a one-class classification (OCC) technique based on the Fukunaga-Koontz Transform (FKT) for binary classification problems. The FKT is originally a powerful tool for feature selection and ordering in two-class problems. To utilize the standard FKT for the data domain description problem (i.e., one-class classification), in this paper, a set of non-class samples lying outside the boundary of the positive (target) class, which is formed with limited training data, has been constructed synthetically. A tunnel-like decision boundary around the upper and lower borders of the target class samples has been designed using statistical properties of the feature vectors belonging to the training data. To capture higher-order statistics of the data and increase discrimination ability, the proposed method, termed one-class FKT (OC-FKT), has been extended to its nonlinear version via kernel machines, referred to as OC-KFKT for short. Multiple kernel learning (MKL) is a favorable family of machine learning methods that tries to find an optimal combination of a set of sub-kernels to achieve a better result. However, the discriminative ability of some of the base kernels may be low, and an OC-KFKT designed with this type of kernel leads to unsatisfactory classification performance. To address this problem, the quality of the sub-kernels should be evaluated, and the weak kernels must be discarded before the final decision-making process. MKL/OC-FKT and selective MKL/OC-FKT frameworks have been designed, stimulated by ensemble learning (EL), to weight and then select the sub-classifiers using the discriminability and diversities measured by eigenvalue ratios. The eigenvalue ratios have been assessed based on their regions on the FKT subspaces.
The comparative experiments, performed on various low- and high-dimensional datasets against state-of-the-art algorithms, confirm the effectiveness of our techniques, especially in the case of small sample size (SSS) conditions.
Keywords: ensemble methods, fukunaga-koontz transform, kernel-based methods, multiple kernel learning, one-class classification
Procedia PDF Downloads 21
22612 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera
Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis
Abstract:
We present a simple algorithm for capturing a 3D scene (focused on the use of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data is derived from point clouds captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data is stored in a linear voxel octree using a hashtable. We commence by briefly discussing the logic behind floating origin approaches and the usage of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
Keywords: voxel, octree, computer vision, XR, floating origin
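A minimal sketch of the linear (flat) voxel storage described above, assuming, as the paper does, a fixed scene size and a fixed voxel resolution; the scene bounds and voxel size below are illustrative placeholders, and the class and function names are hypothetical:

```python
def voxel_key(x, y, z, scene_min=-8.0, voxel_size=0.25):
    """Map a world-space point (in a fixed-origin world) to integer voxel coords."""
    return (int((x - scene_min) // voxel_size),
            int((y - scene_min) // voxel_size),
            int((z - scene_min) // voxel_size))

class LinearVoxelOctree:
    """Flat voxel storage in a hashtable keyed by integer voxel coordinates."""

    def __init__(self):
        self.voxels = {}  # (ix, iy, iz) -> number of feature points seen

    def insert_point(self, x, y, z):
        """Translate one captured feature point into voxel data and store it."""
        k = voxel_key(x, y, z)
        self.voxels[k] = self.voxels.get(k, 0) + 1

    def occupied(self, x, y, z):
        """Query whether the voxel containing a point holds any data."""
        return voxel_key(x, y, z) in self.voxels
```

Because storage is a hashtable keyed by voxel coordinates, insertion and occupancy queries are O(1) on average, which is what makes the structure attractive for real-time capture on mobile hardware.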
Procedia PDF Downloads 133
22611 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution
Authors: Masomeh Jamshid Nejad
Abstract:
Nowadays, statistical literacy is no longer merely a useful skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in the fields of economics and business management to visualize and work with data following a normal distribution. Since technology is now interconnected with education, it is important to teach statistics topics to undergraduate students in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. To this end, two groups of undergraduate students (from the Business Management program) were compared in this research study. One group underwent Excel-based instruction, while the other group relied only on traditional teaching methods. We analyzed experimental data and BBA participants’ responses to statistics-related questions focusing on the normal distribution, including its key attributes such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively than the group taught with the traditional method.
In addition, students receiving Excel-based instruction showed a greater ability to visualize and interpret data following a normal distribution.
Keywords: statistics, excel-based instruction, data visualization, pedagogy
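The normal-distribution quantities the instruction centres on (mean, standard deviation, and the bell-shaped curve) can also be reproduced outside Excel. A short Python sketch of the equivalents of Excel's NORM.DIST function in its density and cumulative forms:

```python
import math

def norm_pdf(x, mean=0.0, sd=1.0):
    """Density of the bell-shaped curve: Excel's NORM.DIST(x, mean, sd, FALSE)."""
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def norm_cdf(x, mean=0.0, sd=1.0):
    """Cumulative probability: Excel's NORM.DIST(x, mean, sd, TRUE)."""
    return 0.5 * (1 + math.erf((x - mean) / (sd * math.sqrt(2))))
```

Comparing such a script against the spreadsheet is one way students can check that the two tools agree on, for example, the ~97.5% cumulative probability at 1.96 standard deviations above the mean.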
Procedia PDF Downloads 53
22610 Novel Recommender Systems Using Hybrid CF and Social Network Information
Authors: Kyoung-Jae Kim
Abstract:
Collaborative Filtering (CF) is a popular technique for personalization in the e-commerce domain that reduces information overload. In general, CF recommends a list of items based on the preferences of similar users in the user-item matrix and uses these preferences to predict the focal user’s preference for particular items. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and the high dimensionality of the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems using social network information and singular value decomposition (SVD) to alleviate some of these limitations. The purpose of this study is to reduce the dimensionality of the dataset using SVD and to improve the performance of CF by using emotional information from the focal user’s social network data. We test the usability of the hybrid CF, SVD and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models.
Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition
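The hybrid idea, blending rating-based similarity with social-network tie strength before making a CF prediction, can be sketched as follows. The SVD dimensionality-reduction step is omitted here, and the blending weight `alpha` is an illustrative assumption rather than a value from the study:

```python
def cosine(u, v):
    """Cosine similarity between two users' rating dicts over co-rated items."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    du = sum(x * x for x in u.values()) ** 0.5
    dv = sum(x * x for x in v.values()) ** 0.5
    return num / (du * dv)

def predict(target, others, social, item, alpha=0.5):
    """Predict target's rating of `item` with a hybrid similarity:
    alpha * rating similarity + (1 - alpha) * social tie strength."""
    num = den = 0.0
    for name, ratings in others.items():
        if item not in ratings:
            continue
        sim = alpha * cosine(target, ratings) + (1 - alpha) * social.get(name, 0.0)
        num += sim * ratings[item]
        den += sim
    return num / den if den else None
```

A user who is both similar in taste and socially close to the focal user thus contributes more to the prediction than a stranger with the same ratings, which is the intuition behind adding emotional/social information to plain CF.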
Procedia PDF Downloads 289
22609 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach
Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik
Abstract:
Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with retroreflective marker data signal processing, analysis, and interpretation. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: The participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons over a 20 s interval. The vibration stimuli caused the human postural system to settle into a new pseudo-steady state. Vibration frequencies were 20, 60 and 80 Hz. The participants' body segments, namely the head, shoulders, hips, knees, ankles and little fingers, were marked with 12 retroreflective markers. Marker positions were scanned by the six-camera system BTS SMART DX. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The Method of Developed Statokinesigram Trajectory was used to process the measured data. Regression analysis of the developed statokinesigram trajectory (DST) data and the retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for marker trajectories were identified for all body segments. Head marker trajectories reached the maximal value, and ankle marker trajectories the minimal value, of the scaling coefficient. The hip, knee and ankle markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in the head and shoulder marker trajectories, which were not symmetrical. A model of postural system behavior was identified by MDST.
Conclusion: The value of the scaling coefficient identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall response of the human postural system to vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data
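The scaling coefficient λ between a developed marker trajectory and the developed statokinesigram trajectory comes from linear regression. A minimal least-squares sketch, assuming a regression through the origin, which is one plausible reading of the scaling model (the function name is ours):

```python
def scaling_coefficient(dst, dmt):
    """Least-squares slope lambda minimizing sum((dmt - lambda * dst)^2),
    i.e. a through-the-origin regression of the marker developed
    trajectory (DMT) on the developed statokinesigram trajectory (DST)."""
    sxx = sum(x * x for x in dst)
    sxy = sum(x * y for x, y in zip(dst, dmt))
    return sxy / sxx
```

Under this reading, λ > 1 would mean the segment's developed trajectory grows faster than the statokinesigram's, as reported for the head markers, while λ < 1 corresponds to segments such as the ankles.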
Procedia PDF Downloads 350
22608 Text Emotion Recognition by Multi-Head Attention based Bidirectional LSTM Utilizing Multi-Level Classification
Authors: Vishwanath Pethri Kamath, Jayantha Gowda Sarapanahalli, Vishal Mishra, Siddhesh Balwant Bandgar
Abstract:
Recognition of emotional information is essential in any form of communication. The growth of HCI (Human-Computer Interaction) in recent times underlines the importance of understanding the emotions expressed, which becomes crucial for improving the system or the interaction itself. In this research work, textual data is used for emotion recognition. Text, being the least expressive among multimodal resources, poses various challenges, such as limited contextual information and the sequential nature of language construction. This research work proposes a neural architecture to recognize no fewer than eight emotions from textual data sources derived from multiple datasets, using Google's pre-trained word2vec word embeddings and a multi-head attention-based bidirectional LSTM model with one-vs-all multi-level classification. The emotions targeted in this research are Anger, Disgust, Fear, Guilt, Joy, Sadness, Shame, and Surprise. Textual data from multiple datasets, such as the ISEAR, GoEmotions, and Affect datasets, were used to create the emotion dataset. Overlaps and conflicts between data samples were handled with careful preprocessing. Our results show a significant improvement with the proposed architecture, including as much as a 10-point improvement in recognizing some emotions.
Keywords: text emotion recognition, bidirectional LSTM, multi-head attention, multi-level classification, google word2vec word embeddings
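While the attention-based BiLSTM itself requires a deep learning stack, the one-vs-all decision layer on top of it can be sketched on its own: each of the eight emotions gets a binary classifier score, and the final label is the highest-scoring emotion above a firing threshold. The threshold value here is an assumption, not one taken from the paper:

```python
EMOTIONS = ["anger", "disgust", "fear", "guilt",
            "joy", "sadness", "shame", "surprise"]

def one_vs_all_decision(scores, threshold=0.5):
    """Combine eight one-vs-all probabilities into a single label.

    `scores` maps emotion -> sigmoid output of that emotion's binary
    classifier. Returns the top-scoring emotion, or None when no
    classifier fires above the threshold.
    """
    best = max(EMOTIONS, key=lambda e: scores.get(e, 0.0))
    return best if scores.get(best, 0.0) >= threshold else None
```

The "multi-level" aspect of the paper's scheme would add further stages (e.g. coarse-to-fine grouping of emotions) on top of this basic one-vs-all vote.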
Procedia PDF Downloads 174
22607 The Mass Attenuation Coefficients, Effective Atomic Cross Sections, Effective Atomic Numbers and Electron Densities of Some Halides
Authors: Shivalinge Gowda
Abstract:
The total mass attenuation coefficients μ/ρ of some halides, namely NaCl, KCl, CuCl, NaBr, KBr, RbCl, AgCl, NaI, KI, AgBr, CsI, HgCl2, CdI2 and HgI2, were determined at photon energies of 279.2, 320.07, 514.0, 661.6, 1115.5, 1173.2 and 1332.5 keV in a well-collimated, narrow-beam, good-geometry set-up using a high-resolution hyper-pure germanium detector. The mass attenuation coefficients and the effective atomic cross sections are found to be in good agreement with the XCOM values. From these mass attenuation coefficients, the effective atomic cross sections σa of the compounds were determined. The σa data so obtained were then used to compute the effective atomic numbers Zeff. For this, the interpolation of total attenuation cross sections of photons of energy E in elements of atomic number Z was performed using logarithmic regression analysis of the data measured by the authors and reported earlier for the above energies, along with XCOM data for standard energies. The best-fit coefficients in the photon energy ranges of 250 to 350 keV, 350 to 500 keV, 500 to 700 keV, 700 to 1000 keV and 1000 to 1500 keV, obtained by a piecewise interpolation method, were then used to find the Zeff of the compounds with respect to the effective atomic cross section σa. Using these Zeff values, the electron densities Nel of the halides were also determined. The present Zeff and Nel values of the halides are found to be in good agreement with values calculated from XCOM data and other available published values.
Keywords: mass attenuation coefficient, atomic cross-section, effective atomic number, electron density
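The core step, interpolating Zeff from an effective atomic cross section using logarithmic (log-log) regression between tabulated elements, can be sketched as below. The (Z, σ) pairs used in the test are synthetic placeholders, not measured or XCOM data:

```python
import math

def z_eff(sigma_a, table):
    """Effective atomic number from an atomic cross section sigma_a by
    piecewise log-log linear interpolation between tabulated elements.

    `table` is a list of (Z, sigma) pairs sorted by increasing sigma.
    """
    for (z1, s1), (z2, s2) in zip(table, table[1:]):
        if s1 <= sigma_a <= s2:
            t = (math.log(sigma_a) - math.log(s1)) / (math.log(s2) - math.log(s1))
            return math.exp(math.log(z1) + t * (math.log(z2) - math.log(z1)))
    raise ValueError("cross section outside tabulated range")
```

Working in log-log space reflects the approximately power-law dependence of photon attenuation cross sections on atomic number at these energies.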
Procedia PDF Downloads 377
22606 Proposing Smart Clothing for Addressing Criminal Acts Against Women in South Africa
Authors: Anne Mastamet-Mason
Abstract:
Crime against women is a global concern, and South Africa in particular faces a dilemma in dealing with the constant criminal acts confronting the country. Debates on violence against women in South Africa can no longer be overemphasised, as crimes continue to rise year by year. The recent death of a university student at the University of Cape Town, along with many other cases, strengthens the need to find solutions from all spheres of South African society. The advanced textiles market contains a large number and variety of technologies, many of which have protected status and constitute a relatively small portion of the textiles used for the consumer market. Examples of advanced textiles include nanomaterials, such as silver, titanium dioxide and zinc oxide, designed to create an anti-microbial and self-cleaning layer on top of the fibres, thereby reducing body odour and soiling. Smart textiles offer materials and fabrics that are versatile and adaptive to different situations and functions. Integrating textiles with computing technologies offers an opportunity to develop differentiated characteristics and functionality. This paper presents a proposal to design a smart camisole/yoga sports bra and smart yoga sports pants to be worn by women while alone and while in purported danger zones. The smart garments are to be worn under normal clothing and cannot be detected, seen, or suspected by perpetrators. The garments are fitted with devices that sense any physical aggression and any abnormal or accelerated heartbeat exhibited by the victim of violence. The signals created during an attack are transmitted to the police and to family members who own a mobile application that accepts the emitted signals. The signals direct the receiver to the exact location of the offence, so the victim can be rescued before major violations are committed.
The yoga sports garments will be designed by Professor Mason, a fashion designer by profession, while the mobile phone application will be developed by Mr. Amos Yegon, an independent software developer.
Keywords: smart clothing, wearable technology, South Africa, 4th industrial revolution
Procedia PDF Downloads 207
22605 Developing a Culturally Acceptable End of Life Survey (the VOICES-ESRD/Thai Questionnaire) for Evaluation Health Services Provision of Older Persons with End-Stage Renal Disease (ESRD) in Thailand
Authors: W. Pungchompoo, A. Richardson, L. Brindle
Abstract:
Background: Developing a culturally acceptable end-of-life survey (the VOICES-ESRD/Thai questionnaire) is essential for evaluating health service provision for older persons with ESRD in Thailand. The focus of the questionnaire was on symptoms, symptom control, and the health care needs of older people with ESRD who are managed without dialysis. Objective: The objective of this study was to develop and adapt VOICES to make it suitable for use in a population survey in Thailand. Methods: A mixed-methods exploratory sequential design was focused on modifying the instrument. Data collection: A cognitive interviewing technique was implemented, using two cycles of data collection with a sample of 10 bereaved carers and a prototype of the Thai VOICES questionnaire. A qualitative study was used to modify the questionnaire. Data analysis: The data were analysed using content analysis. Results: Revisions to the prototype questionnaire were made. The results were used to adapt the VOICES questionnaire for use in a population-based survey of older ESRD patients in Thailand. Conclusions: A culturally specific questionnaire was generated during this second phase, and issues with questionnaire design were rectified.
Keywords: VOICES-ESRD/Thai questionnaire, cognitive interviewing, end of life survey, health services provision, older persons with ESRD
Procedia PDF Downloads 286
22604 Combined Power Supply at Well Drilling in Extreme Climate Conditions
Authors: V. Morenov, E. Leusheva
Abstract:
Power supply for well drilling in oil and gas fields at low ambient air temperatures is characterized by increased requirements for electric and heat energy. The power costs of heating production facilities and technological and living quarters may exceed the electric power consumption of the drilling equipment several times over. Power for prospecting and exploration drilling sites is usually supplied by local electric power structures based on diesel power stations. Meanwhile, the exploitation of oil fields is accompanied by vast quantities of extracted associated petroleum gas, and developing gas fields yields considerable amounts of natural gas and gas condensate. In this regard, the implementation of gas-powered, self-sufficient power units running on the produced crude products is seen as the most promising option for power supply. For these purposes, gas turbines (GT) or gas reciprocating engines (GRE) may be used. In addition, gas-powered units are used most efficiently in cogeneration mode, i.e., combined heat and power production. The research conducted revealed that GTs generate more heat than GREs while producing electricity. One of the latest GT designs is the microturbine (MT), a device that may be efficiently exploited in combined heat and power mode. In conditions of low ambient air temperatures and high wind velocity, a sufficient heat supply is required both for the technological process, specifically for drilling mud heating, and for maintaining comfortable working conditions at the rig. One of the main parameters of the heat regime is the heat losses. Due to the structural peculiarities of the rig, most of the heat losses occur through cold air infiltration via technological apertures and hatchways and through heat transmission across the insulation. A significant amount of heat is also required to sustain the working temperature of the drilling mud. Violation of the circulation thermal regime may lead to ice build-up on well surfaces and ice blockages in armature elements.
That is why it is important to ensure heating of the drilling mud chamber according to the ambient air temperature; the heat power needed is defined by the heat losses of the chamber. Taking into account the heat power required for the drilling structure to function, it is possible to create a combined heat and power complex based on MTs that satisfies consumer power needs while lowering power generation costs. As a result, a combined power supply scheme for multiple-well drilling utilizing the heat of MT flue gases was developed.
Keywords: combined heat, combined power, drilling, electric supply, gas-powered units, heat supply
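The envelope heat losses that size such a cogeneration unit can be estimated with the standard steady-state transmission formula Q = U·A·ΔT summed over the rig envelope; the U-values, areas and temperatures below are illustrative assumptions, not field data (infiltration losses through apertures would be added separately):

```python
def transmission_loss_kw(u_value, area_m2, t_inside, t_outside):
    """Steady-state heat loss through one envelope element, Q = U * A * dT, in kW."""
    return u_value * area_m2 * (t_inside - t_outside) / 1000.0

def rig_heat_demand_kw(elements, t_inside=20.0, t_outside=-40.0):
    """Sum transmission losses over the rig envelope.

    `elements` is a list of (U in W/m2K, area in m2) tuples.
    """
    return sum(transmission_loss_kw(u, a, t_inside, t_outside)
               for u, a in elements)
```

Comparing this heat demand against the recoverable heat in the MT flue gases indicates whether the cogeneration unit can cover the rig's heating load on its own.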
Procedia PDF Downloads 577
22603 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers
Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice
Abstract:
In recent years, the telecommunications sector has been undergoing continuous change and development in the global market. In this sector, churn analysis techniques are commonly used to analyse why some customers terminate their service subscriptions prematurely. Customer churn is highly significant in this sector because it causes substantial business losses, and many companies conduct research to prevent such losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed to obtain the feature reducts for predicting customer churn. The framework is an optional cost-based pre-processing stage that removes redundant features for churn management. This cost-based feature selection algorithm is applied to a telecommunication company in Turkey, and the results obtained with the algorithm are reported.
Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection
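The cost-based pre-processing idea can be illustrated with a deliberately simplified greedy sketch; the decision-theoretic rough set machinery of the paper is not reproduced here, and the feature names, relevance gains, costs, and budget below are all hypothetical.

```python
def cost_sensitive_select(features, budget):
    """Greedy cost-aware feature selection sketch.

    features: dict mapping feature name -> (relevance_gain, acquisition_cost)
    budget:   total feature-acquisition cost allowed
    Repeatedly considers the feature with the best gain-per-cost ratio and
    keeps it if it still fits within the budget.
    """
    chosen, spent = [], 0.0
    remaining = dict(features)
    while remaining:
        # Best gain-per-cost candidate among the features not yet considered
        name = max(remaining, key=lambda f: remaining[f][0] / remaining[f][1])
        gain, cost = remaining.pop(name)
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical churn features: (relevance gain, cost of collecting the attribute)
features = {
    "calls_per_day": (0.9, 3.0),
    "tenure": (0.6, 1.0),
    "region": (0.2, 2.0),
    "billing_history": (0.8, 4.0),
}
print(cost_sensitive_select(features, budget=5.0))  # → ['tenure', 'calls_per_day']
```

The ratio-based greedy rule is only one of many ways to trade prediction value against acquisition cost; the paper's rough-set reducts do this within a decision-theoretic framework instead.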
Procedia PDF Downloads 446
22602 Applying Pre-Accident Observational Methods for Accident Assessment and Prediction at Intersections in Norrkoping City in Sweden
Authors: Ghazwan Al-Haji, Adeyemi Adedokun
Abstract:
Accidents at intersections are highly represented in traffic safety statistics, and they occur randomly in time and space. It is necessary to judge whether an intersection is dangerous on the basis of short-term observations, rather than waiting many years to assess historical accident data. There are active and pro-active road infrastructure safety methods for assessing safety at intersections. This study aims to investigate the use of quantitative and qualitative pre-accident observational methods as best practice for accident prediction, future black spot identification, and treatment. Historical accident data from STRADA (the Swedish Traffic Accident Data Acquisition) were used for Norrkoping city in Sweden. The ADT (Average Daily Traffic), capacity, and speed were used to predict accident rates. Locations with the highest accident records and predicted accident counts were identified and then audited qualitatively using a street audit. The results from these quantitative and qualitative methods were analyzed, validated, and compared. The paper provides recommendations on the methods used as well as on how to reduce accident occurrence at the chosen intersections.
Keywords: intersections, traffic conflict, traffic safety, street audit, accident prediction
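The prediction step described above can be sketched with a generic safety performance function that maps traffic volume to an expected accident count; the functional form and coefficients below are illustrative assumptions, not the calibrated model of this study.

```python
def predicted_accidents_per_year(adt, a=5e-4, b=0.65):
    """Illustrative safety performance function (SPF): N = a * ADT^b.

    adt  : average daily traffic entering the intersection
    a, b : calibration coefficients (hypothetical values, not from this study)
    """
    return a * adt ** b

# Rank hypothetical intersections by predicted accident count to flag black spots
adt_by_site = {"A": 12000, "B": 4500, "C": 20000}
ranked = sorted(adt_by_site,
                key=lambda s: predicted_accidents_per_year(adt_by_site[s]),
                reverse=True)
print(ranked)  # highest-risk site first
```

In practice, the sites at the top of such a ranking would then receive the qualitative street audit described in the abstract.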
Procedia PDF Downloads 233
22601 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery, and exploiting the inner thermal core structure of tropical cyclones still poses challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse- and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. First, the thermal core information in the pressure direction is comprehensively expressed through the maximum intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone; in the first stage, these images yield a coarse-grained wind speed estimate. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and are fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tail distribution of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and an ordinal regression loss replaces traditional single-value regression in the second stage. The selected tropical cyclones span 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021.
Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major classes: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes with this data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
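The focal loss used in the first, coarse-grained stage down-weights easy examples so that the long tail of intense cyclones contributes more to training. A minimal sketch, assuming softmax class probabilities are already available (all numbers below are hypothetical):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Multiclass focal loss: mean of -alpha * (1 - p_t)^gamma * log(p_t).

    probs  : (N, C) array of predicted class probabilities
    labels : (N,) array of integer class ids
    gamma  : focusing parameter; gamma=0 recovers weighted cross-entropy
    """
    p_t = probs[np.arange(len(labels)), labels]  # probability of the true class
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t)))

probs = np.array([[0.9, 0.05, 0.05],   # easy, well-classified example
                  [0.4, 0.3, 0.3]])    # hard example near the decision boundary
labels = np.array([0, 0])
# The (1 - p_t)^gamma factor makes the easy example contribute almost nothing
print(focal_loss(probs, labels))
```

The second-stage ordinal regression loss, not shown here, additionally exploits the fact that intensity classes are ordered rather than independent categories.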
Procedia PDF Downloads 63
22600 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant
Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan
Abstract:
The most important process in a water treatment plant is coagulation using alum and poly-aluminum chloride (PACL), whose usage per day is worth a hundred thousand baht. Determining the dosages of alum and PACL is therefore the most important factor for economical and valuable water production. This research applies an artificial neural network (ANN) trained with the Levenberg-Marquardt algorithm to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, namely alum and PACL, with input data consisting of the turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen water treatment plant (BKWTP) of the Metropolitan Waterworks Authority. The data were collected from 1 January 2019 to 31 December 2019, covering the changing seasons of Thailand. The input data of the ANN are divided into three groups: a training set, a test set, and a validation set. The best model achieves a coefficient of determination and mean absolute error of 0.73 and 3.18 for alum, and 0.59 and 3.21 for PACL, respectively.
Keywords: soft jar test, jar test, water treatment plant process, artificial neural network
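The dose-prediction setup can be sketched as a small regression MLP. This is a library-free illustration on synthetic data: the features mirror the study's inputs, but the target function, network size, and the plain gradient-descent trainer are assumptions (the study itself uses the Levenberg-Marquardt algorithm, which is not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw-water features: turbidity, pH, alkalinity, conductivity, OC
X = rng.uniform(0.0, 1.0, size=(200, 5))
# Synthetic "true" coagulant dose as a smooth function of the features (illustrative only)
y = 10 * X[:, 0] + 2 * np.sin(3 * X[:, 1]) + X[:, 2] * X[:, 3]
y = (y - y.mean()) / y.std()  # standardize the target for stable training

# One-hidden-layer MLP, trained here with plain full-batch gradient descent
W1 = rng.normal(0.0, 0.5, (5, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)              # hidden layer activations
    pred = (h @ W2 + b2).ravel()          # linear output unit for regression
    err = pred - y
    g_pred = (2.0 / len(y)) * err[:, None]    # d(MSE)/d(pred)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((pred - y) ** 2))
print(mse)  # should fall well below the variance of the standardized target (1.0)
```

In the study, the same architecture idea is fitted with Levenberg-Marquardt, which typically converges in far fewer iterations than this first-order sketch.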
Procedia PDF Downloads 166
22599 The Effect of Green Power Trading Mechanism on Interregional Power Generation and Transmission in China
Authors: Yan-Shen Yang, Bai-Chen Xie
Abstract:
Background and significance of the study: Both green power trading schemes and interregional power transmission are effective ways to increase green power absorption and achieve renewable power development goals. China is accelerating the construction of interregional transmission lines and the green power market. A critical issue arises from the close interaction between these two approaches, which can heavily affect green power quota allocation and renewable power development. Existing studies have not discussed this issue adequately, so it is urgent to figure out the relationship between the two in order to achieve a suitable power market design and more reasonable power grid construction.
Basic methodologies: We develop an equilibrium model of the power market in China to analyze the coupling effect of these two approaches as well as their influence on power generation and interregional transmission. Our model considers both the tradable green certificate (TGC) market and the green power market, and it comprises producers, consumers, and an independent system operator (ISO) minimizing total system cost. The equilibrium model includes the decision optimization process of each participant. To reformulate the models as a single-level problem, we replace the producer, consumer, ISO, and market equilibrium problems with their Karush-Kuhn-Tucker (KKT) conditions; the result is further reformulated as a mixed-integer linear program (MILP) and solved with the Gurobi solver.
Major findings: The results show that: (1) The green power market can significantly promote renewable power absorption, while the TGC market provides a more flexible way to trade green power. (2) Inefficient occupation of some transmission lines and unavailability of others occur simultaneously: the existing interregional transmission lines cannot fully meet the demand for wind and solar PV power trading in some areas, while the reverse holds in others. (3) Synchronous implementation of the green power and TGC trading mechanisms can benefit the development of green power as well as interregional power transmission. (4) Green power transactions exacerbate the unfair distribution of carbon emissions: the carbon Gini coefficient reaches 0.323 under the green power market, indicating high carbon inequality. The eastern coastal region benefits the most due to its huge demand for external power.
Keywords: green power market, tradable green certificate, interregional power transmission, power market equilibrium model
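The KKT-to-MILP reformulation of the paper is not reproduced here, but the market-clearing logic that such equilibrium models encode can be illustrated library-free with a simple merit-order dispatch, where the clearing price is set by the marginal (last dispatched) generator. All offers and the demand figure below are hypothetical.

```python
def clear_market(offers, demand):
    """Merit-order market clearing: dispatch the cheapest offers first.

    offers : list of (marginal_cost, capacity_mw) tuples
    demand : total demand in MW
    Returns (dispatch dict keyed by offer index, clearing price).
    """
    dispatch, price, remaining = {}, 0.0, demand
    for i, (cost, cap) in sorted(enumerate(offers), key=lambda t: t[1][0]):
        if remaining <= 0:
            break
        q = min(cap, remaining)
        dispatch[i] = q
        remaining -= q
        price = cost  # price set by the marginal (last dispatched) unit
    return dispatch, price

# Hypothetical offers: (marginal cost in $/MWh, capacity in MW)
offers = [(40, 100), (20, 80), (60, 150)]
dispatch, price = clear_market(offers, demand=150)
print(dispatch, price)  # → {1: 80, 0: 70} 40
```

An equilibrium model like the paper's generalizes this by adding transmission limits, TGC obligations, and each participant's optimality (KKT) conditions, which is what necessitates the MILP reformulation.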
Procedia PDF Downloads 147
22598 Drought Detection and Water Stress Impact on Vegetation Cover Sustainability Using Radar Data
Authors: E. Farg, M. M. El-Sharkawy, M. S. Mostafa, S. M. Arafat
Abstract:
Mapping water stress provides important baseline data for sustainable agriculture. Recent developments in Sentinel-1 allow the acquisition of high-resolution images with varied polarization capabilities. This study was conducted to detect and quantify vegetation water content from canopy backscatter, extracting spatial information to encourage drought mapping activities throughout newly reclaimed sandy soils in the western Nile delta, Egypt. The performance of radar imagery in agriculture strongly depends on the sensor's polarization capability. The dual-polarization mode of Sentinel-1 improves the ability to detect water stress, and the backscatter from structural components improves the identification and separation of vegetation types with various canopy structures from other features. The fieldwork data allowed identification of water stress zones based on land cover structure; those classes were used to produce a harmonious water stress map. The analysis techniques used and the results show the high capability of active sensor data in water stress mapping and monitoring, especially when integrated with multi-spectral medium resolution images. Areas cropped under subsoil drip irrigation also show lower drought and water stress than those under center-pivot sprinkler irrigation, which reflects the high level of evaporation from the soil surface in the initial growth stages. The results show a strong relationship between vegetation indices such as the Normalized Difference Vegetation Index (NDVI) and the observed radar backscatter. In addition, observational evidence showed that radar backscatter is highly sensitive to vegetation water stress and thus has substantial potential for monitoring and detecting vegetative cover drought.
Keywords: canopy backscatter, drought, polarization, NDVI
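The NDVI used for comparison with the radar backscatter is a standard band ratio; a minimal sketch, with the reflectance values below chosen purely for illustration:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon guards against division by zero

# Hypothetical reflectances: a healthy pixel and a water-stressed pixel
print(ndvi([0.45, 0.30], [0.05, 0.20]))  # → roughly [0.8, 0.2]
```

In a drought-mapping workflow, per-pixel NDVI maps like this would be correlated against the co-registered Sentinel-1 backscatter to calibrate the radar-based stress indicator.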
Procedia PDF Downloads 145
22597 New Technique of Estimation of Charge Carrier Density of Nanomaterials from Thermionic Emission Data
Authors: Dilip K. De, Olukunle C. Olawole, Emmanuel S. Joel, Moses Emetere
Abstract:
A good number of electronic properties, such as electrical and thermal conductivities, depend on the charge carrier densities of nanomaterials. By controlling the charge carrier densities during the fabrication (or growth) processes, the physical properties can be tuned. In this paper, we discuss a new technique for estimating the charge carrier densities of nanomaterials from thermionic emission data using the newly modified Richardson-Dushman equation. We find that the technique yields excellent results for graphene and carbon nanotubes.
Keywords: charge carrier density, nanomaterials, new technique, thermionic emission
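The authors' modified Richardson-Dushman equation is not given in the abstract; for orientation, the classical form it modifies is J = A T² exp(-W / kT). A minimal sketch with standard constants, where the work function and temperature below are hypothetical example values:

```python
import math

K_B = 8.617333262e-5   # Boltzmann constant, eV/K
A_RD = 1.20173e6       # Richardson constant, A m^-2 K^-2 (free-electron value)

def thermionic_current_density(T, work_function_ev):
    """Classical Richardson-Dushman emission: J = A * T^2 * exp(-W / (k_B T)).

    T                : emitter temperature in K
    work_function_ev : work function W in eV (hypothetical value in the example)
    """
    return A_RD * T ** 2 * math.exp(-work_function_ev / (K_B * T))

# Hypothetical emitter: W = 4.5 eV at 1800 K
print(thermionic_current_density(1800, 4.5))
```

Fitting measured J(T) data to such an expression is what allows material parameters, and in the modified form the carrier density, to be extracted.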
Procedia PDF Downloads 320
22596 Web-Based Tools to Increase Public Understanding of Nuclear Technology and Food Irradiation
Authors: Denise Levy, Anna Lucia C. H. Villavicencio
Abstract:
Food irradiation is a processing and preservation technique used to eliminate insects and parasites and reduce disease-causing microorganisms. Moreover, the process helps inhibit sprouting and delay ripening, extending the shelf-life of fresh fruits and vegetables. Nevertheless, most Brazilian consumers seem to misunderstand the difference between irradiated food and radioactive food, and the general public has major concerns about negative health effects and environmental contamination. Society's judgment and decision making are directly linked to perceived benefits and risks. The web-based project entitled 'Scientific information about food irradiation: Internet as a tool to approach science and society' was created by the Nuclear and Energetic Research Institute (IPEN) to offer an interdisciplinary approach to science education, integrating economic, ethical, social, and political aspects of food irradiation. This project takes into account that misinformation and unfounded preconceived ideas weigh heavily on the acceptance of irradiated food and the purchase intentions of Brazilian consumers. Taking advantage of the potential value of the Internet to enhance communication and education among the general public, a research study was carried out on the possibilities and trends of information and communication technologies among the Brazilian population. The content includes concepts, definitions, and frequently asked questions (FAQ) about the processes, safety, advantages, limitations, and possibilities of food irradiation, including health issues as well as its impacts on the environment. The project comprises eight self-instructional interactive web courses that situate scientific content in relevant social contexts in order to encourage self-learning and further reflection. Communication is a must to improve public understanding of science.
The use of information technology for quality scientific dissemination will contribute greatly to providing information throughout the country, spreading it to as many people as possible, minimizing geographic distances, and stimulating communication and development.
Keywords: food irradiation, multimedia learning tools, nuclear science, society and education
Procedia PDF Downloads 248
22595 Field Environment Sensing and Modeling for Pears towards Precision Agriculture
Authors: Tatsuya Yamazaki, Kazuya Miyakawa, Tomohiko Sugiyama, Toshitaka Iwatani
Abstract:
The introduction of sensor technologies into agriculture is a necessary step toward realizing precision agriculture. Although sensing methodologies themselves have become prevalent owing to miniaturization and falling sensor costs, there are difficulties in analyzing and understanding the sensing data. Targeting the pear cultivar 'Le Lectier', which is particular to Niigata in Japan, cultivation environment data have been collected at pear fields by eight sorts of sensors: field temperature, field humidity, rain gauge, soil water potential, soil temperature, soil moisture, inner-bag temperature, and inner-bag humidity sensors. The inner-bag temperature and humidity sensors measure the environment inside the fruit bags used for the pre-harvest bagging of pears; in this experiment, three kinds of fruit bags were used. After more than 100 days of continuous measurement, large volumes of sensing data were collected. Firstly, correlation analysis among the data measured by the respective sensors reveals that one sensor can replace another, so that more efficient and cost-saving sensing systems can be proposed to pear farmers. Secondly, differences in the characteristics and performance of the three kinds of fruit bags are clarified by the inner-bag environmental measurements; statistical analysis shows that they differ significantly from each other. Lastly, a relational model between the sensing data and pear outlook quality is established using a Structural Equation Model (SEM). Here, pear outlook quality concerns the existence of stains, blobs, scratches, and so on caused by physiological damage or disease. Conceptually, SEM is a combination of exploratory factor analysis and multiple regression; it is used to construct a model connecting independent and dependent variables.
The proposed SEM model relates the measured sensing data to the pear outlook quality determined on the basis of farmer judgement. In particular, it is found that the inner-bag humidity variable has a comparatively strong effect on pear outlook quality; inner-bag humidity sensing might therefore help farmers control it. These results are supported by a large quantity of inner-bag humidity data measured over the years 2014, 2015, and 2016. The experimental and analytical results of this research contribute to spreading precision agriculture technologies among farmers growing 'Le Lectier'.
Keywords: precision agriculture, pre-harvest bagging, sensor fusion, structural equation model
Procedia PDF Downloads 314
22594 Reviewing Image Recognition and Anomaly Detection Methods Utilizing GANs
Authors: Agastya Pratap Singh
Abstract:
This review paper examines the emerging applications of generative adversarial networks (GANs) in the fields of image recognition and anomaly detection. With the rapid growth of digital image data, the need for efficient and accurate methodologies to identify and classify images has become increasingly critical. GANs, known for their ability to generate realistic data, have gained significant attention for their potential to enhance traditional image recognition systems and improve anomaly detection performance. The paper systematically analyzes various GAN architectures and their modifications tailored for image recognition tasks, highlighting their strengths and limitations. Additionally, it delves into the effectiveness of GANs in detecting anomalies in diverse datasets, including medical imaging, industrial inspection, and surveillance. The review also discusses the challenges faced in training GANs, such as mode collapse and stability issues, and presents recent advancements aimed at overcoming these obstacles.
Keywords: generative adversarial networks, image recognition, anomaly detection, synthetic data generation, deep learning, computer vision, unsupervised learning, pattern recognition, model evaluation, machine learning applications
Procedia PDF Downloads 25
22593 Use of Artificial Neural Networks to Estimate Evapotranspiration for Efficient Irrigation Management
Authors: Adriana Postal, Silvio C. Sampaio, Marcio A. Villas Boas, Josué P. Castro
Abstract:
This study deals with the estimation of reference evapotranspiration (ET₀) in an agricultural context, focusing on efficient irrigation management in response to the growing interest in the sustainable management of water resources. Given the importance of water in agriculture and its scarcity in many regions, efficient use of this resource is essential to ensure food security and environmental sustainability. The methodology involved the application of artificial intelligence techniques, specifically Multilayer Perceptron (MLP) Artificial Neural Networks (ANNs), to predict ET₀ in the state of Paraná, Brazil. The models were trained and validated with meteorological data from the Brazilian National Institute of Meteorology (INMET), together with data from a producer's weather station in the western region of Paraná. Two optimizers (SGD and Adam) and different meteorological variables, such as temperature, humidity, solar radiation, and wind speed, were explored as inputs to the models. Nineteen configurations with different input variables were tested; among them, configuration 9, with 8 input variables, was identified as the most efficient overall, while configuration 10, with 4 input variables, was considered the most effective given its small number of variables. The main conclusions of this study show that MLP ANNs are capable of accurately estimating ET₀, providing a valuable tool for irrigation management in agriculture. Both configurations (9 and 10) showed promising performance in predicting ET₀. Validation of the models with cultivator data underlined the practical relevance of these tools and confirmed their ability to generalize to different field conditions.
The results of the statistical metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and the Coefficient of Determination (R²), showed excellent agreement between the model predictions and the observed data, with MAE as low as 0.01 mm/day and 0.03 mm/day for the two configurations, respectively. In addition, the models achieved an R² between 0.99 and 1, indicating a satisfactory fit to the real data. This agreement was also confirmed by the Kolmogorov-Smirnov test, which evaluates the agreement of the predictions with the statistical behavior of the real data and yields values between 0.02 and 0.04 for the producer data. The results further suggest that the developed technique can be applied to other locations by using site-specific data to improve ET₀ predictions and thus contribute to sustainable irrigation management in different agricultural regions. The study has some limitations, such as the use of a single ANN architecture and only two optimizers, validation with data from only one producer, and a possible underestimation of the influence of seasonality and local climate variability. An irrigation management application using the most efficient models from this study is already under development. Future research can explore different ANN architectures and optimization techniques, validate the models with data from multiple producers and regions, and investigate the models' response to different seasonal and climatic conditions.
Keywords: agricultural technology, neural networks in agriculture, water efficiency, water use optimization
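The evaluation metrics named above (MAE, MSE, RMSE, R²) have standard definitions; a minimal sketch, where the daily ET₀ values below are hypothetical and not from the study:

```python
import math

def regression_metrics(y_true, y_pred):
    """MAE, MSE, RMSE, and R^2 between observed and predicted values."""
    n = len(y_true)
    errors = [p - t for p, t in zip(y_pred, y_true)]
    mae = sum(abs(e) for e in errors) / n
    mse = sum(e * e for e in errors) / n
    rmse = math.sqrt(mse)
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - (mse * n) / ss_tot  # 1 - SS_res / SS_tot
    return mae, mse, rmse, r2

# Hypothetical observed vs. predicted daily ET0 values (mm/day)
y_true = [3.1, 4.0, 5.2, 2.8]
y_pred = [3.0, 4.1, 5.0, 2.9]
print(regression_metrics(y_true, y_pred))
```

The Kolmogorov-Smirnov test mentioned in the abstract is a complementary check: it compares the distributions of predictions and observations rather than their pointwise errors.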
Procedia PDF Downloads 48
22592 Need of Trained Clinical Research Professionals Globally to Conduct Clinical Trials
Authors: Tambe Daniel Atem
Abstract:
Background: Clinical research is organized research on human beings intended to provide adequate information on a drug's use as a therapeutic agent, including its safety and efficacy. The significance of this study lies in educating health and life science graduates worldwide in clinical research in depth, so that they perform better in work that involves testing drugs on human beings. Objectives: to provide an overall understanding of the scientific approach to the evaluation of new and existing medical interventions, and to apply ethical and regulatory principles appropriate to individual research. Methodology: the study is based on primary and secondary data analysis. Primary data analysis consisted of a survey with a questionnaire to interview clinical research professionals on the training needed to perform clinical trials globally; the questionnaire captured details of the professionals' expertise and the areas of clinical research that require intensive training before entering the hardcore clinical research domain. Secondary data analysis involved collecting data from journals, the Internet, and other online sources. Results: The clinical trials market worldwide is worth over USD 26 billion, and the industry employs an estimated 210,000 people in the US and over 70,000 in the UK, who form one-third of the total research and development staff. There are more than 250,000 vacant positions globally, with regional salary variations for a Clinical Research Coordinator. R&D spending on new drug development is estimated at USD 70-85 billion, and the cost of conducting clinical trials for a new drug is USD 200-250 million. Due to an increase in trained clinical research professionals, India has emerged as a global hub for clinical research: the clinical trial outsourcing opportunity in the Indian pharmaceutical industry grew to more than USD 2 billion in 2014 owing to increased outsourcing from the US and Europe.
Conclusion: Training needs assessment is recommended for newer clinical research professionals and trial sites, especially prior to the conduct of larger confirmatory clinical trials.
Keywords: clinical research, clinical trials, clinical research professionals
Procedia PDF Downloads 452
22591 Healthy Nutrition Within Institutions
Authors: Khalil Boukfoussa
Abstract:
It is important to provide students with food that contains complete nutrients to give them mental and physical energy during the school day, especially since the time students spend in school amounts to 50% of their day. This increases the importance of proper nutrition in schools and makes them an ideal place to instill the foundations of a healthy lifestyle and healthy eating habits. Proper nutrition is one of the most important factors affecting the health, growth, and development of children, and it is a key factor in supporting the ability to concentrate, supporting mental abilities, and improving students' academic achievement. In addition to its importance for the development and growth of the child's body, proper nutrition can contribute significantly to protecting the body from viruses and helping it pass the winter safely. Effective food control systems are essential in every country to protect the health and safety of domestic consumers. Such systems are also crucial in enabling countries to ensure the safety and quality of food entering international trade and to ensure that imported food conforms to national requirements. The current global food trade environment places significant obligations on both importing and exporting countries to strengthen their food control systems and to apply and implement risk-based food control strategies. Consumers are becoming more interested in the way food is produced, processed, and marketed, and are increasingly demanding that governments assume greater responsibility for consumer protection and food safety. In many countries, food control is weak because of an abundance of legislation, a multiplicity of jurisdictions, and weaknesses in control, monitoring, and enforcement.
The following guidelines seek to advise national authorities on strategies to strengthen food control systems in order to protect public health, prevent fraud and deception, avoid food contamination, and help facilitate trade. These guidelines will assist authorities in selecting the most appropriate food control system options in terms of legislation, infrastructure, and enforcement mechanisms. The document clarifies the broad principles that govern food control systems and provides examples of the infrastructure and methods by which national systems can operate.
Keywords: food, nutrition, school, safety
Procedia PDF Downloads 69