Search results for: data reliability

22749 Energy Management System and Interactive Functions of Smart Plug for Smart Home

Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya

Abstract:

Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in smart homes. A smart home integrates technologies for greater comfort, autonomy, reduced cost, and energy saving. These services can be provided to home owners for managing their home appliances locally or remotely, allowing them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described and one of them is tested on typical household appliances. This article proposes to collect data over the wireless technology and to extract smart data for the energy management system. The smart data quantifies three kinds of load: intermittent load, phantom load and continuous load. Phantom load is wasted power that goes unnoticed while an appliance is connected to, or apparently switched off from, the mains. Intermittent and continuous loads take into consideration the power and usage time of home appliances. By analysing this classification of loads, the smart data is used to reduce the communication overhead of the wireless sensor network in the energy management system.
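As a rough illustration of the load classification described in this abstract, the sketch below labels a smart-plug power trace as phantom, intermittent or continuous load; the wattage thresholds and duty-cycle cut-off are hypothetical tuning parameters, not values from the study.

```python
# Hedged sketch of the load-classification idea: label a smart-plug power trace
# as phantom, intermittent or continuous load. Thresholds are hypothetical.
import numpy as np

PHANTOM_MAX_W = 5.0        # assumed standby ceiling
ON_THRESHOLD_W = 10.0      # assumed "appliance is running" level

def classify_load(power_w):
    """power_w: array of power samples (W) for one appliance over a day."""
    p = np.asarray(power_w, dtype=float)
    if p.max() <= PHANTOM_MAX_W:
        return "phantom"                       # draws only unnoticed standby power
    duty_cycle = np.mean(p > ON_THRESHOLD_W)   # fraction of time actually running
    return "continuous" if duty_cycle > 0.8 else "intermittent"

# Hypothetical daily traces sampled once a minute.
standby = np.full(1440, 2.3)                                # e.g. a TV left plugged in
fridge = np.where(np.arange(1440) % 60 < 20, 120.0, 1.5)    # compressor cycling on/off
print(classify_load(standby), classify_load(fridge))
```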

Keywords: energy management, load profile, smart plug, wireless sensor network

Procedia PDF Downloads 267
22748 Challenging Role of Talent Management, Career Development and Compensation Management toward Employee Retention and Organizational Performance with Mediating Effect of Employee Motivation in Service Sector of Pakistan

Authors: Muhammad Younas, Sidra Sawati, M. Razzaq Athar

Abstract:

Organizational development history reveals that it has always been a challenge to identify and fathom the role of talent management, career development and compensation management in employee retention and organizational performance. Organizations strive hard to measure the impact of all the factors that affect employee retention and organizational performance. Researchers have worked a great deal to understand the relationship between the independent variables, i.e. talent management, career development and compensation management, and the dependent variables, i.e. employee retention and organizational performance. Employees equipped with the latest skills and long-lasting loyalty play a significant role in the successful achievement of the short-term as well as long-term goals of an organization. Retention of valuable and resourceful employees for a longer time is equally essential for meeting the set goals. Organizations that spend a reasonable share of their resources on measures that help retain their employees through talent management and satisfactory career development always enjoy a competitive edge over their competitors. Human resources are regarded as among the most precious and difficult resources to manage. They have their own needs and requirements, and they easily fall prey to monotony when career development is lacking. The wants and aspirations of this resource are seldom met completely but can be managed through career development and compensation management. In this era of competition, organizations have to take viable steps to manage their resources, especially their human resources. Top management and managers keep working towards workable solutions to the challenges relating to career development and compensation management, as their ultimate goal is to ensure organizational performance at an optimum level. The current study was conducted to examine the impact of talent management, career development and compensation management on employee retention and organizational performance, with the mediating effect of employee motivation, in the service sector of Pakistan. The study is based on the Resource Based View (RBV) and Ability Motivation Opportunity (AMO) theories. It explains that by increasing internal resources, organizations can manage employee talent, career development through compensation management, and employee motivation more effectively. This results in the effective execution of HRM practices for employee retention, enabling an organization to achieve and sustain competitive advantage through optimal performance. Data collection was made through a structured questionnaire based on adopted instruments after testing reliability and validity. A total of 300 employees of 30 firms in the service sector of Pakistan were sampled through a non-probability sampling technique. Regression analysis revealed that talent management, career development and compensation management have a significant positive impact on employee retention and perceived organizational performance. The results further showed that employee motivation has a significant mediating effect on employee retention and organizational performance. The interpretation of the findings, the limitations, and the theoretical and managerial implications are also discussed.

Keywords: career development, compensation management, employee retention, organizational performance, talent management

Procedia PDF Downloads 314
22747 Optimal and Critical Path Analysis of State Transportation Network Using Neo4J

Authors: Pallavi Bhogaram, Xiaolong Wu, Min He, Onyedikachi Okenwa

Abstract:

A transportation network is a realization of a spatial network, describing a structure that permits either vehicular movement or the flow of some commodity. Examples include road networks, railways, air routes, pipelines, and many more. The transportation network plays a vital role in maintaining the vigor of a nation's economy. Hence, ensuring that the network stays resilient at all times, especially in the face of challenges such as heavy traffic loads and large-scale natural disasters, is of utmost importance. In this paper, we used the Neo4j application to develop the graph. Neo4j is a leading open-source, NoSQL, native graph database that implements an ACID-compliant transactional backend for applications. The Southern California network model was developed using the Neo4j application, and the most critical and optimal nodes and paths in the network were obtained using centrality algorithms. The edge betweenness centrality algorithm calculates the critical or optimal paths using Yen's k-shortest paths algorithm, and the node betweenness centrality algorithm calculates the amount of influence a node has over the network. The preliminary study results confirm that the Neo4j application can be a suitable tool to study the important nodes and the critical paths of a major congested metropolitan area.
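The study itself runs these algorithms inside Neo4j, but the underlying centrality concepts can be sketched independently; the toy network below uses NetworkX, with hypothetical junction names and travel-time weights, purely to illustrate how node and edge betweenness identify critical nodes and paths.

```python
# Illustrative only: a toy weighted road network, not the Southern California model.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("LA", "Anaheim", 35), ("LA", "Pasadena", 20), ("LA", "LongBeach", 30),
    ("Anaheim", "Irvine", 25), ("Pasadena", "SanBernardino", 55),
    ("LongBeach", "Irvine", 28), ("Irvine", "SanDiego", 80),
    ("SanBernardino", "Riverside", 20), ("Riverside", "SanDiego", 90),
])

# Influence of each junction over shortest paths in the network.
node_bc = nx.betweenness_centrality(G, weight="weight")
# Road segments that carry the most shortest-path traffic (critical links).
edge_bc = nx.edge_betweenness_centrality(G, weight="weight")

print(max(node_bc, key=node_bc.get))   # most critical node
print(max(edge_bc, key=edge_bc.get))   # most critical edge
```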

Keywords: critical path, transportation network, connectivity reliability, network model, Neo4j application, edge betweenness centrality index

Procedia PDF Downloads 131
22746 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

The Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use the Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series. It also has the capability to generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GANs and of neural networks as hierarchical splines, to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis, including a backtest, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we include an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
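For readers unfamiliar with the setup, the following PyTorch sketch shows the general shape of a conditional GAN for series generation: a generator that maps noise plus conditioning variables to a simulated series, a discriminator that scores series-condition pairs, and one alternating training step. Network sizes, the noise dimension and the training details are illustrative assumptions, not the authors' architecture.

```python
# Minimal CGAN sketch for time-series simulation (illustrative architecture).
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, noise_dim=32, cond_dim=4, series_len=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + cond_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, series_len),
        )
    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=1))

class Discriminator(nn.Module):
    def __init__(self, cond_dim=4, series_len=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(series_len + cond_dim, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1),
        )
    def forward(self, x, c):
        return self.net(torch.cat([x, c], dim=1))

def train_step(G, D, optG, optD, real, cond, noise_dim=32):
    bce = nn.BCEWithLogitsLoss()
    b = real.size(0)
    # Discriminator step: real series vs. generated series, both conditioned.
    z = torch.randn(b, noise_dim)
    fake = G(z, cond).detach()
    d_loss = bce(D(real, cond), torch.ones(b, 1)) + bce(D(fake, cond), torch.zeros(b, 1))
    optD.zero_grad(); d_loss.backward(); optD.step()
    # Generator step: try to fool the discriminator under the same conditions.
    z = torch.randn(b, noise_dim)
    g_loss = bce(D(G(z, cond), cond), torch.ones(b, 1))
    optG.zero_grad(); g_loss.backward(); optG.step()
    return d_loss.item(), g_loss.item()
```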

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 137
22745 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucinations", the generation of outputs that are not grounded in the input data, which hinders their adoption into production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground the LLM's responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response that is based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG system is not suitable for tabular data and subsequent data analysis tasks for multiple reasons, such as information loss, data format, and the retrieval mechanism. In this study, we explore a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, and then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market-insight and data-visualization needs with high accuracy and extensive coverage, abstracting the complexities away from real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
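A highly simplified sketch of the planning-and-execution idea is given below. The `call_llm` helper is a hypothetical stand-in for whatever chat-completion API is used, and the prompts, DataFrame interface and execution strategy are illustrative assumptions rather than the DataSense implementation.

```python
# Conceptual sketch only: plan -> generate code -> execute -> summarise.
import io, contextlib
import pandas as pd

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for an LLM call returning plain text or Python code."""
    raise NotImplementedError

def analyse(question: str, df: pd.DataFrame) -> str:
    # 1. Planning agent: break the analytical question into ordered sub-tasks.
    plan = call_llm(f"Break this data question into numbered sub-tasks:\n{question}")
    # 2. Code-generation agent: turn each sub-task into executable pandas code
    #    that operates on a DataFrame named `df` and prints its findings.
    outputs = []
    for step in [s for s in plan.splitlines() if s.strip()]:
        code = call_llm(f"Write Python using the DataFrame `df` to do: {step}")
        buffer = io.StringIO()
        with contextlib.redirect_stdout(buffer):
            exec(code, {"df": df, "pd": pd})   # sandboxing omitted in this sketch
        outputs.append(f"{step}\n{buffer.getvalue()}")
    # 3. Final response: summarise the executed results in natural language.
    return call_llm("Summarise these results for a non-technical user:\n" + "\n".join(outputs))
```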

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 76
22744 Neuro-Connectivity Analysis Using ABIDE Data in Autism Study

Authors: Dulal Bhaumik, Fei Jie, Runa Bhaumik, Bikas Sinha

Abstract:

The human brain is an amazingly complex network. Aberrant activities in this network can lead to various neurological disorders such as multiple sclerosis, Parkinson’s disease, Alzheimer’s disease and autism. fMRI has emerged as an important tool to delineate the neural networks affected by such diseases, particularly autism. In this paper, we propose mixed-effects models, together with an appropriate procedure for controlling false discoveries, to detect disrupted connectivities in whole-brain studies. Results are illustrated with a large data set known as the Autism Brain Imaging Data Exchange (ABIDE), which includes 361 subjects from 8 medical centers. We believe that our findings adequately address the small-sample inference problem and are thus more reliable for identifying therapeutic targets for intervention. In addition, our results can be used for early detection of subjects who are at high risk of developing neurological disorders.
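An illustrative sketch of the statistical idea, not the authors' pipeline: fit a mixed-effects model per connection with acquisition site as a random effect, then control false discoveries with the Benjamini-Hochberg procedure. Column names and the data layout are hypothetical.

```python
# Hedged sketch: per-connection mixed-effects test with FDR control (statsmodels).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def connectivity_group_tests(df, connection_cols):
    """df: one row per subject, with 'group' (e.g. ASD/control), 'site', and one
    column of connectivity strength per brain-region pair (hypothetical names)."""
    pvals = []
    for col in connection_cols:
        # Random intercept for acquisition site accounts for multi-center effects.
        result = smf.mixedlm(f"{col} ~ group", df, groups=df["site"]).fit()
        term = [name for name in result.pvalues.index if name.startswith("group")][0]
        pvals.append(result.pvalues[term])
    # Benjamini-Hochberg keeps the false discovery rate at 5%.
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    return pd.DataFrame({"connection": connection_cols,
                         "p_raw": pvals, "p_fdr": p_adj, "disrupted": reject})
```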

Keywords: ABIDE, autism spectrum disorder, fMRI, mixed-effects model

Procedia PDF Downloads 280
22743 Environment Situation Analysis of Germany

Authors: K. Y. Chen, H. Chua, C. W. Kan

Abstract:

In this study, we analyze Germany’s environmental situation, such as water and air quality, and review its environmental policy. In addition, we collect yearly environmental data as well as information concerning public environmental investment. Based on the data collected, we try to find out the relationship between public environmental investment and sustainable development in Germany. Furthermore, after comparing the trend of environmental quality with the state of environmental policy and investment, we draw some conclusions and identify aspects worth learning from. Based upon the data collected, it was revealed that Germany has established well-developed, institutionalized environmental education, and the ecological culture at schools is dynamic and continuously renewed. The booming green market in Germany is a very successful experience to learn from: the green market not only creates a number of job opportunities but also helps the government to improve and protect the environment. Acknowledgement: The authors would like to thank the Hong Kong Polytechnic University for the financial support for this work.

Keywords: Germany, public environmental investment, environment quality, sustainable development

Procedia PDF Downloads 244
22742 Mapping the Suitable Sites for Food Grain Crops Using Geographical Information System (GIS) and Analytical Hierarchy Process (AHP)

Authors: Md. Monjurul Islam, Tofael Ahamed, Ryozo Noguchi

Abstract:

Progress continues in the fight against hunger, yet an unacceptably large number of people still lack the food they need for an active and healthy life. Bangladesh is one of the rising countries in South Asia, but many of its people are still food insecure. In the last few years, Bangladesh has made significant achievements in food grain production, but food security from the national to the individual level remains a matter of major concern. Ensuring food security for all is one of the major challenges that Bangladesh faces today, especially the production of rice in flood- and poverty-prone areas; the northern part is more vulnerable than any other part of Bangladesh. To ensure food security, one of the best ways is to increase domestic production, and to increase production it is necessary to secure land so as to achieve optimum utilization of resources. One of the measures is to identify the vulnerable and potential areas using Land Suitability Assessment (LSA) to increase rice production in the poverty-prone areas. Therefore, the aim of the study was to identify suitable sites for the production of rice, a food grain crop, in the poverty-prone areas located in the northern part of Bangladesh. A lack of knowledge of the best combination of factors that suit rice production has contributed to the low production. To fulfill the research objective, a multi-criteria analysis was carried out and a suitability map for crop production was produced with the help of a Geographical Information System (GIS) and the Analytical Hierarchy Process (AHP). Primary and secondary data were collected from ground truth information and relevant offices. The suitability levels for each factor were ranked based on the structure of the FAO land suitability classification as: Permanently Not Suitable (N2), Currently Not Suitable (N1), Marginally Suitable (S3), Moderately Suitable (S2) and Highly Suitable (S1). The suitable sites were identified using spatial analysis and compared with a recent raster image from Google Earth Pro® to validate the reliability of the suitability analysis. To produce the suitability map for rice farming using GIS and a multi-criteria analysis tool, AHP was used to rank the relevant factors, and the resultant weights were used to create the suitability map using the weighted sum overlay tool in ArcGIS 10.3®. The weighted overlay showed that 22.74% (1337.02 km2) of the study area was highly suitable, 28.54% (1678.04 km2) was moderately suitable, 14.86% (873.71 km2) was marginally suitable, and 1.19% (69.97 km2) was currently not suitable for rice farming, while 32.67% (1920.87 km2) was permanently not suitable, being occupied by settlements, rivers, water bodies and forests. This research provides information at the local level that could be used by farmers to select suitable fields for rice production, and the approach can then be applied to other crops. It will also be helpful for field workers and policy planners who serve in the agricultural sector.
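The AHP weighting step can be sketched as follows: weights come from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks the judgments. The comparison matrix and factor set below are hypothetical, not the study's criteria.

```python
# Hedged sketch of the AHP weighting step (illustrative judgments only).
import numpy as np

def ahp_weights(pairwise):
    """pairwise: reciprocal matrix of judgments on Saaty's 1-9 scale."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                       # principal eigenvector -> factor weights
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}.get(n, 1.49)
    return w, ci / ri                  # weights and consistency ratio (< 0.1 acceptable)

# Hypothetical comparison of four suitability factors, e.g. soil, flooding,
# rainfall and land use (values are for illustration only).
M = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
weights, cr = ahp_weights(M)
print(weights, cr)
```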

Keywords: AHP, GIS, spatial analysis, land suitability

Procedia PDF Downloads 230
22741 A Prediction Model of Adopting IPTV

Authors: Jeonghwan Jeon

Abstract:

With the advent of IPTV and its fierce competition with the existing broadcasting system, predicting the extent of IPTV service adoption has emerged as an important issue. This paper aims to suggest a prediction model for IPTV adoption using Classification and Ranking Belief Simplex (CaRBS). The simplex plot method of representing data allows a clear visual representation of the degree of support contributed by the variables to the prediction of the objects. CaRBS is applied to survey data on IPTV adoption.

Keywords: prediction, adoption, IPTV, CaRBS

Procedia PDF Downloads 408
22740 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized, that is, converted from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and the type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, and strong policy instruments are being utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: there exists an unresolved gap between the data science experts and the manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized; a legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the existence of this gap, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory; however, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 108
22739 Grammatically Coded Corpus of Spoken Lithuanian: Methodology and Development

Authors: L. Kamandulytė-Merfeldienė

Abstract:

The paper deals with the main methodological issues of the Corpus of Spoken Lithuanian, the development of which began in 2006. At present, the corpus consists of 300,000 grammatically annotated word forms. The creation of the corpus consists of three main stages: collecting the data, transcribing the recorded data, and grammatical annotation. Data collection was based on the principles of balance and naturalness. The recorded speech was transcribed according to the CHAT requirements of CHILDES. The transcripts were double-checked and annotated grammatically using CHILDES. The development of the Corpus of Spoken Lithuanian has led to a constant increase in studies on spontaneous communication, and various papers have dealt with the distribution of parts of speech, the use of different grammatical forms, variation in inflectional paradigms, the distribution of fillers, the syntactic functions of adjectives, and the mean length of utterances.

Keywords: CHILDES, corpus of spoken Lithuanian, grammatical annotation, grammatical disambiguation, lexicon, Lithuanian

Procedia PDF Downloads 229
22738 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rain data are very important as input for hydrological models. Among the models for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly and 10-minute) from daily data for a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov test (K-S), usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value has to be employed to show that their differences are acceptable. The results are encouraging, considering the overestimation of the generated high-resolution rainfall data.
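The distributional comparison can be sketched with SciPy's two-sample Kolmogorov-Smirnov test, as below; the rainfall arrays are synthetic placeholders, not the observed or DiMoN-generated series.

```python
# Hedged sketch of the evaluation step: compare observed and disaggregated
# rainfall distributions with the two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

def compare_distributions(observed, simulated, alpha=0.05):
    """observed, simulated: 1-D arrays of rainfall depths at the same resolution."""
    stat, p_value = ks_2samp(observed, simulated)
    return {"D": stat, "p_value": p_value, "same_distribution": p_value > alpha}

# Hypothetical wet-period depths in mm.
rng = np.random.default_rng(0)
obs = rng.gamma(shape=0.8, scale=2.0, size=500)
sim = rng.gamma(shape=0.8, scale=2.2, size=500)
print(compare_distributions(obs, sim))
```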

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 197
22737 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes currently experienced by the world have emphasized the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and the characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined threshold u. For mathematically approximating the tail of the empirical statistical distribution, the Generalised Pareto distribution is widely used, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull and the Exponential distribution, the latter being a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not universally recognized as the adequate solution to model exceedances over a certain threshold u, and references that treat the Generalised Pareto distribution as a secondary option in the case of significant wave data can be found in the literature. In this framework, the current study tackles the discussion of the application of statistical models to characterize exceedances of wave data. A comparison of the Generalised Pareto, the two-parameter Weibull and the Exponential distribution is presented for different values of the threshold u. Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully, and in each particular case one of the statistical models mentioned fits the data better than the others; depending on the value of the threshold u, different results are obtained. Other aspects of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, due to its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.
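A minimal sketch of the POT comparison, assuming synthetic wave-height data and a 95th-percentile threshold: fit the Generalised Pareto, two-parameter Weibull and Exponential distributions to the exceedances and compare them with a crude Kolmogorov-Smirnov statistic.

```python
# Hedged sketch of a POT fit comparison (synthetic data, not the buoy records).
import numpy as np
from scipy import stats

def fit_exceedances(hs, u):
    exc = hs[hs > u] - u                                    # peaks over threshold
    candidates = {
        "genpareto": stats.genpareto,
        "weibull_2p": stats.weibull_min,                    # two-parameter Weibull
        "exponential": stats.expon,
    }
    results = {}
    for name, dist in candidates.items():
        params = dist.fit(exc, floc=0)                      # keep location fixed at 0
        ks = stats.kstest(exc, dist.cdf, args=params)       # crude goodness-of-fit check
        results[name] = {"params": params, "KS_D": ks.statistic, "p": ks.pvalue}
    return results

# Hypothetical significant wave height record in metres.
rng = np.random.default_rng(1)
hs = rng.weibull(1.5, size=2000) * 2.5
print(fit_exceedances(hs, u=np.quantile(hs, 0.95)))
```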

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 265
22736 Cadmium Separation from Aqueous Solutions by Natural Biosorbents

Authors: Z. V. P. Murthy, Preeti Arunachalam, Sangeeta Balram

Abstract:

The removal of metal ions from different wastewaters has become important due to their effects on living beings. Cadmium is one of the heavy metals found in various industrial wastewaters. There are many conventional methods available to remove heavy metals from wastewaters, such as adsorption, membrane separation, precipitation, and electrolytic methods, and all of them have their own advantages and disadvantages. The present work deals with the use of natural biosorbents (chitin and chitosan) to separate cadmium ions from aqueous solutions. The adsorption data were fitted with different isotherm and kinetic models. Amongst the different adsorption isotherms used to fit the adsorption data, the Freundlich isotherm showed the better fit for both biosorbents. The kinetic data of cadmium adsorption showed a better fit with the pseudo-second-order model for both biosorbents. Chitosan, the derivative of chitin, showed better performance than chitin. The separation results are encouraging.
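The isotherm-fitting step can be sketched with a nonlinear least-squares fit of the Freundlich model q_e = K_F C_e^(1/n); the equilibrium data points below are hypothetical, not the study's measurements.

```python
# Hedged sketch of fitting the Freundlich isotherm to equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(ce, kf, n):
    return kf * ce ** (1.0 / n)

# Hypothetical equilibrium concentrations (mg/L) and uptakes (mg/g).
ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([3.1, 4.9, 7.6, 11.8, 18.4])

(kf, n), _ = curve_fit(freundlich, ce, qe, p0=[1.0, 2.0])
ss_res = np.sum((qe - freundlich(ce, kf, n)) ** 2)
ss_tot = np.sum((qe - qe.mean()) ** 2)
print(f"K_F={kf:.3f}, n={n:.3f}, R^2={1 - ss_res / ss_tot:.4f}")
```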

Keywords: chitin, chitosan, cadmium, isotherm, kinetics

Procedia PDF Downloads 403
22735 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar

Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna

Abstract:

This paper presents an analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), space-variant defocusing and geometric distortion effects are mitigated in RMA, since it does not assume that the illuminating wave-fronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range de-skew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the resolution of the final image on the imaging geometry. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated in this paper.

Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation

Procedia PDF Downloads 236
22734 Prevention of Student Radicalism in School through Civic Education

Authors: Triyanto

Abstract:

Radicalism poses a real threat to Indonesia's future. The target of radicalism is the youth of Indonesia, as evidenced by the fact that the majority of terrorists are young people. Countering radicalization requires not only repressive action but also educational action, and one such educational effort is civic education. This study discusses the prevention of radicalism among students through civic education and the constraints involved. This is qualitative research. Data were collected through literature studies, observations and in-depth interviews, and were validated by triangulation. The sample of this research is 30 high school students in Surakarta. Data were analyzed with the interactive model of analysis of Miles & Huberman. The results show that (1) civic education can be a way of preventing student radicalism in schools by cultivating educational values through learning in the classroom and outside the classroom; (2) the obstacles encountered include the lack of learning facilities, the limited ability of teachers and the low attention of students to civic education.

Keywords: prevention, radicalism, senior high school student, civic education

Procedia PDF Downloads 223
22733 Two-Channels Thermal Energy Storage Tank: Experiments and Short-Cut Modelling

Authors: M. Capocelli, A. Caputo, M. De Falco, D. Mazzei, V. Piemonte

Abstract:

This paper presents the experimental results and the related modeling of a thermal energy storage (TES) facility, conceived and realized by ENEA, which realizes the thermocline with an innovative geometry. Firstly, the thermal energy exchange model of an equivalent shell-and-tube heat exchanger is described and tested to reproduce the performance of the spiral exchanger installed in the TES. Through regression of the experimental data, a first-order thermocline model was also validated to provide an analytical function of the thermocline, useful for performance evaluation, for comparison with other systems, and for implementation in simulations of integrated systems (e.g. power plants). The experimental data obtained from the plant start-up and the short-cut modeling of the system can be useful for process analysis, for the scale-up of the thermal storage system, and to investigate the feasibility of its implementation in actual case studies.

Keywords: CSP plants, thermal energy storage, thermocline, mathematical modelling, experimental data

Procedia PDF Downloads 325
22732 An Approach for Reliably Transforming Habits Towards Environmental Sustainability Behaviors Among Young Adults

Authors: Dike Felix Okechukwu

Abstract:

Studies and reports from authoritative sources such as the Intergovernmental Panel on Climate Change (IPCC) have stated that, to effectively solve environmental sustainability challenges such as pollution, inappropriate waste disposal, and unsustainable consumption, more research is needed to seek pathways towards environmentally sustainable behavior. However, the literature thus far reports only sporadic developments of Transformative Learning (TL) in environmental sustainability, because there are scarce reports showing a reliable process for producing TL, whether for sustainability projects or otherwise. Nonetheless, a recently published article demonstrates how TL can be used to help young adults gain transformed mindsets and habits toward environmental sustainability behaviors and practices. That study, however, does not demonstrate on a repeated basis the dependability of the method or the reliability of the procedures of its proposed methodology in helping young adults achieve transformed habits towards environmental sustainability behaviors, especially in diverse contexts. In this study, a reliable process that can be used to achieve transformations in habits and mindsets toward environmental sustainability behaviors is demonstrated through repeated measures. To achieve this, the design adopted is a multiple case study with thematic analysis techniques. Five cases in diverse contexts were used to analyze evidence of transformative learning outcomes toward environmentally sustainable behaviors. Results from the study offer fresh perspectives on a reliable methodology that can be adopted to achieve transformations in habits and mindsets toward environmental sustainability behaviors.

Keywords: environmental sustainability, transformative learning, behaviour, learning, education

Procedia PDF Downloads 89
22731 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored and transferred. Dimensionality reduction techniques can be used to reduce the volume of data. In this paper, an approach to band selection based on clustering algorithms is presented; this approach makes it possible to reduce the volume of data. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. New attributes relative to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and are applicable to hyperspectral images.
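A minimal sketch of clustering-based band selection: run Fuzzy C-Means on per-band feature vectors (here variance and kurtosis) and keep the band closest to each cluster centre. The feature choice, cube and parameters are illustrative and do not reproduce the NWHFC step.

```python
# Hedged sketch of Fuzzy C-Means band selection on per-band statistics.
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)             # fuzzy memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return U, centres

def select_bands(band_features, n_bands):
    _, centres = fuzzy_c_means(band_features, n_bands)
    d = np.linalg.norm(band_features[:, None, :] - centres[None, :, :], axis=2)
    return np.unique(d.argmin(axis=0))            # one representative band per cluster

# Hypothetical cube: 50 bands of a 64x64 scene -> per-band variance and kurtosis.
rng = np.random.default_rng(1)
flat = rng.normal(size=(64, 64, 50)).reshape(-1, 50)
mu, sd = flat.mean(0), flat.std(0)
kurt = ((flat - mu) ** 4).mean(0) / sd ** 4 - 3.0
features = np.column_stack([flat.var(0), kurt])
features = (features - features.mean(0)) / features.std(0)   # common scale for attributes
print(select_bands(features, n_bands=5))
```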

Keywords: band selection, fuzzy c-means, k-means, hyperspectral image

Procedia PDF Downloads 398
22730 Development of a Remote Testing System for Performance of Gas Leakage Detectors

Authors: Gyoutae Park, Woosuk Kim, Sangguk Ahn, Seungmo Kim, Minjun Kim, Jinhan Lee, Youngdo Jo, Jongsam Moon, Hiesik Kim

Abstract:

In this research, we designed a remote system to test parameters of gas detectors, such as gas concentration and initial response time. The testing system can measure two gas instruments simultaneously. First of all, we assembled an experimental jig with a square structure, which includes a glass flask, two high-quality cameras, and two Ethernet modems for transmitting data. The remote gas detector testing system extracts the numerals shown on the detectors' LCDs from the camera videos while the gas concentration is continuously varied. The extracted numerical data are transmitted to a laptop computer through the Ethernet modems, and the gas concentrations and measured initial response times are then recorded and graphed. Our remote testing system will be applied to a variety of gas detector tests and used for certification both domestically and internationally.

Keywords: gas leak detector, inspection instrument, extracting numerals, concentration

Procedia PDF Downloads 372
22729 Impact of Mathematical Modeling on Mathematics Achievement, Attitude, and Interest of Pre-Service Teachers in Niger State, Nigeria

Authors: Mohammed Abubakar Ndanusa, A. A. Hassan, R. W. Gimba, A. M. Alfa, M. T. Abari

Abstract:

This study investigated the impact of mathematical modeling on the mathematics achievement, attitude and interest of pre-service teachers in Niger State, Nigeria. It was an attempt to ease students’ difficulties in comprehending mathematics. The study used a randomized pretest-posttest control group design. Two Colleges of Education were purposively selected from Niger State, with a sample size of eighty-four (84) students. Three research instruments were used: the Mathematical Modeling Achievement Test (MMAT), the Attitudes Towards Mathematical Modeling Questionnaire (ATMMQ) and the Mathematical Modeling Students Interest Questionnaire (MMSIQ). The Pearson Product Moment Correlation (PPMC) formula was used for the MMAT, and Cronbach's Alpha was used for the ATMMQ and MMSIQ, to determine their reliability coefficients; values of 0.76, 0.75 and 0.73 were obtained, respectively. Independent t-test statistics were used to test Hypothesis One, while the Mann-Whitney U-test was used to test Hypotheses Two and Three. Findings revealed that students taught mathematics using mathematical modeling performed better than their counterparts taught using the lecture method. There was also a significant difference in the attitude and interest of pre-service mathematics teachers after being exposed to mathematical modeling. The strategy is therefore recommended for use by mathematics teachers with a view to improving students’ attitude and interest towards mathematics. Also, modeling should be taught at the NCE level in order to prepare pre-service teachers for real tasks in the field of mathematics.
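For reference, the Cronbach's Alpha reliability coefficient reported above can be computed from an items matrix as in the sketch below; the Likert-scale responses are hypothetical, not the instrument data.

```python
# Hedged sketch: Cronbach's alpha from a respondents x items score matrix.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical Likert-scale responses: 20 respondents, 8 items.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(20, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(20, 8)), 1, 5)
print(round(cronbach_alpha(scores), 2))
```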

Keywords: achievement, attitude, interest, mathematical modeling, pre-service teachers

Procedia PDF Downloads 297
22728 The Galactic Magnetic Field in the Light of Starburst-Generated Ultrahigh-Energy Cosmic Rays

Authors: Luis A. Anchordoqui, Jorge F. Soriano, Diego F. Torres

Abstract:

Auger data show evidence for a correlation between ultrahigh-energy cosmic rays (UHECRs) and nearby starburst galaxies. This intriguing correlation is consistent with data collected by the Telescope Array, which have revealed a much more pronounced directional 'hot spot' in arrival directions not far from the starburst galaxy M82. In this work, we assume starbursts are sources of UHECRs, and we investigate the prospects to use the observed distribution of UHECR arrival directions to constrain galactic magnetic field models. We show that if the Telescope Array hot spot indeed originates on M82, UHECR data would place a strong constraint on the turbulent component of the galactic magnetic field.

Keywords: galactic magnetic field, Pierre Auger observatory, telescope array, ultra-high energy cosmic rays

Procedia PDF Downloads 142
22727 Emotion Mining and Attribute Selection for Actionable Recommendations to Improve Customer Satisfaction

Authors: Jaishree Ranganathan, Poonam Rajurkar, Angelina A. Tzacheva, Zbigniew W. Ras

Abstract:

In today’s world, business often depends on customer feedback and reviews. Sentiment analysis helps identify and extract information about the sentiment or emotion of a topic or document. Attribute selection is a challenging problem, especially with large datasets, in actionable pattern mining algorithms. Action rule mining is one of the methods for discovering actionable patterns from data. Action rules describe specific actions, in the form of conditions, that help move from an undesirable or negative state to a more desirable or positive state and thereby achieve the desired outcome. In this paper, we present a lexicon-based weighted-scoring approach to identify emotions from customer feedback data in the area of manufacturing business. We also use rough sets and explore attribute selection methods for large-scale datasets. We then apply actionable pattern mining to extract possible emotion-change recommendations. Such recommendations help business analysts improve their customer service, which leads to customer satisfaction and increased sales revenue.
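A toy version of the lexicon-based weighted scoring step is sketched below; the lexicon entries, weights and emotion labels are hypothetical placeholders rather than the lexicon used in the paper.

```python
# Hedged sketch of lexicon-based weighted emotion scoring for feedback text.
from collections import defaultdict

EMOTION_LEXICON = {            # word -> (emotion, weight); hypothetical entries
    "great": ("joy", 0.9), "love": ("joy", 1.0),
    "broken": ("anger", 0.8), "refund": ("anger", 0.5),
    "waiting": ("sadness", 0.4), "disappointed": ("sadness", 0.9),
}

def score_emotions(text):
    scores = defaultdict(float)
    for token in text.lower().split():
        token = token.strip(".,!?")
        if token in EMOTION_LEXICON:
            emotion, weight = EMOTION_LEXICON[token]
            scores[emotion] += weight          # accumulate weighted evidence
    return max(scores, key=scores.get) if scores else "neutral"

print(score_emotions("Disappointed with the broken part, still waiting for a refund!"))
```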

Keywords: actionable pattern discovery, attribute selection, business data, data mining, emotion

Procedia PDF Downloads 195
22726 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis

Authors: Shriya Shukla, Lachin Fernando

Abstract:

Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.
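A sketch of the classifier side only, under the assumption that the VAE-GAN augmentation has already produced a balanced training set: a frozen MobileNetV2 backbone with a small binary head in Keras. Input size, head layers and hyperparameters are illustrative.

```python
# Hedged sketch of a lightweight MobileNetV2 classifier for two-class CXR images.
import tensorflow as tf

def build_pneumonia_model(input_shape=(224, 224, 3)):
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights="imagenet")
    base.trainable = False                       # freeze the backbone first
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.2)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)   # pneumonia vs. normal
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_pneumonia_model()
model.summary()
```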

Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning

Procedia PDF Downloads 90
22725 Thermal and Geometric Effects on Nonlinear Response of Incompressible Hyperelastic Cylindrical Shells

Authors: Morteza Shayan Arani, Mohammadamin Esmailzadehazimi, Mohammadreza Moeini, Mohammad Toorani, Aouni A. Lakis

Abstract:

This paper investigates the nonlinear response of thin, incompressible, hyperelastic cylindrical shells in the presence of a time-varying temperature field while considering initial geometric imperfections. The governing equations of motion are derived using an improved Donnell's shallow shell theory. The hyperelastic material is modeled using the two-parameter Mooney-Rivlin model, incorporating temperature-dependent terms. The Lagrangian method is applied to obtain the equation of motion, and the resulting governing equation is addressed through the Lindstedt-Poincaré and multiple scales methods. The linear and nonlinear models presented in this study are verified against the existing open literature, demonstrating the accuracy and reliability of the presented model. The study focuses on understanding the influence of temperature variations and geometric imperfections on the natural frequency and the amplitude-frequency response of the systems. Notably, the investigation reveals the coexistence of hardening and softening peaks in the amplitude-frequency response, which vary in magnitude depending on these parameters. Additionally, resonance peaks exhibit changes as a result of temperature and geometric imperfections.
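For reference, the classical two-parameter Mooney-Rivlin strain energy density for an incompressible material takes the form below, where I_1 and I_2 are the first and second invariants of the left Cauchy-Green deformation tensor and C_10, C_01 are material constants; the temperature-dependent terms added in the study are not reproduced here.

```latex
W = C_{10}\,(I_1 - 3) + C_{01}\,(I_2 - 3)
```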

Keywords: hyperelastic material, cylindrical shell, geometrical nonlinearity, material nonlinearity, initial geometric imperfection, temperature gradient, hardening and softening

Procedia PDF Downloads 65
22724 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on the rainfall, maximum and minimum temperature, and water level data of a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of the individual events. The results of the trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit to the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
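The univariate frequency-analysis step can be sketched with SciPy's GEV tools: fit the distribution to annual maxima and read return levels off the fitted quantile function. The series below is synthetic, not the Central Park or Battery Park record.

```python
# Hedged sketch: GEV fit and return levels for annual maxima (synthetic data).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_max_level = genextreme.rvs(c=-0.1, loc=1.5, scale=0.4, size=60, random_state=rng)

c, loc, scale = genextreme.fit(annual_max_level)
for T in (2, 5, 25, 50, 100, 200, 500, 1000):
    # Return level = quantile with non-exceedance probability 1 - 1/T.
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:4d}-year water level: {level:.2f} m")
```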

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 353
22723 Numerical Solution of Space Fractional Order Linear/Nonlinear Reaction-Advection Diffusion Equation Using Jacobi Polynomial

Authors: Shubham Jaiswal

Abstract:

In the modelling of many physical problems and engineering processes, fractional calculus plays an important role; such problems are well described by fractional differential equations (FDEs). A reliable and efficient technique to solve such FDEs is therefore needed. In this article, a numerical solution of a class of fractional differential equations, namely space-fractional-order reaction-advection-dispersion equations subject to initial and boundary conditions, is derived. In the proposed approach, shifted Jacobi polynomials are used to approximate the solutions, together with the shifted Jacobi operational matrix of fractional order and the spectral collocation method. The main advantage of this approach is that it converts such problems into systems of algebraic equations, which are easier to solve. The proposed approach is effective for solving linear as well as non-linear FDEs. To show the reliability, validity and high accuracy of the proposed approach, numerical results for some illustrative examples are reported and compared with existing analytical results already reported in the literature. The error analysis for each case, exhibited through graphs and tables, confirms the exponential convergence rate of the proposed method.
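For reference, the space-fractional derivative referred to above is typically taken in the Caputo sense; its standard definition for n-1 < α < n is

```latex
{}^{C}D^{\alpha} f(x) \;=\; \frac{1}{\Gamma(n-\alpha)} \int_{0}^{x} \frac{f^{(n)}(\tau)}{(x-\tau)^{\alpha-n+1}}\, d\tau , \qquad n-1 < \alpha < n,\; n \in \mathbb{N}.
```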

Keywords: space fractional order linear/nonlinear reaction-advection diffusion equation, shifted Jacobi polynomials, operational matrix, collocation method, Caputo derivative

Procedia PDF Downloads 440
22722 Application of Powder Metallurgy Technologies for Gas Turbine Engine Wheel Production

Authors: Liubov Magerramova, Eugene Kratt, Pavel Presniakov

Abstract:

A detailed analysis has been performed for several schemes of gas turbine wheel production based on additive and powder technologies, including metal, ceramic, and stereolithography 3-D printing. During the development and debugging of gas turbine engine components, different versions of these components must be manufactured and tested. Cooled turbine blades are among these components; they are usually produced by traditional casting methods, which require the long and costly design and manufacture of casting molds. Moreover, traditional manufacturing methods limit the design possibilities for complex critical parts of the engine, so the capabilities of Powder Metallurgy Techniques (PMT) were analyzed for manufacturing the turbine wheel with air-cooled blades. PMT dramatically reduce the time needed for such production and allow the creation of new complex design solutions aimed at improving the technical characteristics of the engine: improving fuel efficiency and environmental performance, increasing reliability, and reducing weight. To accelerate and simplify the blade manufacturing process, several options based on additive technologies were used. The options were implemented in the form of various casting equipment for the manufacturing of the blades. Methods of powder metallurgy were applied for connecting the blades with the disc. The optimal production scheme and a set of technologies for the manufacturing of the blades, the turbine wheel and other parts of the engine can be selected on the basis of the options considered.

Keywords: additive technologies, gas turbine engine, powder technology, turbine wheel

Procedia PDF Downloads 312
22721 Estimation of Source Parameters Using Source Parameters Imaging Method From Digitised High Resolution Airborne Magnetic Data of a Basement Complex

Authors: O. T. Oluriz, O. D. Akinyemi, J. A.Olowofela, O. A. Idowu, S. A. Ganiyu

Abstract:

This study was carried out using aeromagnetic data, which record variations in the magnitude of the earth's magnetic field, in order to detect local changes in the properties of the underlying geology. The aeromagnetic data (Sheet No. 261), obtained in 2009, were acquired from the archives of the Nigeria Geological Survey Agency. The study presents the estimation of source parameters within an area of about 3,025 square kilometers covering Ibadan and its environs in Oyo State, southwestern Nigeria. The area under study belongs to part of the basement complex of southwestern Nigeria. Estimation of the source parameters of the aeromagnetic data was achieved through the application of the source parameters imaging (SPI) technique, which provides the delineation, depth, dip contact, susceptibility contrast and mineral potential of magnetic signatures within the region. The depth to the magnetic sources in the area ranges from 0.675 km to 4.48 km. The estimated depth limit to shallow sources is 0.695 km and the depth to deep sources is 4.48 km. The apparent susceptibility values obtained for the entire study area range from 0.005 to 0.01 [SI]. This study has shown that the magnetic susceptibility within the study area is controlled mainly by superparamagnetic minerals.

Keywords: aeromagnetic, basement complex, meta-sediment, precambrian

Procedia PDF Downloads 426
22720 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey's measure and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all the measures: a sliding window with a length equal to 10% of the total number of data entries is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but can also be used for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
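A simplified sketch of one of the three measures, a sliding-window Hurst exponent via rescaled-range (R/S) analysis, is given below; the 10% window length follows the description above, while the R/S details and test signal are standard textbook choices rather than FRATSAN's implementation.

```python
# Hedged sketch: sliding-window Hurst exponent using a rescaled-range estimator.
import numpy as np

def hurst_rs(x):
    """Rescaled-range estimate of the Hurst exponent for a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = np.unique(np.floor(np.logspace(np.log10(8), np.log10(n // 2), 10)).astype(int))
    rs = []
    for s in sizes:
        chunks = x[: n - n % s].reshape(-1, s)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)          # range of cumulative deviations
        S = chunks.std(axis=1) + 1e-12                 # standard deviation per chunk
        rs.append(np.mean(R / S))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)   # log(R/S) ~ H * log(s)
    return slope

def sliding_hurst(signal, window_frac=0.10):
    w = max(32, int(len(signal) * window_frac))        # 10% window, moved one sample at a time
    return np.array([hurst_rs(signal[i:i + w]) for i in range(len(signal) - w + 1)])

# Hypothetical white-noise test signal (expected H around 0.5).
rng = np.random.default_rng(0)
sig = rng.normal(size=2000)
print(sliding_hurst(sig)[:5])
```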

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 463