Search results for: proposed module
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9459

2469 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications

Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison

Abstract:

In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both the cooling and heating demands of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes, as compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.
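
As a hedged illustration of the optimization step described above, the sketch below evolves candidate designs against two conflicting objectives (annual primary energy versus total cost) and retains the non-dominated set; the toy objective functions, design variables, and GA settings are illustrative assumptions, not the TRNSYS-coupled models used in the study.

```python
import random

# Toy stand-ins for the two conflicting objectives (assumed placeholders: the real
# values would come from the annual TRNSYS simulation of the plant).
def primary_energy(x):            # x = (collector_area_m2, storage_m3)
    area, vol = x
    return 1.5e3 / (1.0 + 0.01 * area + 0.05 * vol)    # MWh/yr, decreases with plant size

def total_cost(x):
    area, vol = x
    return 400.0 * area + 900.0 * vol                  # $, increases with plant size

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

BOUNDS = [(50.0, 1000.0), (1.0, 50.0)]                 # assumed design-variable ranges

def random_design():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(x, rate=0.2):
    out = []
    for xi, (lo, hi) in zip(x, BOUNDS):
        if random.random() < 0.5:
            xi += random.gauss(0, rate * (hi - lo))
        out.append(min(hi, max(lo, xi)))
    return out

def pareto_front(designs):
    scored = [(x, (primary_energy(x), total_cost(x))) for x in designs]
    return [x for x, fx in scored if not any(dominates(fy, fx) for _, fy in scored)]

pop = [random_design() for _ in range(60)]
for _ in range(100):                                   # generations
    front = pareto_front(pop)                          # non-dominated designs become parents
    children = [mutate(crossover(random.choice(front), random.choice(front)))
                for _ in range(len(pop))]
    pop = front + children[: len(pop) - len(front)]

for x in sorted(pareto_front(pop), key=primary_energy)[:5]:
    print(f"area={x[0]:.0f} m2, storage={x[1]:.1f} m3 -> "
          f"energy={primary_energy(x):.1f} MWh/yr, cost=${total_cost(x):,.0f}")
```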

Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller

Procedia PDF Downloads 238
2468 Flow: A Fourth Musical Element

Authors: James R. Wilson

Abstract:

Music is typically defined as having the attributes of melody, harmony, and rhythm. In this paper, a fourth element is proposed: "flow". "Flow" is a new dimension in music that has always been present but only recently identified and measured. The Adagio "Flow Machine" enables us to envision this component and even suggests a new approach to music theory and analysis. The Adagio was created specifically to measure the underlying "flow" in music. The Adagio is an entirely new way to experience and visualize music, to assist in performing music (as a conductor and/or performer), and to provide a whole new methodology for music analysis and theory. The Adagio utilizes musical "hit points", such as a transition from one musical section to another (for example, in a composition in sonata form, the transition from the exposition to the development section), to help define the composition's flow rate. Once the flow rate is established, the Adagio can be used to determine whether the composer/performer/conductor has correctly maintained the proper rate of flow throughout the performance. An example is provided using Mozart's Piano Concerto No. 21. Working with the Adagio yielded an unexpected windfall: an empirical study conducted at Nova University's Biofeedback Lab determined that watching the Adagio helped volunteers participating in a controlled experiment recover from stressors significantly faster than the control group. The Adagio can be thought of as a new arrow in the musicologist's quiver. It provides a new, unique way of viewing the psychological impact and esthetic effectiveness of music composition. Additionally, with current worldwide access to multimedia via the internet, flow analysis can be performed and shared with others with little time and/or expense.

Keywords: musicology, music analysis, music flow, music therapy

Procedia PDF Downloads 176
2467 Defect Classification of Hydrogen Fuel Pressure Vessels using Deep Learning

Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim

Abstract:

Acoustic Emission Testing (AET) is widely used to test the structural integrity of operational hydrogen storage containers, and clustering algorithms are frequently used in pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user, as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is necessary to use a deep learning model to identify patterns in acoustic emission (AE) signal data that can be used to classify defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed from carbon fiber reinforced polymer composite (CFRP), a defect classification dataset is collected through a tensile test on a CFRP specimen with an AE sensor attached. The classification model, using a one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect class. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.
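
As a hedged sketch of the kind of classifier described above, the snippet below defines a small 1-D CNN over fixed-length AE waveform windows; the window length, channel counts, and four defect classes are assumptions rather than the authors' architecture, and SMOTE (from imbalanced-learn) would be applied to the training split before batching.

```python
import torch
import torch.nn as nn

class AEWaveformCNN(nn.Module):
    """Minimal 1-D CNN over raw AE waveform windows (assumed shapes, not the paper's)."""
    def __init__(self, n_classes=4, window=2048):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=16, stride=2), nn.ReLU(), nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1),                      # global pooling -> (batch, 32, 1)
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                                 # x: (batch, 1, window)
        return self.classifier(self.features(x).squeeze(-1))

model = AEWaveformCNN()
waveforms = torch.randn(8, 1, 2048)                       # dummy AE windows
labels = torch.randint(0, 4, (8,))                        # dummy defect labels
loss = nn.CrossEntropyLoss()(model(waveforms), labels)
loss.backward()                                           # one training step (optimizer omitted)
print("training loss:", loss.item())
```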

Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, smote data augmentation

Procedia PDF Downloads 93
2466 Post Injury Experiences of New Immigrant Workers

Authors: Janki Shankar, Shu Ping Chen

Abstract:

Background: New immigrants are one of the most vulnerable groups in Canadian society. Unable to gain entry into Canada's strictly regulated professions and trades, several skilled and qualified new immigrants take up precarious jobs without adequate occupational health and safety training, thereby increasing their risk of sustaining occupational injury and illness compared to Canadian-born workers. Access to timely and appropriate support is critical for injured new immigrant workers, who face additional challenges compared to Canadian-born workers in accessing information and support post-injury. The purpose of our study was to explore the post-injury experiences and support needs of new immigrant workers who have sustained work-related injuries. Methods: Using an interpretive research approach and semi-structured, face-to-face qualitative interviews, 27 new immigrant workers from a range of industries operating in two cities in a province in Canada were interviewed. All had sustained work-related injuries and reported these to their work supervisors. A constant comparative approach was used to identify key themes across the worker experiences. Results: Findings reveal several factors that shape the experiences of new immigrant workers and influence their return-to-work outcomes. Conclusion: Based on the insights of study participants, policies, practices, and potential interventions informed by their needs and preferences are proposed that can improve return-to-work outcomes for these workers.

Keywords: new immigrant workers, post-injury experiences, return to work outcomes, qualified

Procedia PDF Downloads 100
2465 Reducing Uncertainty in Climate Projections over Uganda by Numerical Models Using Bias Correction

Authors: Isaac Mugume

Abstract:

Since the beginning of the 21st century, climate change has been a pressing issue due to the reported rise in global temperature and changes in the frequency as well as severity of extreme weather and climatic events. The changing climate has been attributed to rising concentrations of greenhouse gases as well as environmental changes such as ecosystem and land-use change. Climatic projections have been carried out under the auspices of the Intergovernmental Panel on Climate Change, where a number of models have been run to inform us about the likelihood of future climates. Since one of the major forcings driving the changing climate is the emission of greenhouse gases, different scenarios have been proposed and future climates for different periods presented. The global climate models project different areas to experience different impacts. While regional modeling is being carried out for high-impact studies, bias correction is less documented. Yet regional climate models suffer from biases, which introduce uncertainty. This study addresses the issue by bias correcting the regional models. It uses the Weather Research and Forecasting model under different representative concentration pathways and corrects the products of these models using observed climatic data. The study notes that bias correction (e.g., the running-mean bias correction, the best easy systematic estimator method, the simple linear regression method, nearest neighborhood, and weighted mean) improves the climatic projection skill and therefore reduces the uncertainty inherent in the climatic projections.
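
As a hedged illustration of the simple linear-regression bias correction listed above, the sketch below regresses model output on observations over a calibration period and applies the fitted relation to a projection; the synthetic arrays are placeholders, not WRF output or Ugandan station data.

```python
import numpy as np

# Synthetic stand-ins: observed daily temperature, a model run with a linear bias over
# the same calibration period, and a raw projection carrying the same bias.
rng = np.random.default_rng(0)
obs_calib = 24.0 + 2.0 * rng.standard_normal(365)                      # station observations (degC)
model_calib = 0.8 * obs_calib + 6.0 + 0.5 * rng.standard_normal(365)   # biased model, same period
model_future = 0.8 * (obs_calib + 1.5) + 6.0                           # raw projection (+1.5 degC signal)

# Fit obs = a * model + b on the calibration period, then apply it to the projection.
a, b = np.polyfit(model_calib, obs_calib, deg=1)
corrected_future = a * model_future + b

raw_bias = model_calib.mean() - obs_calib.mean()
corrected_bias = (a * model_calib + b).mean() - obs_calib.mean()
print(f"calibration bias: raw {raw_bias:+.2f} degC, corrected {corrected_bias:+.2f} degC")
print(f"projected warming after correction: {corrected_future.mean() - obs_calib.mean():+.2f} degC")
```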

Keywords: bias correction, climatic projections, numerical models, representative concentration pathways

Procedia PDF Downloads 117
2464 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs

Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres

Abstract:

Given a patent document, identifying distinct semantic annotations is an interesting research problem. Text annotation helps patent practitioners such as examiners and patent attorneys to quickly identify the key arguments of any invention, thus providing a timely marking of a patent text. In manual patent analysis, it is common practice to mark paragraphs with their semantic role to attain better readability. This semantic annotation process is laborious and time-consuming. To alleviate this problem, we propose a dataset to train machine learning algorithms to automate the highlighting process. The contributions of this work are: i) we developed a multi-class dataset of 150k samples by traversing USPTO patents over a decade, ii) we articulated statistics and distributions of the data using imperative exploratory data analysis, iii) baseline machine learning models are developed that utilize the dataset to address the patent paragraph highlighting task, and iv) a future path to extend this work using deep learning and domain-specific pre-trained language models to develop a highlighting tool is provided. This work assists patent practitioners in highlighting semantic information automatically and aids in creating a sustainable and efficient patent analysis using the aptitude of machine learning.
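
A hedged sketch of a baseline paragraph classifier of the kind such a dataset would support is shown below; the toy paragraphs and the three labels are illustrative placeholders, not the PaSA label set or its 150k samples.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder patent paragraphs and labels (illustrative only).
paragraphs = [
    "The prior art fails to provide a sealing arrangement that ...",
    "It is therefore an object of the invention to reduce friction ...",
    "In one embodiment, the housing comprises a polymer liner ...",
] * 50
labels = ["problem", "advantage", "embodiment"] * 50

X_train, X_test, y_train, y_test = train_test_split(
    paragraphs, labels, test_size=0.2, random_state=42, stratify=labels)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),   # word + bigram features
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```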

Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval

Procedia PDF Downloads 88
2463 Caged in Concrete Jungles: Reasserting Cultural Identity and Environmental Sustainability through Material Choice and Design Expression in Architecture

Authors: Ikenna Michael Onuorah

Abstract:

The relentless march of globalization in architecture has led to a homogenization of built environments, often characterized by an overreliance on imported, resource-intensive materials and a disregard for local cultural contexts. This research posits that such practices pose significant environmental and cultural perils, trapping communities in "caged concrete jungles" devoid of both ecological sustainability and a meaningful connection to their heritage. Through a mixed-method approach encompassing quantitative and qualitative data analysis, the study investigated the impacts of neglecting local materials and cultural expression in architectural design. The research is anticipated to yield significant insights into the multifaceted consequences of neglecting locally available materials and cultural expression in architecture, and it builds a compelling case for reasserting local materials and cultural expression in architectural design. Based on the anticipated findings, the study proposes a series of actionable recommendations for architects, policymakers, and communities to promote sustainable and culturally sensitive built environments. This will serve as a wake-up call, urging architects, policymakers, and communities to break free from the confines of "caged concrete jungles" and embrace a more sustainable and culturally sensitive approach to design.

Keywords: sustainability, cultural identity, building materials, sustainable designs

Procedia PDF Downloads 54
2462 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data

Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model is widely used for frequent itemset mining over data streams due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is based on a fixed number of transactions. This model assumes that all transactions arrive at a constant rate, which is not suited to real-time applications, and its use in such applications degrades their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent and infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.
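
The sketch below is a hedged, minimal illustration of the timestamp-based sliding window idea: the window spans a fixed time interval rather than a fixed number of transactions, so bursts of arrivals naturally enlarge its content; the paper's incremental tree structure is replaced here by plain counters for brevity.

```python
from collections import Counter, deque
from itertools import combinations

WINDOW_SECONDS = 60          # window defined by time span, not transaction count
MIN_SUPPORT = 2
MAX_ITEMSET_SIZE = 2

window = deque()             # (timestamp, transaction) pairs currently inside the window
counts = Counter()           # itemset -> support within the window

def add_transaction(ts, items):
    # Expire transactions older than the time span.
    while window and ts - window[0][0] > WINDOW_SECONDS:
        _, old = window.popleft()
        for k in range(1, MAX_ITEMSET_SIZE + 1):
            for itemset in combinations(sorted(old), k):
                counts[itemset] -= 1
    window.append((ts, items))
    for k in range(1, MAX_ITEMSET_SIZE + 1):
        for itemset in combinations(sorted(items), k):
            counts[itemset] += 1

stream = [(0, {"a", "b"}), (10, {"a", "c"}), (15, {"a", "b"}), (90, {"b", "c"})]
for ts, tx in stream:
    add_transaction(ts, tx)
    frequent = {s: c for s, c in counts.items() if c >= MIN_SUPPORT}
    print(f"t={ts:3d}s frequent itemsets: {frequent}")
```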

Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query

Procedia PDF Downloads 160
2461 Text as Reader Device Improving Subjectivity on the Role of Attestation between Interpretative Semiotics and Discursive Linguistics

Authors: Marco Castagna

Abstract:

The proposed paper aims to inquire into the relation between text and reader, focusing on the concept of 'attestation'. Indeed, despite being widely accepted in semiotic research, even today the concept of text remains uncertainly defined. It seems undeniable that what is called 'text' offers an image of internal cohesion and coherence that makes it possible to analyze it as an object. Nevertheless, this same object remains problematic when it is pragmatically activated by the act of reading. In fact, like the T.A.R.D.I.S., the unique space-time vehicle used by the well-known BBC character Doctor Who in his adventures, every text appears to its own readers not only "bigger inside than outside" but also offering spaces that change according to the different traveller standing in it. In a few words, this singular condition raises questions about the gnosiological relation between text and reader. How can a text be considered the 'same', even if it can be read in different ways by different subjects? How can readers be previously provided with the knowledge required for 'understanding' a text and, at the same time, learn something more from it? In order to explain this singular condition, it seems useful to start thinking about text as a device rather than an object. In other words, this unique status is more clearly understandable when 'text' ceases to be considered as a box designed to move meaning from a sender to a recipient (marking the semiotic priority of the "code") and starts to be recognized as a performative meaning hypothesis, discursively configured by one or more forms and empirically perceivable by means of one or more substances. Thus, a text appears as a "semantic hanger", potentially offered to the "unending deferral of the interpretant", and from time to time fixed as an "instance of Discourse". In this perspective, every reading can be considered as an answer to the continuous request for confirming or denying the meaning configuration (the meaning hypothesis) expressed by the text. Finally, 'attestation' is exactly what regulates this dynamic of request and answer, through which the reader is able to confirm his previous hypotheses on reality or acquire new ones.

Keywords: attestation, meaning, reader, text

Procedia PDF Downloads 236
2460 Simplifying Seismic Vulnerability Analysis for Existing Reinforced Concrete Buildings

Authors: Maryam Solgi, Behzad Shahmohammadi, Morteza Raissi Dehkordi

Abstract:

One of the main steps in the seismic retrofitting of buildings is to determine the vulnerability of the structure. Current procedures for evaluating existing buildings, however, are complicated and draw no distinction between short, mid-rise, and tall buildings. This research utilizes a simplified assessment method that is adequate for existing reinforced concrete buildings. To this end, the Simple Lateral Mechanisms Analysis (SLaMA) procedure proposed by the NZSEE (New Zealand Society for Earthquake Engineering) has been carried out. In this study, three RC moment-resisting frame buildings are examined. First, these buildings are evaluated by the inelastic static (pushover) procedure based on acceptance criteria. Then, the Park-Ang damage index is determined for all members of each building by inelastic time-history analysis. Next, the Simple Lateral Mechanisms Analysis procedure, a hand-calculation method, is carried out to define the capacity of the structures. Ultimately, the procedures are compared in terms of the peak ground acceleration causing failure (PGAfail). The results of this comparison show that the pushover procedure and the SLaMA method yield a greater value of PGAfail than the Park-Ang damage model.
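
For reference, a minimal sketch of the Park-Ang damage index in its commonly cited form, DI = delta_max/delta_ult + beta*E_h/(F_y*delta_ult), is given below; the member response values are illustrative assumptions, not results from the three study buildings.

```python
def park_ang_damage_index(delta_max, delta_ult, hysteretic_energy, f_yield, beta=0.1):
    """Park-Ang damage index in its commonly cited form:
    DI = delta_max / delta_ult + beta * E_h / (F_y * delta_ult).
    DI below ~0.4 is often read as repairable damage and DI >= 1.0 as collapse."""
    return delta_max / delta_ult + beta * hysteretic_energy / (f_yield * delta_ult)

# Illustrative member response values (assumed, not from the study buildings).
di = park_ang_damage_index(delta_max=0.035, delta_ult=0.060,
                           hysteretic_energy=12.0e3, f_yield=250.0e3, beta=0.1)
print(f"Park-Ang damage index: {di:.2f}")
```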

Keywords: peak ground acceleration caused to fail, reinforced concrete moment-frame buildings, seismic vulnerability analysis, simple lateral mechanisms analysis

Procedia PDF Downloads 91
2459 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area

Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras

Abstract:

Along with shear wave velocity, many soil parameters can be associated with the standard penetration test (SPT), a dynamic in situ experiment. SPT-N data and geophysical data often do not exist in the same area. Statistical analysis of the correlation between these parameters is an alternative method to estimate Vₛ conveniently and without additional investigations or data acquisition. Shear wave velocity is a basic engineering quantity required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data, such as standard penetration test (SPT) N values or cone penetration test (CPT) values, to estimate shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is derived from the collected data and compared with previously suggested formulas for determining Vₛ by measuring the root mean square error (RMSE) of each model. The Algiers area is situated in a high seismic zone (Zone III of RPA 2003, the règlement parasismique algérien), so the study is important for this region. The principal aim of this paper is to compare the field measurements of the down-hole test with the empirical models to show which of the proposed formulas is applicable for predicting and deducing shear wave velocity values.
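
As a hedged sketch of the correlation-fitting and RMSE comparison described above, the snippet below fits an empirical relation of the common power-law form Vₛ = a·Nᵇ and scores it; the (N, Vₛ) pairs are synthetic placeholders, not the Algiers down-hole data.

```python
import numpy as np

# Placeholder (SPT-N, Vs) pairs standing in for the collected site data.
spt_n = np.array([5, 8, 12, 18, 25, 33, 42, 50], dtype=float)
vs_obs = np.array([140, 170, 205, 240, 275, 305, 335, 355], dtype=float)  # m/s

# Linearize: log(Vs) = log(a) + b * log(N), then fit with least squares.
b, log_a = np.polyfit(np.log(spt_n), np.log(vs_obs), deg=1)
a = np.exp(log_a)
vs_pred = a * spt_n ** b

rmse = np.sqrt(np.mean((vs_obs - vs_pred) ** 2))
print(f"fitted correlation: Vs = {a:.1f} * N^{b:.2f}  (RMSE = {rmse:.1f} m/s)")
```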

Keywords: empirical models, RMSE, shear wave velocity, standard penetration test

Procedia PDF Downloads 337
2458 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysing the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come at a cost. Little research has been undertaken on how to optimally specify what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the attributes and classification required to take manufacturing digital data from various sources and determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision-making on which datasets should be processed at the 'edge' and what should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
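
A hedged sketch of a fast-and-frugal placement heuristic of the kind described above is given below: cues are checked in a fixed order and the first discriminating cue decides; the cue names and thresholds are illustrative assumptions, not the framework's actual attribute set.

```python
def placement(dataset):
    """Decide where to process a dataset using ordered cues (fast-and-frugal style)."""
    if dataset["latency_critical"]:          # cue 1: control-loop or safety-related data
        return "edge"
    if dataset["size_mb_per_min"] > 100:     # cue 2: too costly to stream raw
        return "edge (aggregate, then forward summary)"
    if dataset["retention_days"] > 30:       # cue 3: long-term analytics / history
        return "cloud"
    return "cloud"                           # default: remote server

examples = [
    {"name": "spindle vibration",  "latency_critical": True,  "size_mb_per_min": 250, "retention_days": 7},
    {"name": "axis temperatures",  "latency_critical": False, "size_mb_per_min": 2,   "retention_days": 365},
    {"name": "tool-change events", "latency_critical": False, "size_mb_per_min": 0.1, "retention_days": 14},
]
for d in examples:
    print(f"{d['name']:>18}: {placement(d)}")
```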

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 176
2457 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to further suppress noise while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
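
As a hedged illustration of the wavelet step described above, the sketch below decomposes a synthetic decay curve, soft-thresholds the detail coefficients, and reconstructs the signal; the curve, the 'db4' wavelet, and the universal threshold are assumptions, not the FASTSNAP processing chain.

```python
import numpy as np
import pywt

# Synthetic smooth decay standing in for a stacked (signal-averaged) TEM transient,
# contaminated with Gaussian noise.
t = np.linspace(0.0, 10e-3, 2048)
clean = np.exp(-t / 2e-3)
noisy = clean + 0.05 * np.random.default_rng(1).standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745               # noise estimate from finest scale
thresh = sigma * np.sqrt(2.0 * np.log(noisy.size))           # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

print(f"residual noise std: before {np.std(noisy - clean):.4f}, after {np.std(denoised - clean):.4f}")
```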

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 83
2456 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient encoding and decoding algorithms, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes, in contrast, can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are removing noise from signals, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. In the article, we propose two constructions of robust codes. The first class of robust codes is based on the multiplicative inverse in a finite field. In the second robust code construction, the redundancy part is the cube of the information part. This paper also investigates the characteristics of the proposed robust and linear codes.
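
A hedged, minimal sketch of the second robust-code construction is shown below: the redundancy appended to an information word x is its cube computed in a small finite field; the field GF(2^4) and its primitive polynomial are illustrative choices, not the parameters used in the paper.

```python
POLY = 0b10011   # primitive polynomial x^4 + x + 1 for GF(2^4)

def gf16_mul(a, b):
    """Carry-less multiplication in GF(2^4) with reduction by POLY."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:          # degree reached 4 -> reduce modulo the polynomial
            a ^= POLY
        b >>= 1
    return r

def encode(x):
    """Codeword = (information nibble, redundancy nibble = x^3 in GF(2^4))."""
    return x, gf16_mul(x, gf16_mul(x, x))

def check(x, r):
    return gf16_mul(x, gf16_mul(x, x)) == r

info, red = encode(0b1011)
print(f"codeword: info={info:04b}, redundancy={red:04b}")
# An additive error e on the information part is detected unless (x ^ e)^3 happens to
# equal the stored redundancy, which only occurs for a small fraction of codewords.
print("error 0b0001 detected:", not check(info ^ 0b0001, red))
```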

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 488
2455 The Relationship between Coping Styles and Internet Addiction among High School Students

Authors: Adil Kaval, Digdem Muge Siyez

Abstract:

Given the negative effects that internet use can have on a person's life, internet use has become an issue, mostly studied and investigated under the heading of internet addiction. In the literature, it is noteworthy that several theoretical models have been proposed to explain the reasons for internet addiction. In addition to these theoretical models, the style of coping with stressful events may be a predictor of internet addiction. This study aimed to test, with logistic regression, the effect of high school students' coping styles on internet addiction levels. The sample consisted of 770 Turkish adolescents (471 girls, 299 boys) selected from high schools in the 2017-2018 academic year in İzmir province. The Internet Addiction Test, the Coping Scale for Child and Adolescents, and a demographic information form were used in this study. The results of the logistic regression analysis indicated that the model of coping styles provides a statistically significant prediction of internet addiction. Gender does not predict whether or not a student is addicted to the internet. The active coping style has no effect on internet addiction levels, while the avoiding and negative coping styles do. With this model, 79.1% of internet addiction cases among high school students are correctly estimated. The Nagelkerke pseudo-R² indicated that the model accounted for 35% of the total variance. The results of this study on Turkish adolescents are similar to the results of other studies in the literature. It can be argued that avoiding and negative coping styles are important risk factors in the development of internet addiction.
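
A hedged sketch of the logistic regression analysis, including the Nagelkerke pseudo-R² reported above, is given below; the coping-style scores and addiction labels are simulated placeholders, not the 770-adolescent dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated coping-style scores (active, avoiding, negative) and addiction labels.
rng = np.random.default_rng(0)
n = 770
X = rng.normal(size=(n, 3))
logit = -1.2 + 0.1 * X[:, 0] + 0.9 * X[:, 1] + 0.8 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))      # 1 = at risk of internet addiction

model = LogisticRegression().fit(X, y)

def log_likelihood(p, y):
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

ll_full = log_likelihood(model.predict_proba(X)[:, 1], y)
ll_null = log_likelihood(np.full(n, y.mean()), y)
cox_snell = 1 - np.exp(2 * (ll_null - ll_full) / n)
nagelkerke = cox_snell / (1 - np.exp(2 * ll_null / n))

print("coefficients (active, avoiding, negative):", np.round(model.coef_[0], 2))
print(f"classification accuracy: {model.score(X, y):.3f}")
print(f"Nagelkerke pseudo-R^2: {nagelkerke:.3f}")
```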

Keywords: adolescents, coping, internet addiction, regression analysis

Procedia PDF Downloads 172
2454 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features

Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis

Abstract:

Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly because it is diagnosed via polysomnography, a time- and resource-intensive procedure. Screening the disease's symptoms at home could be used as an alternative approach to alert individuals who potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool and herein presents the approach and selection of specific sound features that discriminate snoring from environmental sounds, as well as the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) is built upon the snore detection tool and employed on whole-night sleep sound recordings, resulting in a large dataset of snoring sound excerpts that is made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight into an individual's sleep quality or as an independent component of OSAHS screening applications in future developments.
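
A hedged sketch of a feature-based snore/non-snore classifier in the spirit of the tool described above is given below; the MFCC features, window length, and network size are assumptions, and the training data are placeholders for real labelled recordings.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def excerpt_features(y, sr):
    """Summarize a short sound excerpt by the mean and std of its MFCCs."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)       # shape (13, n_frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Placeholder training matrix: in practice each row would be
# excerpt_features(*librosa.load(path, sr=16000, duration=1.0)) for a labelled excerpt.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 26))               # stand-in feature vectors (13 means + 13 stds)
y = rng.integers(0, 2, size=200)             # 1 = snore, 0 = environmental sound

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, y)
print(f"training accuracy on placeholder data: {clf.score(X, y):.2f}")

# Real-time use: slide a one-second window over the microphone stream and call
# clf.predict(excerpt_features(window, sr).reshape(1, -1)) for each hop.
```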

Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks

Procedia PDF Downloads 207
2453 Design and Optimization of a Mini High Altitude Long Endurance (HALE) Multi-Role Unmanned Aerial Vehicle

Authors: Vishaal Subramanian, Annuatha Vinod Kumar, Santosh Kumar Budankayala, M. Senthil Kumar

Abstract:

This paper discusses the aerodynamic and structural design, simulation, and optimization of a mini High Altitude Long Endurance (HALE) UAV. The applications of this mini HALE UAV range from aerial topological surveys, quick first aid supply, emergency medical blood transport, and search and relief activities to border patrol, surveillance, and estimation of forest fire progression. Although classified as a mini UAV according to UVS International, our design is an amalgamation of the features of the 'mini' and 'HALE' categories, combining the light weight of the 'mini' with the high altitude ceiling and endurance of the HALE. Designed with the idea of implementation in India, it is in strict compliance with the UAS rules proposed by the office of the Director General of Civil Aviation. The plane can be completely automated or have partial override control and is equipped with an infrared camera and a multi-coloured camera with on-board storage or live telemetry, and a GPS system with geo-fencing and fail-safe measures. An additional 1.5 kg payload can be attached to three major hard points on the aircraft and can comprise delicate equipment or releasable payloads. The paper details the design, optimization process, and the simulations performed using various software packages such as Design Foil, XFLR5, SolidWorks, and Ansys.

Keywords: aircraft, endurance, HALE, high altitude, long range, UAV, unmanned aerial vehicle

Procedia PDF Downloads 395
2452 Optimal 3D Deployment and Path Planning of Multiple UAVs for Maximum Coverage and Autonomy

Authors: Indu Chandran, Shubham Sharma, Rohan Mehta, Vipin Kizheppatt

Abstract:

Unmanned aerial vehicles are increasingly being explored as the most promising solution to disaster monitoring, assessment, and recovery. Current relief operations heavily rely on intelligent robot swarms to capture the damage caused, provide timely rescue, and create road maps for the victims. To perform these time-critical missions, efficient path planning that ensures quick coverage of the area is vital. This study aims to develop a technically balanced approach that provides maximum coverage of the affected area in minimum time using the optimal number of UAVs. A coverage trajectory is designed through area decomposition and task assignment. To perform an efficient and autonomous coverage mission, a solution to a TSP-based optimization problem using meta-heuristic approaches is designed to allocate waypoints to UAVs of different flight capacities. The study exploits multi-agent simulations with PX4-SITL and QGroundControl through the ROS framework and visualizes the dynamics of UAV deployment along different search paths in a 3D Gazebo environment. Through detailed theoretical analysis and simulation tests, we illustrate the optimality and efficiency of the proposed methodologies.
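
As a hedged, single-UAV reduction of the waypoint-allocation step described above, the sketch below builds a nearest-neighbour tour over decomposed-cell waypoints and improves it with 2-opt, a simple stand-in for the meta-heuristic solver used in the study; the waypoints are random placeholders, not a decomposed disaster area.

```python
import math, random

random.seed(1)
waypoints = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(25)]
dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
tour_length = lambda tour: sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

# Nearest-neighbour construction of an initial coverage tour.
tour, rest = [waypoints[0]], waypoints[1:]
while rest:
    nxt = min(rest, key=lambda p: dist(tour[-1], p))
    tour.append(nxt)
    rest.remove(nxt)

# 2-opt improvement: reverse segments while the tour keeps getting shorter.
improved = True
while improved:
    improved = False
    for i in range(1, len(tour) - 1):
        for j in range(i + 1, len(tour)):
            candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(candidate) < tour_length(tour) - 1e-9:
                tour, improved = candidate, True

print(f"coverage tour over {len(tour)} waypoints, length = {tour_length(tour):.0f} m")
```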

Keywords: area coverage, coverage path planning, heuristic algorithm, mission monitoring, optimization, task assignment, unmanned aerial vehicles

Procedia PDF Downloads 213
2451 Numerical Simulation of Transient 3D Temperature and Kerf Formation in Laser Fusion Cutting

Authors: Karim Kheloufi, El Hachemi Amara

Abstract:

In the present study, a three-dimensional transient numerical model was developed to study the temperature field and cutting kerf shape during laser fusion cutting. A finite volume model has been constructed based on the Navier–Stokes equations and the energy conservation equation for the description of momentum and heat transport phenomena, and the Volume of Fluid (VOF) method for free-surface tracking. The Fresnel absorption model is used to handle the absorption of the incident wave by the surface of the liquid metal, and the enthalpy-porosity technique is employed to account for the latent heat during melting and solidification of the material. To model the physical phenomena occurring at the liquid film/gas interface, including momentum and heat transfer, a new approach is proposed which consists of treating the friction force, the pressure force applied by the gas jet, and the heat absorbed by the cutting front surface as source terms incorporated into the governing equations. All of these physics are coupled and solved simultaneously in the Fluent® CFD code. The main objective of using a transient phase-change model in the current case is to simulate the dynamics and geometry of a growing laser-cutting-generated kerf until it becomes fully developed. The model is used to investigate the effect of some process parameters on the temperature fields and the formed kerf geometry.
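
For reference, a commonly used textbook form of the enthalpy-porosity momentum sink in the mushy zone (as implemented in Fluent-type solvers) is sketched below; the mushy-zone constant A_mush and the small number ε are solver parameters, and this generic form is an assumption, not necessarily the exact expression used in the paper.

```latex
S_i = -\,\frac{(1-\lambda)^2}{\lambda^3 + \epsilon}\, A_{\mathrm{mush}}\, u_i ,
\qquad
\lambda =
\begin{cases}
0, & T < T_{\mathrm{sol}},\\
\dfrac{T - T_{\mathrm{sol}}}{T_{\mathrm{liq}} - T_{\mathrm{sol}}}, & T_{\mathrm{sol}} \le T \le T_{\mathrm{liq}},\\
1, & T > T_{\mathrm{liq}}.
\end{cases}
```

Here λ is the local liquid fraction, so the sink term S_i drives the velocity u_i to zero in fully solid cells and vanishes in fully liquid ones.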

Keywords: laser cutting, numerical simulation, heat transfer, fluid flow

Procedia PDF Downloads 338
2450 Revealing the Potential of Geotourism and Geoheritage of Gedangsari Area, Yogyakarta

Authors: Cecilia Jatu, Adventino

Abstract:

Gedangsari is located in Gunungkidul, Yogyakarta Province, and meets several criteria for designation as a new geosite. The research area lies in the southern mountain zone of Java and is composed of five rock formations of Oligocene to Middle Miocene age. The purpose of this study is to reveal the geotourism and geoheritage potential to be proposed as a new geosite and to produce a geosite map of Gedangsari. The research method is descriptive data collection, which includes quantitative geological data, geotourism, and heritage sites, supported by petrographic analysis, structural geology, geological mapping, and SWOT analysis. The geological data show that Gedangsari consists of igneous (intrusive), pyroclastic, and sedimentary rocks, a condition that gives rise to a wide variety of distinctive geomorphological features. Geotourism sites in Gedangsari include Luweng Sampang Canyon, the Gedangsari Bouma Sequence, the Watugajah Columnar Joint, the Gedangsari Marine Fan Sediment, and Tegalrejo Waterfall. There is also Tegalrejo Village, which can be considered a geoheritage site because of its culture and traditional batik cloth. According to the SWOT analysis, the Gedangsari geosite must be developed and appropriately promoted in order to increase its visibility. The development of the geosite area will have a significant impact on the economic growth of the surrounding community and can be used by the government as baseline information for sustainable development. In addition, an educational map of the geological conditions and geotourism locations of the Gedangsari geosite can increase people's knowledge of the area.

Keywords: Gedangsari, geoheritage, geotourism, geosite

Procedia PDF Downloads 120
2449 High Piezoelectric and Magnetic Performance Achieved in the Lead-Free BiFeO3-BaTiO3 Ceramics by Defect Engineering

Authors: Muhammad Habib, Xuefan Zhou, Lin Tang, Guoliang Xue, Fazli Akram, Dou Zhang

Abstract:

Defect engineering is a well-established approach for the customization of the functional properties of perovskite ceramics. In modern technology, high multiferroic properties for elevated-temperature applications are in great demand. In this work, the Bi-nonstoichiometric lead-free 0.67Biy-xSmxFeO3-0.33BaTiO3 ceramics (Sm-doped BF-BT with Bi-excess, y = 1.03, and Bi-deficient, y = 0.975, compositions with x = 0.00, 0.04 and 0.08) were designed for high-temperature multiferroic properties. Enhanced piezoelectric (d33 ≈ 250 pC/N and d33* ≈ 350 pm/V) and magnetic properties (Mr ≈ 0.25 emu/g) with a high Curie temperature (TC ≈ 465 ℃) were obtained in the Bi-deficient pure BF-BT ceramics. With Sm doping (x = 0.04), TC decreased to 350 ℃, while a significant improvement occurred in d33*, reaching 504 pm/V and 450 pm/V for the Bi-excess and Bi-deficient compositions, respectively. The structural origin of the enhanced piezoelectric strain performance is related to the soft ferroelectric effect of Sm doping and a reversible phase transition from the short-range relaxor ferroelectric state to long-range order under the applied electric field. However, only a slight change occurs in Mr (≈ 0.28 emu/g) with Sm doping for the Bi-deficient ceramics, whereas the Bi-excess ceramics show completely paramagnetic behavior. Hence, the origin of the high magnetic properties in the Bi-deficient BF-BT ceramics is mainly attributed to the proposed double exchange mechanism. We believe that this strategy will provide a new perspective for the development of lead-free multiferroic ceramics for high-temperature applications.

Keywords: BiFeO3-BaTiO3, lead-free piezoceramics, magnetic properties, defect engineering

Procedia PDF Downloads 132
2448 Evaluation of the Power Generation Effect Obtained by Inserting a Piezoelectric Sheet in the Backlash Clearance of a Circular Arc Helical Gear

Authors: Barenten Suciu, Yuya Nakamoto

Abstract:

The power generation effect obtained by inserting a piezoelectric sheet in the backlash clearance of a circular arc helical gear is evaluated. This type of screw gear is preferred since, in comparison with the involute tooth profile, the circular arc profile leads to reduced stress-concentration effects and improved life of the piezoelectric film. Firstly, the geometry of the circular arc helical gear and the properties of the piezoelectric sheet are presented. Then, a description of the test rig, consisting of a right-hand thread gear meshing with a left-hand thread gear, and of the voltage measurement procedure is given. After creating the three-dimensional (3D) model of the meshing gears in SolidWorks, they are 3D-printed in acrylonitrile butadiene styrene (ABS) resin. The variation of the generated voltage versus time during a meshing cycle of the circular arc helical gear is measured for various values of the center distance. Then, the change of the maximal, minimal, and peak-to-peak voltage versus the center distance is illustrated. The optimal center distance of the gear, at which the voltage is maximized, is found, and its significance is discussed. These results prove that the contact pressure of the meshing gears can be measured and that electrical power can be generated by employing the proposed technique.

Keywords: circular arc helical gear, contact problem, optimal center distance, piezoelectric sheet, power generation

Procedia PDF Downloads 166
2447 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser

Authors: Guanqiao Wang, Hongyang Yu

Abstract:

There is a lot of repetitive work in the traditional construction industry, and replacing these manual tasks with robots can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are very high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for wall plastering, so the requirements for construction positioning accuracy are higher, while the traditional navigation and positioning methods have a large error; without the exact position, the wall cannot be plastered, or the plastering error is large. A new positioning method is proposed, which is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning result. In actual work, filtering, edge detection, the Hough transform, and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the reference value, and the robot is translated or rotated to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method.
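
A hedged sketch of the image-processing chain named above (filtering, edge detection, Hough transform) is shown below, locating a projected laser stripe in a synthetic frame and turning its offset into a lateral correction; the pixel-to-millimetre scale and reference position are assumed calibration values, not the robot's.

```python
import cv2
import numpy as np

# Synthetic camera frame with a bright, nearly vertical laser stripe drawn into it.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.line(frame, (150, 0), (165, 479), (255, 255, 255), 3)

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)                # filtering step
edges = cv2.Canny(blurred, 50, 150)                        # edge detection
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=200, maxLineGap=10)  # probabilistic Hough transform

x1, y1, x2, y2 = max(lines[:, 0], key=lambda l: abs(l[3] - l[1]))  # longest near-vertical segment
line_x = (x1 + x2) / 2.0                                   # detected stripe position (pixels)

REFERENCE_X, MM_PER_PX = 160.0, 0.8                        # assumed calibration values
correction_mm = (REFERENCE_X - line_x) * MM_PER_PX
print(f"laser line at x = {line_x:.1f} px -> move robot {correction_mm:+.1f} mm laterally")
```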

Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing

Procedia PDF Downloads 146
2446 An Application of Vector Error Correction Model to Assess Financial Innovation Impact on Economic Growth of Bangladesh

Authors: Md. Qamruzzaman, Wei Jianguo

Abstract:

Over the past decade, it has been observed that financial development, through financial innovation, not only accelerates the development of an efficient and effective financial system but also acts as a catalyst in the economic development process. In this study, we explore how financial innovation causes economic growth in Bangladesh by using a Vector Error Correction Model (VECM) for the period 1990-2014. A cointegration test confirms the existence of a long-run association between financial innovation and economic growth. To investigate directional causality, we apply the Granger causality test; the estimates show that long-run growth is affected by capital flow from non-bank financial institutions and by inflation in the economy, but changes in the growth rate have no long-run impact on capital flow or on the level of inflation. Growth and market capitalization, as well as market capitalization and capital flow, confirm the feedback hypothesis. Variance decomposition suggests that any innovation in the financial sector can cause GDP fluctuations in both the long run and the short run. Financial innovation, by promoting efficiency and lowering the cost of financial transactions, can boost the economic development process. The study proposes two policy recommendations for further development. First, an innovation-friendly financial policy should be formulated to encourage the adoption and diffusion of financial innovation in the financial system. Second, the operation of the financial and capital markets should be regulated through the implementation of rules and regulations to create a conducive environment.
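
A hedged sketch of a VECM estimation of the kind described above is given below, using statsmodels; the series are simulated stand-ins for growth, capital flow, market capitalization, and inflation over 1990-2014, and the lag order and deterministic terms are assumptions rather than the paper's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Simulated annual series sharing a common stochastic trend (so cointegration exists).
rng = np.random.default_rng(0)
n = 25                                                     # 1990-2014
common_trend = np.cumsum(rng.normal(size=n))
data = pd.DataFrame({
    "gdp_growth":   common_trend + rng.normal(scale=0.5, size=n),
    "capital_flow": 0.8 * common_trend + rng.normal(scale=0.5, size=n),
    "market_cap":   1.2 * common_trend + rng.normal(scale=0.5, size=n),
    "inflation":    rng.normal(scale=0.5, size=n).cumsum(),
}, index=pd.period_range("1990", periods=n, freq="Y"))

rank = select_coint_rank(data, det_order=0, k_ar_diff=1)   # Johansen-based rank selection
model = VECM(data, k_ar_diff=1, coint_rank=max(rank.rank, 1), deterministic="co")
result = model.fit()
print("selected cointegration rank:", rank.rank)
print("loading (alpha) matrix:\n", np.round(result.alpha, 3))
```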

Keywords: financial innovation, economic growth, GDP, financial institution, VECM

Procedia PDF Downloads 269
2445 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but happening at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, are generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that might contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real-time to perform forecasting and classification tasks to aid asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials is planned for this model in the future in other manufacturing industries.
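
A hedged, minimal sketch of the graph-based forecasting idea is given below: sensors are nodes, each node carries a short window of recent readings, and a hand-rolled graph convolution (H' = ReLU(Â·H·W)) propagates information between connected sensors before a linear head predicts the next reading per node; the four-sensor graph, window length, and training step are assumptions, and the entropy and spectral features of the actual model are omitted.

```python
import torch
import torch.nn as nn

N_SENSORS, WINDOW = 4, 12
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 0]], dtype=torch.float32)
a_hat = adj + torch.eye(N_SENSORS)                       # add self-loops
d_inv_sqrt = torch.diag(a_hat.sum(1).rsqrt())
a_hat = d_inv_sqrt @ a_hat @ d_inv_sqrt                  # symmetric normalization

class SensorGraphForecaster(nn.Module):
    def __init__(self, window, hidden=32):
        super().__init__()
        self.w1 = nn.Linear(window, hidden)
        self.w2 = nn.Linear(hidden, hidden)
        self.head = nn.Linear(hidden, 1)                 # next-step value per sensor

    def forward(self, h):                                # h: (n_sensors, window)
        h = torch.relu(a_hat @ self.w1(h))               # graph convolution layer 1
        h = torch.relu(a_hat @ self.w2(h))               # graph convolution layer 2
        return self.head(h).squeeze(-1)                  # (n_sensors,)

model = SensorGraphForecaster(WINDOW)
history = torch.randn(N_SENSORS, WINDOW)                 # stand-in for recent flow readings
target = torch.randn(N_SENSORS)                          # next observed values
loss = nn.functional.mse_loss(model(history), target)
loss.backward()                                          # one optimisation step (optimizer omitted)
print("training loss:", loss.item())
```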

Keywords: GNN, Entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, Machine Learning

Procedia PDF Downloads 149
2444 Direct Measurement of Pressure and Temperature Variations During High-Speed Friction Experiments

Authors: Simon Guerin-Marthe, Marie Violay

Abstract:

Thermal pressurization (TP) has been proposed as a key mechanism involved in the weakening of faults during dynamic ruptures. Theoretical and numerical studies clearly show how frictional heating can lead to an increase in pore fluid pressure due to the rapid slip along faults that occurs during earthquakes. In addition, recent laboratory studies have evidenced local pore pressure or local temperature variations during rotary shear tests that are consistent with theoretical and numerical TP models. The aim of this study is to complement previous ones by measuring both local pore pressure and local temperature variations in the vicinity of a water-saturated calcite gouge layer subjected to a controlled slip velocity in a direct double-shear configuration. Laboratory investigation of the TP process is crucial in order to understand the conditions under which it is likely to become a dominant mechanism controlling dynamic friction. It is also important for understanding the timing and magnitude of temperature and pore pressure variations, for identifying when TP is negligible, and for understanding how it competes with strengthening mechanisms such as dilatancy, which can occur during rock failure. Here we present unique direct measurements of temperature and pressure variations during high-speed friction experiments under various load point velocities and show the timing of these variations relative to the slip event.

Keywords: thermal pressurization, double-shear test, high-speed friction, dilatancy

Procedia PDF Downloads 60
2443 Impact of Lifestyle and User Expectations on the Demand of Compact Living Spaces in the Home Interiors in Indian Cities

Authors: Velly Kapadia, Reenu Singh

Abstract:

This report identifies the long-term driving forces behind urbanization and the impact of compact living on both society and the home, and proposes a concept to create smarter and more sustainable homes. Compact living has been trending across India as a sustainable housing solution, and the reality is that India is currently facing a housing shortage in urban areas of around 10 million units. With the rising demand for housing, urban land prices and the cost of homes have been rising. The paper explores how and why the interior design of homes can be improved to relieve the housing demand in an environmentally, socially, and economically sustainable manner. A questionnaire survey was conducted to determine living patterns, area requirements, ecological footprints, energy consumption, purchasing patterns, and various pro-environmental behaviors of people who downsize to compact homes. Quantitative research explores sustainable material choices, durability, functionality, cost, and reusability of furniture. Besides addressing the need for smart, sustainably designed compact homes, a conceptual model is proposed, including options for ideal schematic layouts for homes in urban areas. In the conclusions, suggestions for improving space planning and suitable interior elements are made to support the view that compact homes are an eminently practical and sensible solution for the urban citizen.

Keywords: compact living, housing shortage, lifestyle, sustainable interior design

Procedia PDF Downloads 201
2442 Productivity-Emotiveness Model of School Students’ Capacity Levels

Authors: Ivan Samokhin

Abstract:

A new two-factor model of school students' capacity levels is proposed. It considers the academic productivity and emotional condition of children taking part in the study process. Each basic level reflects the correlation of these two factors. The teacher decides whether the required result is achieved or not and writes down the grade (from 'A' to 'F') in the register. During the term, the teacher can estimate the students' progress at any interval, but it is not desirable to exceed a two-week period (with primary school being an exception). Each boy or girl should have a special notebook to record the emotions they feel while studying a subject. The children can make their notes the way they like – for example, using a ten-point scale or a short verbal description. It is recommended to record the emotions twice a day: after the lesson and after doing the homework. Before the students start doing this, they should be instructed by a school psychologist, who has to emphasize that the relevant attitude is toward the subject, not toward the person in charge of it. At the end of the term, the notebooks are given to the teacher, who is now able to make preliminary conclusions about the academic results and psychological comfort of each student. If necessary, some pedagogical measures can be taken. The data about a student's supposed capacity level are available to the teacher and the school administration. In certain cases, this information can also be revealed to the student's parents, while the student learns it only after receiving a school-leaving certificate (until that moment, the results are not considered final). Then a person may take these data into consideration when choosing his or her future area of higher education. We single out four main capacity levels: 'nominally low', 'inclination', 'ability' and 'gift'.

Keywords: academic productivity, capacity level, emotional condition, school students

Procedia PDF Downloads 224
2441 A Location-based Authentication and Key Management Scheme for Border Surveillance Wireless Sensor Networks

Authors: Walid Abdallah, Noureddine Boudriga

Abstract:

Wireless sensor networks have shown their effectiveness in the deployment of many critical applications, especially in the military domain. Border surveillance is one of these applications, where a set of wireless sensors is deployed along a country's border line to detect illegal intrusion attempts into the national territory and report them to a control center so that the necessary measures can be taken. Given its nature, this wireless sensor network can be the target of many security attacks trying to compromise its normal operation. In particular, in this application the deployment and location of sensor nodes are of great importance for detecting and tracking intruders. This paper proposes a location-based authentication and key distribution mechanism to secure wireless sensor networks intended for border surveillance, where key establishment is performed using elliptic curve cryptography and an identity-based public key scheme. In this scheme, the public key of each sensor node is authenticated by keys that depend on its position in the monitored area. Before establishing a pairwise key between two nodes, each of them must verify the neighborhood location of the other node using a message authentication code (MAC) calculated on the corresponding public key and on keys derived from encrypted beacon messages broadcast by anchor nodes. We show that our proposed public key authentication and key distribution scheme is more resilient to node capture and node replication attacks than currently available schemes. Also, the key distribution between nodes in our scheme generates less communication overhead and hence increases network performance.
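
A hedged sketch of the location-bound authentication idea is given below, using standard ECDH from the cryptography library in place of the paper's identity-based scheme: each node's public key is authenticated with a MAC keyed by material tied to its claimed position (here simply a hash of an anchor beacon and a grid cell, an illustrative assumption), after which a pairwise key is derived from the elliptic-curve exchange.

```python
import hashlib, hmac
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def node():
    priv = ec.generate_private_key(ec.SECP256R1())
    return priv, priv.public_key()

priv_a, pub_a = node()
priv_b, pub_b = node()

def location_key(anchor_beacon: bytes, grid_cell: str) -> bytes:
    # Stand-in for keys derived from encrypted anchor beacons in the paper.
    return hashlib.sha256(anchor_beacon + grid_cell.encode()).digest()

def location_mac(pubkey, loc_key: bytes) -> bytes:
    raw = pubkey.public_bytes(serialization.Encoding.X962,
                              serialization.PublicFormat.UncompressedPoint)
    return hmac.new(loc_key, raw, hashlib.sha256).digest()

beacon = b"anchor-17-epoch-42"
k_loc = location_key(beacon, "cell(12,7)")
mac_a = location_mac(pub_a, k_loc)                               # sent by A with its public key
assert hmac.compare_digest(mac_a, location_mac(pub_a, k_loc))    # B verifies A's claimed location

# Once both MACs verify, the pairwise key is derived from the ECDH shared secret.
shared = priv_b.exchange(ec.ECDH(), pub_a)
pairwise = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                info=b"border-wsn pairwise").derive(shared)
print("pairwise key established:", pairwise.hex())
```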

Keywords: wireless sensor networks, border surveillance, security, key distribution, location-based

Procedia PDF Downloads 658
2440 A Hybrid Genetic Algorithm and Neural Network for Wind Profile Estimation

Authors: M. Saiful Islam, M. Mohandes, S. Rehman, S. Badran

Abstract:

The increasing need for wind power is directing us to acquire precise knowledge of wind resources. Methodical investigation of potential locations is required for wind power deployment. High penetration of wind energy into the grid is leading to multi-megawatt installations with huge investment costs. This fact makes it essential to determine appropriate places for wind farm operation. For accurate assessment, a detailed examination of the wind speed profile, relative humidity, temperature, and other geological or atmospheric parameters is required. Among all of these uncertainty factors influencing wind power estimation, vertical extrapolation of wind speed is perhaps the most difficult and critical one. Different approaches have been used for the extrapolation of wind speed to hub height, which are mainly based on the log law, the power law, and various modifications of the two. This paper proposes an Artificial Neural Network (ANN) and Genetic Algorithm (GA) based hybrid model, namely GA-NN, for vertical extrapolation of wind speed. This model is very simple in the sense that it does not require any parametric estimates such as the wind shear coefficient, roughness length, or atmospheric stability, and it is also reliable compared to other methods. The model uses available measured wind speeds at 10 m, 20 m, and 30 m heights to estimate wind speeds up to 100 m. A good agreement is found between measured and estimated wind speeds at 30 m and 40 m, with approximately 3% mean absolute percentage error. Comparisons with a plain ANN and the power law further prove the feasibility of the proposed method.
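
A hedged sketch of a GA-NN hybrid of the kind described above is given below: a small one-hidden-layer network maps wind speeds at 10/20/30 m to the speed at 40 m, and a genetic algorithm evolves the network weights in place of backpropagation; the training data are generated from a power-law profile with noise, as a placeholder for real mast measurements, and the network size and GA settings are assumptions.

```python
import numpy as np

# Synthetic mast data: speeds at 10/20/30 m as inputs, speed at 40 m as target.
rng = np.random.default_rng(0)
alpha = 0.18                                            # assumed shear exponent for the synthetic site
v10 = rng.uniform(3, 12, size=200)
X = np.column_stack([v10, v10 * (20 / 10) ** alpha, v10 * (30 / 10) ** alpha])
y = v10 * (40 / 10) ** alpha + rng.normal(scale=0.15, size=200)

H = 6                                                   # hidden units
n_w = 3 * H + H + H + 1                                 # flattened W1, b1, W2, b2

def predict(w, X):
    W1 = w[:3 * H].reshape(3, H); b1 = w[3 * H:4 * H]
    W2 = w[4 * H:5 * H];          b2 = w[5 * H]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):
    return -np.mean(np.abs((y - predict(w, X)) / y)) * 100   # negative MAPE (%)

pop = rng.normal(scale=0.5, size=(40, n_w))
for _ in range(400):                                    # GA generations
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]             # keep the 10 fittest (elitism)
    children = parents[rng.integers(0, 10, 30)] + rng.normal(scale=0.1, size=(30, n_w))
    pop = np.vstack([parents, children])                # elite parents + mutated offspring

best = pop[np.argmax([fitness(w) for w in pop])]
print(f"GA-NN mean absolute percentage error at 40 m: {-fitness(best):.2f}%")
```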

Keywords: wind profile, vertical extrapolation of wind, genetic algorithm, artificial neural network, hybrid machine learning

Procedia PDF Downloads 488