Search results for: proposed drought severity index
2479 River Offtake Management Using Mathematical Modelling Tool: A Case Study of the Gorai River, Bangladesh
Authors: Sarwat Jahan, Asker Rajin Rahman
Abstract:
Management of offtake of any fluvial river is very sensitive in terms of long-term sustainability where the variation of water flow and sediment transport range are wide enough throughout a hydrological year. The Gorai River is a major distributary of the Ganges River in Bangladesh and is termed as a primary source of fresh water for the South-West part of the country. Every year, significant siltation of the Gorai offtake disconnects it from the Ganges during the dry season. As a result, the socio-economic and environmental condition of the downstream areas has been deteriorating for a few decades. To improve the overall situation of the Gorai offtake and its dependent areas, a study has been conducted by the Institute of Water Modelling, Bangladesh, in 2022. Using the mathematical morphological modeling tool MIKE 21C of DHI Water & Environment, Denmark, simulated results revealed the need for dredging/river training structures for offtake management at the Gorai offtake to ensure significant dry season flow towards the downstream. The dry season flow is found to increase significantly with the proposed river interventions, which also improves the environmental conditions in terms of salinity of the South-West zone of the country. This paper summarizes the primary findings of the analyzed results of the developed mathematical model for improving the existing condition of the Gorai River.Keywords: Gorai river, mathematical modelling, offtake, siltation, salinity
Procedia PDF Downloads 98
2478 A Study of Carbon Emissions during Building Construction
Authors: Jonggeon Lee, Sungho Tae, Sungjoon Suk, Keunhyeok Yang, George Ford, Michael E. Smith, Omidreza Shoghli
Abstract:
In recent years, research to reduce carbon emissions through quantitative assessment of building life cycle carbon emissions has been performed as it relates to the construction industry. However, most research efforts related to building carbon emissions assessment have been focused on evaluation during the operational phase of a building’s life span. Few comprehensive studies of the carbon emissions during a building’s construction phase have been performed. The purpose of this study is to propose an assessment method that quantitatively evaluates the carbon emissions of buildings during the construction phase. The study analysed the amount of carbon emissions produced by 17 construction trades, and selected four construction trades that result in high levels of carbon emissions: reinforced concrete work; sheathing work; foundation work; and form work. Building materials, and construction and transport equipment used for the selected construction trades were identified, and carbon emissions produced by the identified materials and equipment were calculated for these four construction trades. The energy consumption of construction and transport equipment was calculated by analysing fuel efficiency and equipment productivity rates. The combination of the expected levels of carbon emissions associated with the utilization of building materials and construction equipment provides means for estimating the quantity of carbon emissions related to the construction phase of a building’s life cycle. The proposed carbon emissions assessment method was validated by case studies.Keywords: building construction phase, carbon emissions assessment, building life cycle
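To make the assessment arithmetic described above concrete, the short Python sketch below combines material-based emission factors with equipment emissions derived from productivity and fuel-use rates; every quantity, factor, and trade named in it is a hypothetical placeholder, not a value from the study.

```python
# Minimal sketch of a construction-phase carbon accounting step.
# All quantities and emission factors below are hypothetical placeholders.

MATERIAL_FACTORS = {"ready_mix_concrete_m3": 250.0,   # kg CO2e per m3 (assumed)
                    "rebar_ton": 1900.0,              # kg CO2e per ton (assumed)
                    "plywood_form_m2": 5.0}           # kg CO2e per m2 (assumed)

DIESEL_FACTOR = 2.68  # kg CO2e per litre of diesel (assumed)

def material_emissions(quantities):
    """Sum of material quantity x emission factor."""
    return sum(qty * MATERIAL_FACTORS[name] for name, qty in quantities.items())

def equipment_emissions(work_qty, productivity, fuel_per_hour):
    """Equipment emissions from operating hours = work quantity / productivity."""
    hours = work_qty / productivity
    return hours * fuel_per_hour * DIESEL_FACTOR

quantities = {"ready_mix_concrete_m3": 1200, "rebar_ton": 140, "plywood_form_m2": 5200}
total = material_emissions(quantities)
# e.g. a concrete pump placing 1200 m3 at 45 m3/h while burning 18 l/h of diesel
total += equipment_emissions(work_qty=1200, productivity=45, fuel_per_hour=18)
print(f"Construction-phase emissions: {total / 1000:.1f} t CO2e")
```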
Procedia PDF Downloads 751
2477 The Effectiveness of Spatial Planning And Land Use Management Act, 2013 in Fetakgomo Tubatse Local Municipality: Case Study of Apel Nodal Point
Authors: Hlabishi Peter Ntloana
Abstract:
This paper aims to present the effectiveness of the Spatial Planning and Land Use Management Act, 2013, in addressing key spatial challenges in Fetakgomo Tubatse Local Municipality, mainly focusing on Apel nodal point. Spatial Planning and Land Use Management Act, 2013, popularly known as SPLUMA, aimed at addressing emerging and existing spatial planning and land use management challenges in South Africa. There are critical key spatial challenges that are continuously encountered in Apel Nodal Point, which include dispersed rural settlement mainly in a communal settlement. The spatial patterns and rural settlements development patterns are a challenge, and such results in uncoordinated human settlements. The objective of this research paper is to analyze the spatial planning of Apel nodal points and determine the effectiveness of the SPLUMA policy. Key Informant interviews were conducted with 20 participants, and also the municipal Spatial Development Framework was considered to explore more challenges and proposed recommendations. The results divulged that there is a huge gap in addressing spatial planning, mainly in rural areas, and correlation with the findings of the Municipal Spatial Development framework. In conclusion, spatial planning remains a critical dilemma in most rural settlements, and there must be programmes and strategies to balance the effectiveness of spatial planning in urban and rural settlements.Keywords: land use management, rural settlement, spatial development framework, spatial planning
Procedia PDF Downloads 178
2476 Digital Joint Equivalent Channel Hybrid Precoding for Millimeterwave Massive Multiple Input Multiple Output Systems
Authors: Linyu Wang, Mingjun Zhu, Jianhong Xiang, Hanyu Jiang
Abstract:
Aiming at the low spectral efficiency of hybrid precoding (HP) in current millimeter wave (mmWave) massive multiple input multiple output (MIMO) systems, this paper proposes a digital joint equivalent channel hybrid precoding algorithm based on iteration of the digital encoding matrix. First, the objective function is expanded to obtain the relation equation, and the pseudo-inverse iterative function of the analog encoder is derived by using the pseudo-inverse method, which solves the problem of the greatly increased amount of computation caused by the rank deficiency of the digital encoding matrix and reduces the overall complexity of hybrid precoding. Second, the analog coding matrix and the millimeter-wave sparse channel matrix are combined into an equivalent channel, the equivalent channel is subjected to Singular Value Decomposition (SVD) to obtain a digital coding matrix, and the derived pseudo-inverse iterative function is then used to iteratively regenerate the analog encoding matrix. The simulation results show that the proposed algorithm improves the system spectral efficiency by 10~20% compared with other algorithms, and the stability is also improved. Keywords: mmWave, massive MIMO, hybrid precoding, singular value decomposition, equivalent channel
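As a rough illustration of the equivalent-channel/SVD step described in this abstract (not the authors' full algorithm, whose pseudo-inverse iteration is omitted here), the Python sketch below forms the equivalent channel H_eq = H·F_RF, takes its SVD, and uses the leading right singular vectors as the digital precoder; the dimensions and the random channel are assumptions.

```python
import numpy as np

def svd_digital_precoder(H, F_RF, Ns):
    """Digital baseband precoder from the equivalent channel H_eq = H @ F_RF.

    A simplified sketch of the 'equivalent channel + SVD' step described in the
    abstract; the paper's pseudo-inverse iteration for the analog matrix is not
    reproduced here.
    """
    H_eq = H @ F_RF                      # equivalent channel seen by the baseband
    _, _, Vh = np.linalg.svd(H_eq)       # SVD of the equivalent channel
    F_BB = Vh.conj().T[:, :Ns]           # strongest Ns right singular vectors
    # Normalise so the hybrid precoder satisfies the total power constraint
    F_BB *= np.sqrt(Ns) / np.linalg.norm(F_RF @ F_BB, 'fro')
    return F_BB

# Hypothetical dimensions: 64 TX antennas, 16 RX antennas, 4 RF chains, 2 streams
Nt, Nr, Nrf, Ns = 64, 16, 4, 2
H = (np.random.randn(Nr, Nt) + 1j * np.random.randn(Nr, Nt)) / np.sqrt(2)
F_RF = np.exp(1j * 2 * np.pi * np.random.rand(Nt, Nrf)) / np.sqrt(Nt)  # unit-modulus phases
F_BB = svd_digital_precoder(H, F_RF, Ns)
print(F_BB.shape)   # (4, 2)
```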
Procedia PDF Downloads 96
2475 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market by balance sheet indicators and by the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design business model profiles. The result of the analysis is a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; its limitation is the judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models. Keywords: bancassurance, business model, non-life bancassurance, insurance business value drivers
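A minimal sketch of the Ward-clustering step described above, assuming a standardized matrix of balance sheet indicators per company; the data are random stand-ins and the choice of four clusters is arbitrary, not the paper's result.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Rows = life insurance companies, columns = balance sheet indicators
# (e.g. premium growth, expense ratio, share of unit-linked reserves) -- dummy data.
rng = np.random.default_rng(0)
indicators = rng.normal(size=(40, 5))

X = zscore(indicators, axis=0)                   # standardise the indicators
Z = linkage(X, method="ward")                    # Ward's minimum-variance criterion
labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 business models

for k in np.unique(labels):
    profile = X[labels == k].mean(axis=0)
    print(f"business model {k}: mean indicator profile {np.round(profile, 2)}")
```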
Procedia PDF Downloads 299
2474 Critical Factors Affecting the Implementation of Total Quality Management in the Construction Industry in U. A. E.
Authors: Firas Mohamad Al-Sabek
Abstract:
The purpose of the paper is to examine the most critical and important factors affecting the implementation of Total Quality Management (TQM) in the construction industry in the United Arab Emirates. It also examines the project outcome most affected by implementing TQM. A framework was also proposed based on the literature. The method used in this paper is a quantitative study. A survey of 15 questions was created and distributed to a sample of 60 respondents in a construction company in Abu Dhabi to examine the most critical factor affecting the implementation of TQM, in addition to the project outcome most affected by implementing TQM. The survey showed that management commitment is the most important factor in implementing TQM in a construction company. It also showed that project cost is the outcome most affected by the implementation of TQM. Management commitment is very important for implementing TQM in any company; if management loses interest in quality, then everyone in the organization will do so. The success of TQM will depend mostly on the top of the pyramid. Cost is also reduced and money saved when the project team implements TQM, while if no quality measures are present within the team, the project will suffer commercial failure. Based on the literature, more factors can be examined and added to the model. In addition, more construction companies could be surveyed in order to obtain more accurate results. This study could also be conducted outside the United Arab Emirates for further enhancement. Keywords: construction project, total quality management, management commitment, cost, theoretical framework
Procedia PDF Downloads 426
2473 Investigating the Pedestrian Willingness to Pay to Choose Appropriate Policies for Improving the Safety of Pedestrian Facilities
Authors: Babak Mirbaha, Mahmoud Saffarzadeh, Fatemeh Mohajeri
Abstract:
Road traffic accidents lead to a high rate of death and injury, especially among vulnerable road users such as pedestrians. Improving the safety of pedestrian facilities is a major concern for policymakers because of the high number of pedestrian fatalities and the direct and indirect costs imposed on society. This study focuses on determining the willingness of pedestrians to pay for increased safety while crossing the street. Three different scenarios are presented: crossing the street at a zebra crossing; crossing at a zebra crossing with a pedestrian traffic light installed; and crossing via a pedestrian bridge with an escalator. The research was conducted based on the stated preferences method. The required data were collected from a questionnaire that consisted of three parts: the pedestrian's demographic characteristics, travel characteristics, and the scenarios. Four different payment amounts are presented for each scenario, and a logit model has been built for each proposed payment. The results show that sex, age, education, average household income, and individual salary have a significant effect on choosing a scenario. Among the policies presented in the questionnaire scenarios, crossing at a zebra crossing with a pedestrian traffic light installed, at a willingness to pay of 10,000 Rials, is chosen most frequently, while crossing at a zebra crossing alone, at a willingness to pay of 100,000 Rials, is chosen least frequently. For all scenarios, willingness to pay decreases as the payment amount increases. Keywords: pedestrians, willingness to pay, safety, immunization
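A minimal sketch of fitting one binary logit per proposed payment amount, as the study design describes; the covariates, the synthetic acceptance rule, and the use of scikit-learn are assumptions, not the authors' specification.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Dummy stated-preference records: one row per respondent and proposed bid.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "bid": rng.choice([10_000, 25_000, 50_000, 100_000], size=n),
    "age": rng.integers(18, 70, size=n),
    "income": rng.normal(50, 15, size=n),          # assumed income unit
    "female": rng.integers(0, 2, size=n),
})
# Synthetic acceptance: higher bids are accepted less often.
logit = 2.0 - 0.00003 * df["bid"] + 0.02 * df["income"] - 0.01 * df["age"]
df["accept"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# One binary logit per proposed payment amount, as in the study design.
for bid, grp in df.groupby("bid"):
    X, y = grp[["age", "income", "female"]], grp["accept"]
    if y.nunique() < 2:
        continue                                    # skip degenerate groups
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(f"bid {bid:>7}: acceptance rate {y.mean():.2f}, "
          f"coefficients {np.round(model.coef_[0], 3)}")
```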
Procedia PDF Downloads 156
2472 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications
Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison
Abstract:
In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both cooling and heating demand of a large-scale building, aiming to reduce the fossil fuel consumption and greenhouse gas emissions in building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system annual primary energy consumption and the total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and annual CO2 emissions reduction of ~166 tonnes, as compared to a conventional HVAC system. The economics of this design, however, is not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller
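The multi-objective step amounts to retaining the non-dominated (Pareto-optimal) designs under the two conflicting objectives; the toy sketch below shows that dominance test on placeholder designs, without the genetic algorithm or the TRNSYS model used in the paper.

```python
# Minimal sketch of picking the non-dominated plant designs when minimising both
# annual primary energy use and total cost. Designs and values are placeholders.
designs = [
    {"name": "A", "collector_m2": 300, "energy_MWh": 820, "cost_k$": 410},
    {"name": "B", "collector_m2": 450, "energy_MWh": 760, "cost_k$": 455},
    {"name": "C", "collector_m2": 600, "energy_MWh": 735, "cost_k$": 540},
    {"name": "D", "collector_m2": 450, "energy_MWh": 790, "cost_k$": 470},  # dominated by B
]

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in one."""
    return (a["energy_MWh"] <= b["energy_MWh"] and a["cost_k$"] <= b["cost_k$"]
            and (a["energy_MWh"] < b["energy_MWh"] or a["cost_k$"] < b["cost_k$"]))

pareto = [d for d in designs if not any(dominates(o, d) for o in designs if o is not d)]
print([d["name"] for d in pareto])   # ['A', 'B', 'C']
```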
Procedia PDF Downloads 238
2471 Flow: A Fourth Musical Element
Authors: James R. Wilson
Abstract:
Music is typically defined as having the attributes of melody, harmony, and rhythm. In this paper, a fourth element is proposed -"flow". "Flow" is a new dimension in music that has always been present but only recently identified and measured. The Adagio "Flow Machine" enables us to envision this component and even suggests a new approach to music theory and analysis. The Adagio was created specifically to measure the underlying “flow” in music. The Adagio is an entirely new way to experience and visualize the music, to assist in performing music (both as a conductor and/or performer), and to provide a whole new methodology for music analysis and theory. The Adagio utilizes musical “hit points”, such as a transition from one musical section to another (for example, in a musical composition utilizing the sonata form, a transition from the exposition to the development section) to help define the compositions flow rate. Once the flow rate is established, the Adagio can be used to determine if the composer/performer/conductor has correctly maintained the proper rate of flow throughout the performance. An example is provided using Mozart’s Piano Concerto Number 21. Working with the Adagio yielded an unexpected windfall; it was determined via an empirical study conducted at Nova University’s Biofeedback Lab that watching the Adagio helped volunteers participating in a controlled experiment recover from stressors significantly faster than the control group. The Adagio can be thought of as a new arrow in the Musicologist's quiver. It provides a new, unique way of viewing the psychological impact and esthetic effectiveness of music composition. Additionally, with the current worldwide access to multi-media via the internet, flow analysis can be performed and shared with others with little time and/or expense.Keywords: musicology, music analysis, music flow, music therapy
Procedia PDF Downloads 177
2470 Defect Classification of Hydrogen Fuel Pressure Vessels using Deep Learning
Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim
Abstract:
Acoustic Emission Testing (AET) is widely used to test the structural integrity of an operational hydrogen storage container, and clustering algorithms are frequently used in pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is necessary to use a deep learning model to identify patterns in acoustic emission (AE) signal data that can be used to classify defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed using carbon fiber reinforced polymer composite (CFRP), a defect classification dataset is collected through a tensile test on a specimen of CFRP with an AE sensor attached. The performance of the classification model, using one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, smote data augmentation
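A minimal sketch of the 1-D CNN with SMOTE oversampling described above, assuming fixed-length AE waveforms and four defect classes; the network architecture, signal length, and synthetic data are assumptions rather than the authors' exact configuration.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from tensorflow.keras import layers, models

# Dummy AE waveforms: 1,000 signals of 2,048 samples, 4 imbalanced defect classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2048)).astype("float32")
y = rng.choice(4, size=1000, p=[0.7, 0.15, 0.1, 0.05])

# SMOTE works on 2-D feature matrices, so oversample before adding the channel axis.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
X_bal = X_bal[..., np.newaxis]                      # shape (N, 2048, 1)

model = models.Sequential([
    layers.Input(shape=(2048, 1)),
    layers.Conv1D(32, kernel_size=16, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(64, kernel_size=8, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_bal, y_bal, epochs=5, batch_size=64, validation_split=0.2)
```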
Procedia PDF Downloads 94
2469 Post Injury Experiences of New Immigrant Workers
Authors: Janki Shankar, Shu Ping Chen
Abstract:
Background: New immigrants are one of most vulnerable sections of the Canadian society. Unable to gain entry into Canada’s strictly regulated professions and trades, several skilled and qualified new immigrants take up precarious jobs without adequate occupational health and safety training, thereby increasing their risk of sustaining occupational injury and illness compared to Canadian born workers. Access to timely and appropriate support is critical for injured new immigrant workers who face additional challenges compared to Canadian born workers in accessing information and support post-injury. The purpose of our study was to explore the post-injury experiences and support needs of new immigrant workers who have sustained work-related injuries. Methods: Using an interpretive research approach and semi structured face to face qualitative interviews, 27 new immigrant workers from a range of industries operating in two cities in a province in Canada were interviewed. All had sustained work-related injuries and reported these to their work supervisors. A constant comparative approach was used to identify key themes across the worker experiences. Results: Findings reveal several factors that can shape the experiences of new immigrant workers and influence their return-to-work outcomes. Conclusion: Based on the insights of study participants, policies, practices, and potential interventions informed by their needs and preferences are proposed that can improve return to work outcomes for these workers.Keywords: new immigrant workers, post-injury experiences, return to work outcomes, qualified
Procedia PDF Downloads 101
2468 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs
Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres
Abstract:
Given a patent document, identifying distinct semantic annotations is an interesting research aspect. Text annotation helps the patent practitioners such as examiners and patent attorneys to quickly identify the key arguments of any invention, successively providing a timely marking of a patent text. In the process of manual patent analysis, to attain better readability, recognising the semantic information by marking paragraphs is in practice. This semantic annotation process is laborious and time-consuming. To alleviate such a problem, we proposed a dataset to train machine learning algorithms to automate the highlighting process. The contributions of this work are: i) we developed a multi-class dataset of size 150k samples by traversing USPTO patents over a decade, ii) articulated statistics and distributions of data using imperative exploratory data analysis, iii) baseline Machine Learning models are developed to utilize the dataset to address patent paragraph highlighting task, and iv) future path to extend this work using Deep Learning and domain-specific pre-trained language models to develop a tool to highlight is provided. This work assists patent practitioners in highlighting semantic information automatically and aids in creating a sustainable and efficient patent analysis using the aptitude of machine learning.Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval
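A minimal baseline of the kind the paper mentions, here a TF-IDF plus logistic-regression pipeline for multi-class paragraph labelling; the toy paragraphs and label names are placeholders and are not taken from the PaSA dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Toy stand-ins for patent paragraphs and their semantic labels; the real PaSA
# dataset and its label set are described in the paper, not reproduced here.
paragraphs = [
    "The invention relates to a method for encoding video frames.",
    "In the prior art, compression schemes suffer from blocking artifacts.",
    "According to one embodiment, the encoder selects a quantization parameter.",
    "A disadvantage of existing systems is the high memory bandwidth required.",
] * 50
labels = ["summary", "prior_art", "embodiment", "problem"] * 50

X_tr, X_te, y_tr, y_te = train_test_split(paragraphs, labels, test_size=0.2,
                                          stratify=labels, random_state=0)
baseline = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2),
                         LogisticRegression(max_iter=1000))
baseline.fit(X_tr, y_tr)
print(classification_report(y_te, baseline.predict(X_te)))
```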
Procedia PDF Downloads 91
2467 Caged in Concrete Jungles: Reasserting Cultural Identity and Environmental Sustainability through Material Choice and Design Expression in Architecture
Authors: Ikenna Michael Onuorah
Abstract:
The relentless march of globalization in architecture has led to a homogenization of built environments, often characterized by an overreliance on imported, resource-intensive materials and a disregard for local cultural contexts. This research posits that such practices pose significant environmental and cultural perils, trapping communities in "caged concrete jungles" devoid of both ecological sustainability and a meaningful connection to their heritage. Through a mixed-method approach encompassing quantitative and qualitative data analysis, the study investigated the impacts of neglecting local materials and cultural expression in architectural design. The research is anticipated to yield significant insights into the multifaceted consequences of neglecting locally available materials and cultural expression in architecture, and it builds a compelling case for reasserting local materials and cultural expression in architectural design. Based on the anticipated findings, the study proposes a series of actionable recommendations for architects, policymakers, and communities to promote sustainable and culturally sensitive built environments. This serves as a wake-up call, urging architects, policymakers, and communities to break free from the confines of "caged concrete jungles" and embrace a more sustainable and culturally sensitive approach to design. Keywords: sustainability, cultural identity, building materials, sustainable designs
Procedia PDF Downloads 56
2466 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data
Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, there is a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data seems clear and real-time stream data management becomes a hot topic. Sliding window model and frequent itemset mining over dynamic data are the most important problems in the context of data mining. Thus, sliding window model for frequent itemset mining is a widely used model for data stream mining due to its emphasis on recent data and its bounded memory requirement. These methods use the traditional transaction-based sliding window model where the window size is based on a fixed number of transactions. Actually, this model supposes that all transactions have a constant rate which is not suited for real-time applications. And the use of this model in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequents and infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution. The preliminary results are quite promising.Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query
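A minimal sketch of the timestamp-based sliding window idea: transactions are evicted by elapsed time rather than by count, and supports are recomputed over whatever currently sits in the window. The eviction policy, support threshold, and itemset size limit are illustrative choices, not the paper's tree-based structure.

```python
from collections import Counter, deque
from itertools import combinations
import time

class TimestampWindowMiner:
    """Keep only the transactions whose timestamp falls inside the last
    `window_seconds`, regardless of how many transactions arrive per second."""

    def __init__(self, window_seconds=60.0, min_support=0.3, max_size=2):
        self.window = window_seconds
        self.min_support = min_support
        self.max_size = max_size
        self.buffer = deque()               # (timestamp, frozenset(items))

    def add(self, items, ts=None):
        ts = time.time() if ts is None else ts
        self.buffer.append((ts, frozenset(items)))
        self._evict(ts)

    def _evict(self, now):
        while self.buffer and now - self.buffer[0][0] > self.window:
            self.buffer.popleft()           # expired by time, not by count

    def frequent_itemsets(self):
        counts = Counter()
        for _, txn in self.buffer:
            for k in range(1, self.max_size + 1):
                counts.update(map(frozenset, combinations(sorted(txn), k)))
        n = max(len(self.buffer), 1)
        return {iset: c / n for iset, c in counts.items() if c / n >= self.min_support}

miner = TimestampWindowMiner(window_seconds=10)
miner.add({"a", "b"}, ts=0.0)
miner.add({"a", "c"}, ts=4.0)
miner.add({"a", "b", "c"}, ts=9.0)
print(miner.frequent_itemsets())
```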
Procedia PDF Downloads 162
2465 Text as Reader Device Improving Subjectivity on the Role of Attestation between Interpretative Semiotics and Discursive Linguistics
Authors: Marco Castagna
Abstract:
The proposed paper aims to inquire into the relation between text and reader, focusing on the concept of 'attestation'. Indeed, despite being widely accepted in semiotic research, even today the concept of text remains uncertainly defined. So, it seems undeniable that what is called 'text' offers an image of internal cohesion and coherence that makes it possible to analyze it as an object. Nevertheless, this same object becomes problematic when it is pragmatically activated by the act of reading. In fact, like the T.A.R.D.I.S., the unique space-time vehicle used by the well-known BBC character Doctor Who in his adventures, every text appears to its own readers not only "bigger inside than outside" but also as offering spaces that change according to the different traveller standing in it. In a few words, as everyone knows, this singular condition raises questions about the gnosiological relation between text and reader. How can a text be considered the 'same', even if it can be read in different ways by different subjects? How can readers be provided in advance with the knowledge required for 'understanding' a text, and yet at the same time learn something more from it? In order to explain this singular condition, it seems useful to start thinking about text as a device more than an object. In other words, this unique status is more clearly understandable when 'text' ceases to be considered as a box designed to move meaning from a sender to a recipient (marking the semiotic priority of the "code") and starts to be recognized as a performative meaning hypothesis, that is, discursively configured by one or more forms and empirically perceivable by means of one or more substances. Thus, a text appears as a "semantic hanger", potentially offered to the "unending deferral of the interpretant" and from time to time fixed as an "instance of Discourse". In this perspective, every reading can be considered as an answer to the continuous request for confirming or denying the meaning configuration (the meaning hypothesis) expressed by the text. Finally, 'attestation' is exactly what regulates this dynamic of request and answer, through which the reader is able to confirm his previous hypotheses about reality or maybe acquire some new ones. Keywords: attestation, meaning, reader, text
Procedia PDF Downloads 237
2464 The Product Innovation Using Nutraceutical Delivery System on Improving Growth Performance of Broiler
Authors: Kitti Supchukun, Kris Angkanaporn, Teerapong Yata
Abstract:
The product innovation using a nutraceutical delivery system on improving the growth performance of broilers is the product planning and development to solve the antibiotics banning policy incurred in the local and global livestock production system. Restricting the use of antibiotics can reduce the quality of chicken meat and increase pathogenic bacterial contamination. Although other alternatives were used to replace antibiotics, the efficacy was inconsistent, reflecting on low chicken growth performance and contaminated products. The product innovation aims to effectively deliver the selected active ingredients into the body. This product is tested on the pharmaceutical lab scale and on the farm-scale for market feasibility in order to create product innovation using the nutraceutical delivery system model. The model establishes the product standardization and traceable quality control process for farmers. The study is performed using mixed methods. Starting with a qualitative method to find the farmers' (consumers) demands and the product standard, then the researcher used the quantitative research method to develop and conclude the findings regarding the acceptance of the technology and product performance. The survey has been sent to different organizations by random sampling among the entrepreneur’s population including integrated broiler farm, broiler farm, and other related organizations. The mixed-method results, both qualitative and quantitative, verify the user and lead users' demands since they provide information about the industry standard, technology preference, developing the right product according to the market, and solutions for the industry problems. The product innovation selected nutraceutical ingredients that can solve the following problems in livestock; bactericidal, anti-inflammation, gut health, antioxidant. The combinations of the selected nutraceutical and nanostructured lipid carriers (NLC) technology aim to improve chemical and pharmaceutical components by changing the structure of active ingredients into nanoparticle, which will be released in the targeted location with accurate concentration. The active ingredients in nanoparticle form are more stable, elicit antibacterial activity against pathogenic Salmonella spp and E.coli, balance gut health, have antioxidant and anti-inflammation activity. The experiment results have proven that the nutraceuticals have an antioxidant and antibacterial activity which also increases the average daily gain (ADG), reduces feed conversion ratio (FCR). The results also show a significant impact on the higher European Performance Index that can increase the farmers' profit when exporting. The product innovation will be tested in technology acceptance management methods from farmers and industry. The production of broiler and commercialization analyses are useful to reduce the importation of animal supplements. Most importantly, product innovation is protected by intellectual property.Keywords: nutraceutical, nano structure lipid carrier, anti-microbial drug resistance, broiler, Salmonella
Procedia PDF Downloads 179
2463 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area
Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras
Abstract:
Along with shear wave, many soil parameters are associated with the standard penetration test (SPT) as a dynamic in situ experiment. Both SPT-N data and geophysical data do not often exist in the same area. Statistical analysis of correlation between these parameters is an alternate method to estimate Vₛ conveniently and without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data like standard penetration test (SPT) N value, CPT (Cone Penetration Test) values, etc., to estimate shear wave velocity or dynamic soil parameters. The relation between Vs and SPT- N values of Algiers area is predicted using the collected data, and it is also compared with the previously suggested formulas of Vₛ determination by measuring Root Mean Square Error (RMSE) of each model. Algiers area is situated in high seismic zone (Zone III [RPA 2003: réglement parasismique algerien]), therefore the study is important for this region. The principal aim of this paper is to compare the field measurements of Down-hole test and the empirical models to show which one of these proposed formulas are applicable to predict and deduce shear wave velocity values.Keywords: empirical models, RMSE, shear wave velocity, standard penetration test
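A minimal sketch of the workflow described above: fit a site-specific Vs = a·N^b relation and compare its RMSE against generic literature-style correlations. The paired Vs/SPT data are synthetic, and the "literature" coefficients are illustrative placeholders, not the formulas actually evaluated for Algiers.

```python
import numpy as np
from scipy.optimize import curve_fit

# Paired down-hole Vs measurements (m/s) and SPT N-values -- dummy data, not the
# Algiers dataset used in the paper.
rng = np.random.default_rng(2)
N = rng.integers(5, 60, size=80).astype(float)
Vs_measured = 95.0 * N**0.33 * rng.normal(1.0, 0.12, size=80)

def power_law(N, a, b):                     # the usual Vs = a * N**b form
    return a * N**b

# Site-specific fit for the study area
(a, b), _ = curve_fit(power_law, N, Vs_measured, p0=(100.0, 0.3))

def rmse(pred):
    return float(np.sqrt(np.mean((Vs_measured - pred) ** 2)))

# Compare the local fit against a few generic literature-style correlations
# (coefficients here are illustrative placeholders, not the exact published ones).
candidates = {
    f"local fit (a={a:.1f}, b={b:.2f})": power_law(N, a, b),
    "generic correlation 1 (90 N^0.31)": power_law(N, 90.0, 0.31),
    "generic correlation 2 (80 N^0.35)": power_law(N, 80.0, 0.35),
}
for name, pred in candidates.items():
    print(f"{name}: RMSE = {rmse(pred):.1f} m/s")
```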
Procedia PDF Downloads 338
2462 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
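A minimal sketch of a fast-and-frugal placement heuristic of the kind described above: a short, ordered sequence of cues, each of which either decides edge vs. cloud or passes to the next cue. The cue names, thresholds, and dataset attributes are assumptions, not the paper's classification framework.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    latency_critical: bool      # needed for real-time machine protection?
    size_mb: float              # payload per acquisition window
    cross_site: bool            # must be fused with data from other machines/sites?

def place(ds, edge_budget_mb=50.0):
    """Fast-and-frugal cue sequence: each question either decides or passes on."""
    if ds.latency_critical:
        return "edge"                       # cue 1: react locally, skip the network
    if ds.cross_site:
        return "cloud"                      # cue 2: fusion needs the central store
    if ds.size_mb > edge_budget_mb:
        return "cloud"                      # cue 3: too heavy for the edge node
    return "edge"                           # default: process near the machine

for ds in [Dataset("spindle vibration burst", True, 120.0, False),
           Dataset("daily thermal drift log", False, 2.0, True),
           Dataset("tool-wear image batch", False, 400.0, False)]:
    print(f"{ds.name}: process on the {place(ds)}")
```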
Procedia PDF Downloads 182
2461 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
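A minimal sketch of the two denoising steps named above, signal averaging of repeated transients followed by soft wavelet thresholding, using PyWavelets; the wavelet, decomposition level, universal threshold, and synthetic decay curve are assumptions rather than the FASTSNAP processing chain.

```python
import numpy as np
import pywt

def denoise_tem(stacks, wavelet="db6", level=5):
    """Average repeated transients, then soft-threshold the wavelet detail
    coefficients (universal threshold). `stacks` has shape (n_repeats, n_samples)."""
    averaged = stacks.mean(axis=0)                         # signal-averaging step
    coeffs = pywt.wavedec(averaged, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # noise estimate, finest scale
    thr = sigma * np.sqrt(2.0 * np.log(len(averaged)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(averaged)]

# Synthetic decaying transient with noise, repeated 32 times (one acquisition mode).
t = np.linspace(1e-5, 1e-2, 2048)
clean = 1e-3 * t**-1.2
stacks = clean + np.random.default_rng(3).normal(0, 2e-2 * clean.max(), size=(32, t.size))
print("residual RMS:", np.sqrt(np.mean((denoise_tem(stacks) - clean) ** 2)))
```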
Procedia PDF Downloads 85
2460 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient encoding and decoding algorithms, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes, by contrast, can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are cleaning a signal of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first class is based on the multiplicative inverse in a finite field, and in the second construction the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes. Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
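A minimal sketch of the second robust-code construction, where the redundancy is the cube of the information part, here computed over GF(2^8) with a hand-rolled field multiplication; the field size and check procedure are illustrative, and the wavelet-based generator/check matrices of the paper are not reproduced.

```python
# Minimal sketch of the 'redundancy = cube of the information part' idea over
# GF(2^8); the paper builds this on wavelet-based linear codes, omitted here.

POLY = 0x11B  # x^8 + x^4 + x^3 + x + 1, an irreducible polynomial for GF(2^8)

def gf_mul(a, b, poly=POLY, m=8):
    """Carry-less multiplication modulo an irreducible polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):
            a ^= poly
    return r

def gf_cube(x):
    return gf_mul(x, gf_mul(x, x))

def encode(info_byte):
    """Codeword = (information part, redundancy part = cube of the information)."""
    return info_byte, gf_cube(info_byte)

def check(info_byte, redundancy):
    """Any disturbance of the cubic relation is flagged as an error."""
    return gf_cube(info_byte) == redundancy

x, r = encode(0x57)
print(check(x, r))            # True: no error
corrupted = x ^ 0x40          # flip one bit of the information part
print(check(corrupted, r))    # mismatch reveals the error (with high probability)
```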
Procedia PDF Downloads 489
2459 Outdoor Thermal Comfort Strategies: The Case of Cool Facades
Authors: Noelia L. Alchapar, Cláudia C. Pezzuto, Erica N. Correa
Abstract:
Mitigating urban overheating is key to achieving the environmental and energy sustainability of cities. The management of the optical properties of the materials that make up the urban envelope -roofing, pavement, and facades- constitutes a profitable and effective tool to improve the urban microclimate and rehabilitate urban areas. Each material that makes up the urban envelope has a different capacity to reflect received solar radiation, which alters the fraction of solar radiation absorbed by the city. However, the paradigm of increasing solar reflectance in all areas of the city without distinguishing their relative position within the urban canyon can cause serious problems of overheating and discomfort among its inhabitants. The hypothesis that supports the research postulates that not all reflective technologies that contribute to urban radiative cooling favor the thermal comfort conditions of pedestrians to equal measure. The objective of this work is to determine to what degree the management of the optical properties of the facades modifies outdoor thermal comfort, given that the mitigation potential of materials with high reflectance in facades is strongly conditioned by geographical variables and by the geometric characteristics of the urban profile aspect ratio (H/W). This research was carried out under two climatic contexts, that of the city of Mendoza-Argentina and that of the city of Campinas-Brazil, according to the Köppen climate classification: BWk and Cwa, respectively. Two areas in two different climatic contexts (Mendoza - Argentina and Campinas - Brazil) were selected. Both areas have comparable urban morphology patterns. These areas are located in a region with low horizontal building density and residential zoning. The microclimatic conditions were monitored during the summer period with temperature and humidity fixed sensors inside vial channels. The microclimate model was simulated in ENVI-Met V5. A grid resolution of 3.5 x 3.5 x 3.5m was used for both cities, totaling an area of 145x145x30 grids. Based on the validated theoretical model, ten scenarios were simulated, modifying the height of buildings and the solar reflectivity of facades. The solar reflectivity façades ranges were: low (0.3) and high (0.75). The density scenarios range from 1th to the 5th level. The study scenarios' performance was assessed by comparing the air temperature, physiological equivalent temperature (PET), and thermal climate index (UTCI). As a result, it is observed that the behavior of the materials of the urban outdoor space depends on complex interactions. Many urban environmental factors influence including constructive characteristics, urban morphology, geographic locations, local climate, and so forth. The role of the vertical urban envelope is decisive for the reduction of urban overheating. One of the causes of thermal gain is the multiple reflections within the urban canyon, which affects not only the air temperature but also the pedestrian thermal comfort. One of the main findings of this work leads to the remarkable importance of considering both the urban warming and the thermal comfort aspects of pedestrians in urban mitigation strategies.Keywords: materials facades, solar reflectivity, thermal comfort, urban cooling
Procedia PDF Downloads 92
2458 The Relationship between Coping Styles and Internet Addiction among High School Students
Authors: Adil Kaval, Digdem Muge Siyez
Abstract:
With the negative effects of internet use on a person's life, the use of the Internet has become an issue; this subject has mostly been investigated as internet addiction. In the literature, several theoretical models have been proposed to explain the reasons for internet addiction. In addition to these theoretical models, the coping style used for stressful events may be a predictor of internet addiction. The aim was to test with logistic regression the effect of high school students' coping styles on their internet addiction levels. The sample of the study consisted of 770 Turkish adolescents (471 girls, 299 boys) selected from high schools in İzmir province in the 2017-2018 academic year. The Internet Addiction Test, the Coping Scale for Child and Adolescents, and a demographic information form were used in this study. The results of the logistic regression analysis indicated that the model of coping styles provides a statistically significant prediction of internet addiction. Gender does not predict whether or not a student is addicted to the internet. The active coping style has no effect on internet addiction levels, while the avoiding and negative coping styles do. With this model, 79.1% of internet addiction among high school students is estimated. The Nagelkerke pseudo R² indicated that the model accounted for 35% of the total variance. The results of this study on Turkish adolescents are similar to the results of other studies in the literature. It can be argued that avoiding and negative coping styles are important risk factors in the development of internet addiction. Keywords: adolescents, coping, internet addiction, regression analysis
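A minimal sketch of the analysis pipeline described above, a binary logit with coping-style predictors plus a hand-computed Nagelkerke pseudo R² and classification rate, using statsmodels; the simulated sample and coefficients are stand-ins for the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Dummy stand-in for the survey data (coping scale scores + gender), not the
# actual sample of 770 Turkish adolescents.
rng = np.random.default_rng(4)
n = 770
df = pd.DataFrame({
    "active": rng.normal(0, 1, n),
    "avoiding": rng.normal(0, 1, n),
    "negative": rng.normal(0, 1, n),
    "girl": rng.integers(0, 2, n),
})
true_logit = -1.2 + 0.9 * df["avoiding"] + 0.7 * df["negative"]  # gender, active: no effect
df["addicted"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

X = sm.add_constant(df[["active", "avoiding", "negative", "girl"]])
res = sm.Logit(df["addicted"], X).fit(disp=0)
print(res.summary2().tables[1][["Coef.", "P>|z|"]])

# Nagelkerke pseudo R^2 from the fitted and null log-likelihoods
cox_snell = 1 - np.exp(2 * (res.llnull - res.llf) / res.nobs)
nagelkerke = cox_snell / (1 - np.exp(2 * res.llnull / res.nobs))
print(f"Nagelkerke pseudo R^2: {nagelkerke:.2f}")

# Classification rate at the 0.5 threshold
accuracy = ((res.predict(X) > 0.5) == df["addicted"]).mean()
print(f"correct classification rate: {accuracy:.1%}")
```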
Procedia PDF Downloads 174
2457 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features
Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis
Abstract:
Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly due to the fact that it is diagnosed via polysomnography which is a time and resource-intensive procedure. Screening the disease’s symptoms at home could be used as an alternative approach in order to alert individuals that potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool and herein present the approach and selection of specific sound features that discriminate snoring vs. environmental sounds, as well as the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) is built upon the snore detection tool and employed in whole-night sleep sound recordings resulting to a large dataset of snoring sound excerpts that are made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight to an individual’s sleep quality or as an independent component of OSAHS screening applications in future developments.Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks
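A minimal sketch of the feature-extraction half of such a snore detector: per-excerpt MFCC, spectral-centroid, and zero-crossing-rate features feeding a classifier. A random forest stands in here for the paper's neural network, and the excerpts are random noise rather than the published recordings.

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def sound_features(y, sr):
    """A compact feature vector per excerpt: MFCC means, spectral centroid,
    zero-crossing rate -- common choices for snore vs. background separation."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    zcr = librosa.feature.zero_crossing_rate(y).mean()
    return np.concatenate([mfcc, [centroid, zcr]])

# Placeholder corpus: 1-second excerpts at 16 kHz, labelled snore (1) / other (0).
sr = 16000
rng = np.random.default_rng(5)
excerpts = [rng.normal(0, 0.1, sr) for _ in range(60)]
labels = rng.integers(0, 2, 60)

X = np.vstack([sound_features(y, sr) for y in excerpts])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```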
Procedia PDF Downloads 207
2456 Design and Optimization of a Mini High Altitude Long Endurance (HALE) Multi-Role Unmanned Aerial Vehicle
Authors: Vishaal Subramanian, Annuatha Vinod Kumar, Santosh Kumar Budankayala, M. Senthil Kumar
Abstract:
This paper discusses the aerodynamic and structural design, simulation and optimization of a mini-High Altitude Long Endurance (HALE) UAV. The applications of this mini HALE UAV vary from aerial topological surveys, quick first aid supply, emergency medical blood transport, search and relief activates to border patrol, surveillance and estimation of forest fire progression. Although classified as a mini UAV according to UVS International, our design is an amalgamation of the features of ‘mini’ and ‘HALE’ categories, combining the light weight of the ‘mini’ and the high altitude ceiling and endurance of the HALE. Designed with the idea of implementation in India, it is in strict compliance with the UAS rules proposed by the office of the Director General of Civil Aviation. The plane can be completely automated or have partial override control and is equipped with an Infra-Red camera and a multi coloured camera with on-board storage or live telemetry, GPS system with Geo Fencing and fail safe measures. An additional of 1.5 kg payload can be attached to three major hard points on the aircraft and can comprise of delicate equipment or releasable payloads. The paper details the design, optimization process and the simulations performed using various software such as Design Foil, XFLR5, Solidworks and Ansys.Keywords: aircraft, endurance, HALE, high altitude, long range, UAV, unmanned aerial vehicle
Procedia PDF Downloads 397
2455 Optimal 3D Deployment and Path Planning of Multiple Uavs for Maximum Coverage and Autonomy
Authors: Indu Chandran, Shubham Sharma, Rohan Mehta, Vipin Kizheppatt
Abstract:
Unmanned aerial vehicles are increasingly being explored as the most promising solution to disaster monitoring, assessment, and recovery. Current relief operations heavily rely on intelligent robot swarms to capture the damage caused, provide timely rescue, and create road maps for the victims. To perform these time-critical missions, efficient path planning that ensures quick coverage of the area is vital. This study aims to develop a technically balanced approach to provide maximum coverage of the affected area in a minimum time using the optimal number of UAVs. A coverage trajectory is designed through area decomposition and task assignment. To perform efficient and autonomous coverage mission, solution to a TSP-based optimization problem using meta-heuristic approaches is designed to allocate waypoints to the UAVs of different flight capacities. The study exploits multi-agent simulations like PX4-SITL and QGroundcontrol through the ROS framework and visualizes the dynamics of UAV deployment to different search paths in a 3D Gazebo environment. Through detailed theoretical analysis and simulation tests, we illustrate the optimality and efficiency of the proposed methodologies.Keywords: area coverage, coverage path planning, heuristic algorithm, mission monitoring, optimization, task assignment, unmanned aerial vehicles
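A minimal sketch of solving the per-UAV waypoint ordering as a TSP with a simple heuristic, nearest-neighbour construction followed by 2-opt improvement; this stands in for the meta-heuristic solver mentioned in the abstract, and the waypoints are random placeholders.

```python
import numpy as np

def tour_length(points, order):
    path = points[order]
    return float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))

def nearest_neighbour(points, start=0):
    unvisited = set(range(len(points)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[order[-1]]
        nxt = min(unvisited, key=lambda j: np.linalg.norm(points[j] - last))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

def two_opt(points, order):
    improved = True
    while improved:
        improved = False
        for i in range(1, len(order) - 2):
            for j in range(i + 1, len(order) - 1):
                new = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                if tour_length(points, new) < tour_length(points, order):
                    order, improved = new, True
    return order

# Waypoints assigned to one UAV after area decomposition (dummy coordinates, metres).
waypoints = np.random.default_rng(6).uniform(0, 500, size=(15, 2))
order = two_opt(waypoints, nearest_neighbour(waypoints))
print("visit order:", order, "length [m]:", round(tour_length(waypoints, order), 1))
```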
Procedia PDF Downloads 215
2454 Numerical Simulation of Transient 3D Temperature and Kerf Formation in Laser Fusion Cutting
Authors: Karim Kheloufi, El Hachemi Amara
Abstract:
In the present study, a three-dimensional transient numerical model was developed to study the temperature field and cutting kerf shape during laser fusion cutting. The finite volume model has been constructed, based on the Navier–Stokes equations and energy conservation equation for the description of momentum and heat transport phenomena, and the Volume of Fluid (VOF) method for free surface tracking. The Fresnel absorption model is used to handle the absorption of the incident wave by the surface of the liquid metal and the enthalpy-porosity technique is employed to account for the latent heat during melting and solidification of the material. To model the physical phenomena occurring at the liquid film/gas interface, including momentum/heat transfer, a new approach is proposed which consists of treating friction force, pressure force applied by the gas jet and the heat absorbed by the cutting front surface as source terms incorporated into the governing equations. All these physics are coupled and solved simultaneously in Fluent CFD®. The main objective of using a transient phase change model in the current case is to simulate the dynamics and geometry of a growing laser-cutting generated kerf until it becomes fully developed. The model is used to investigate the effect of some process parameters on temperature fields and the formed kerf geometry.Keywords: laser cutting, numerical simulation, heat transfer, fluid flow
Procedia PDF Downloads 339
2453 Revealing the Potential of Geotourism and Geoheritage of Gedangsari Area, Yogyakarta
Authors: Cecilia Jatu, Adventino
Abstract:
Gedangsari, located in Gunungkidul, Yogyakarta Province, meets several criteria for designation as a new geosite. The research area lies in the southern mountain zone of Java and is composed of five rock formations ranging from Oligocene to Middle Miocene in age. The purpose of this study is to reveal the geotourism and geoheritage potential to be proposed as a new geosite and to produce a geosite map of Gedangsari. The research method is descriptive data collection, covering quantitative geological data, geotourism, and heritage sites, supported by petrographic analysis, structural geology, geological mapping, and SWOT analysis. The geological data show that Gedangsari consists of igneous (intrusive), pyroclastic, and sedimentary rocks. This gives rise to a varied and distinctive geomorphology. Geotourism sites in Gedangsari include Luweng Sampang Canyon, the Gedangsari Bouma Sequence, the Watugajah Columnar Joint, the Gedangsari Marine Fan Sediment, and Tegalrejo Waterfall. There is also Tegalrejo Village, which can be considered a geoheritage site because of its culture and traditional batik cloth. According to the results of the SWOT analysis, the Gedangsari geosite must be developed and appropriately promoted in order to strengthen its standing. The development of the geosite area will have a significant impact on the economic growth of the surrounding community and can be used by the government as baseline information for sustainable development. In addition, an educational map of the geological conditions and geotourism locations of the Gedangsari geosite can increase people's knowledge of the area. Keywords: Gedangsari, geoheritage, geotourism, geosite
Procedia PDF Downloads 123
2452 High Piezoelectric and Magnetic Performance Achieved in the Lead-free BiFeO3-BaTiO3 Ceramics by Defect Engineering
Authors: Muhammad Habib, Xuefan Zhou, Lin Tang, Guoliang Xue, Fazli Akram, Dou Zhang
Abstract:
The defect engineering approach is well established for customizing the functional properties of perovskite ceramics. In modern technology, high multiferroic properties for elevated-temperature applications are in great demand. In this work, the Bi-nonstoichiometric lead-free 0.67Biy-xSmxFeO3-0.33BaTiO3 ceramics (Sm-doped BF-BT for Bi-excess, y = 1.03, and Bi-deficient, y = 0.975, with x = 0.00, 0.04 and 0.08) were designed for high-temperature multiferroic properties. Enhanced piezoelectric (d33 ≈ 250 pC/N and d33* ≈ 350 pm/V) and magnetic properties (Mr ≈ 0.25 emu/g) with a high Curie temperature (TC ≈ 465 ℃) were obtained in the Bi-deficient pure BF-BT ceramics. With Sm-doping (x = 0.04), the TC decreases to 350 ℃, and a significant improvement occurs in d33*, which reaches 504 pm/V and 450 pm/V for the Bi-excess and Bi-deficient compositions, respectively. The structural origin of the enhanced piezoelectric strain performance is related to the soft ferroelectric effect induced by Sm-doping and a reversible phase transition from the short-range relaxor ferroelectric state to long-range order under the applied electric field. However, only a slight change occurs in Mr (≈ 0.28 emu/g) with Sm-doping for the Bi-deficient ceramics, whereas the Bi-excess ceramics show completely paramagnetic behavior. Hence, the origin of the high magnetic properties in the Bi-deficient BF-BT ceramics is mainly attributed to the proposed double exchange mechanism. We believe that this strategy will provide a new perspective for the development of lead-free multiferroic ceramics for high-temperature applications. Keywords: BiFeO3-BaTiO3, lead-free piezoceramics, magnetic properties, defect engineering
Procedia PDF Downloads 134
2451 Evaluation of the Power Generation Effect Obtained by Inserting a Piezoelectric Sheet in the Backlash Clearance of a Circular Arc Helical Gear
Authors: Barenten Suciu, Yuya Nakamoto
Abstract:
The power generation effect obtained by inserting a piezoelectric sheet in the backlash clearance of a circular arc helical gear is evaluated. This type of screw gear is preferred since, in comparison with the involute tooth profile, the circular arc profile leads to reduced stress-concentration effects and improved life of the piezoelectric film. First, the geometry of the circular arc helical gear and the properties of the piezoelectric sheet are presented. Then, a description of the test rig, consisting of a right-hand thread gear meshing with a left-hand thread gear, and the voltage measurement procedure are given. After creating the tridimensional (3D) model of the meshing gears in SolidWorks, they are 3D-printed in acrylonitrile butadiene styrene (ABS) resin. The variation of the generated voltage versus time during a meshing cycle of the circular arc helical gear is measured for various values of the center distance. Then, the variation of the maximal, minimal, and peak-to-peak voltage with the center distance is illustrated. The optimal center distance of the gear, at which the voltage is maximized, is found, and its significance is discussed. These results show that the contact pressure of the meshing gears can be measured and that electrical power can be generated by employing the proposed technique. Keywords: circular arc helical gear, contact problem, optimal center distance, piezoelectric sheet, power generation
Procedia PDF Downloads 167
2450 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser
Authors: Guanqiao Wang, Hongyang Yu
Abstract:
There is a lot of repetitive work in the traditional construction industry, and replacing these manual tasks with robots can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are very high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for wall plastering, so the requirements for construction positioning accuracy are higher; the traditional navigation and positioning methods have a large error, which causes the robot to move without an exact position, so the wall cannot be plastered or the plastering error is large. A new positioning method is proposed that is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning result. In operation, filtering, edge detection, the Hough transform, and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the standard value, and the robot is moved or rotated to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method. Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing
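A minimal OpenCV sketch of the image-processing chain named above (filtering, edge detection, Hough transform, comparison with the standard value); the thresholds, the near-vertical-line assumption, and the synthetic test frame are illustrative choices, not the robot's actual parameters.

```python
import cv2
import numpy as np

def laser_line_offset(frame_bgr, target_x_px):
    """Detect the (near-vertical) projected laser line and return its horizontal
    offset from the commanded position, in pixels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)             # filtering step
    edges = cv2.Canny(blurred, 50, 150)                     # edge detection step
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=10)  # Hough transform step
    if lines is None:
        return None
    # Keep near-vertical segments and average their x position.
    xs = [(x1 + x2) / 2 for x1, y1, x2, y2 in lines[:, 0] if abs(x2 - x1) < 10]
    if not xs:
        return None
    return float(np.mean(xs)) - target_x_px                 # compare with the standard value

# Synthetic frame with a bright vertical line at x = 320; the robot would translate
# or rotate until the returned offset falls within tolerance.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.line(frame, (320, 0), (320, 479), (255, 255, 255), 3)
print("offset [px]:", laser_line_offset(frame, target_x_px=300))
```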
Procedia PDF Downloads 148