Search results for: deep vibro techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8506

7606 Explaining the Relationship between Religiosity and Resilience

Authors: Rita Phillips, Mark Burgess, Maga Berlinski

Abstract:

Although the positive impact of religiosity on well-being, health, and life-coping abilities is well known, research to date has failed to provide scientific evidence for the reasons behind this relationship. The present study therefore took a qualitative approach, examining how religiosity interacts with coping in emotionally distressful situations, of which wedding preparations are an example. Wedding preparations, related to the experience of ambiguous emotions, can cause phases of high distress. Although weddings are per se religious ceremonies, they are also socially scripted and characterized by people’s striving for personally meaningful celebrations. The negotiation of these many influences can evoke conflicts. To reveal components of religiosity which contribute to stress resolution, eight biographic-narrative interviews with recently married spouses were conducted. Participants were drawn from different nationalities and Catholic deep-belief communities in order to determine factors independent of national culture and social subgroup. The audio-recorded, transcribed and translated interviews were analyzed by Interpretative Phenomenological Analysis. Contrary to previous research on wedding-related conflicts, but in line with quantitative accounts of the relation between stress resilience and religiosity, the present study found participants reporting very low levels of distress and ambiguity. Although similar areas of potential conflict were revealed, deep-belief Christians seemed to handle them differently. Participants freed themselves from their own and others’ rigid mundane expectations through spiritual preparation and a focus on a divine instance. This evoked a feeling of perceived closeness to God and of unconditional love, resulting in acceptance of oneself and others. By relativizing mundane goods, participants perceived absolute freedom. Thus belief did not supplement coping strategies previously defined in the literature, but substituted for them. The paper suggests that in explaining the connection between stress resilience and religiosity, one’s perception and experience of unconditional love might outweigh other social or personal factors. However, further qualitative investigation is needed to fully explain the phenomenon.

Keywords: deep-belief, religiosity, resilience, wedding

Procedia PDF Downloads 244
7605 Classification of Foliar Nitrogen in Common Bean (Phaseolus vulgaris L.) Using Deep Learning Models and Images

Authors: Marcos Silva Tavares, Jamile Raquel Regazzo, Edson José de Souza Sardinha, Murilo Mesquita Baesso

Abstract:

Common beans are a widely cultivated and consumed legume globally, serving as a staple food for humans, especially in developing countries, due to their nutritional characteristics. Nitrogen (N) is the most limiting nutrient for productivity, and foliar analysis is crucial to ensure balanced nitrogen fertilization. Excessive N applications can cause, either in isolation or cumulatively, soil and water contamination and plant toxicity, and increase susceptibility to diseases and pests. However, the quantification of N using conventional methods is time-consuming and costly, demanding new technologies to optimize the adequate supply of N to plants. Thus, it becomes necessary to establish constant monitoring of the foliar content of this macronutrient, mainly at the V4 stage, aiming at precision management of nitrogen fertilization. In this work, the objective was to evaluate the performance of a deep learning model, ResNet-50, in the classification of foliar nitrogen in common beans using RGB images. The BRS Estilo cultivar was sown in a greenhouse in a completely randomized design with four nitrogen doses (T1 = 0 kg N ha-1, T2 = 25 kg N ha-1, T3 = 75 kg N ha-1, and T4 = 100 kg N ha-1) and 12 replications. Pots with 5 L capacity were used with a substrate composed of 43% soil (Neossolo Quartzarênico), 28.5% crushed sugarcane bagasse, and 28.5% cured bovine manure. The plants were supplied with 5 mm of water per day. The application of urea (45% N) and the acquisition of images occurred 14 and 32 days after sowing, respectively. Code developed in Matlab© R2022b was used to cut the original images into smaller blocks, creating an image bank composed of four folders representing the four classes, labeled T1, T2, T3, and T4, each containing 500 images of 224x224 pixels obtained from plants cultivated under the different N doses. The Matlab© R2022b software was also used for the implementation and performance analysis of the model. Efficiency was evaluated by a set of metrics, including accuracy (AC), F1-score (F1), specificity (SP), area under the curve (AUC), and precision (P). ResNet-50 showed high performance in the classification of foliar N levels in common beans, with an AC of 85.6%. The F1 for classes T1, T2, T3, and T4 was 76, 72, 74, and 77%, respectively. This study revealed that the use of RGB images combined with deep learning can be a promising alternative to slow laboratory analyses, capable of optimizing the estimation of foliar N. This can allow rapid intervention by the producer to achieve higher productivity and less fertilizer waste. Future approaches are encouraged to develop mobile devices capable of handling images using deep learning for the classification of the nutritional status of plants in situ.
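
As an illustration of the transfer-learning step described above, the minimal PyTorch sketch below fine-tunes an ImageNet-pretrained ResNet-50 on a four-class image folder (T1 to T4). It is an analogue of the authors' Matlab workflow, not their code; the folder layout, batch size, learning rate, and epoch count are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical folder layout: data/train/T1 ... data/train/T4, 224x224 RGB tiles.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 4)   # four N-dose classes (T1-T4)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(10):
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```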

Keywords: convolutional neural network, residual network 50, nutritional status, artificial intelligence

Procedia PDF Downloads 19
7604 Geology, Geomorphology and Genesis of Andarokh Karstic Cave, North-East Iran

Authors: Mojtaba Heydarizad

Abstract:

Andarokh basin is one of the main karstic regions in Khorasan Razavi province, NE Iran. This basin is part of the Kopeh-Dagh mega zone, extending from the Caspian Sea in the west to northern Afghanistan in the east. The basin is covered by the Mozdooran Formation, the Ngr evaporative formation, and Quaternary alluvium deposits, in descending order of age. The Mozdooran carbonate formation is notably karstified. The main surface karstic features in the Mozdooran formation are groove karren, cleft karren, rain pits, rill karren, tritt karren, kamenitzas, domes, and table karren. In addition to these surface features, a deep karstic feature, Andarokh Cave, also exists in the region. Studying the Ca, Mg, Mn, Sr and Fe concentrations and the Sr/Mn ratio in Mozdooran formation samples with distance to the main fault and joint systems, using PCA analyses, demonstrates the intense role of meteoric diagenesis in controlling carbonate rock geochemistry. Karst evolution in the Andarokh basin varies from early-stage 'deep-seated karst' in the Mesozoic to a mature karstic system ('exhumed karst') in the Quaternary period. Andarokh cave (the main cave in the Andarokh basin) is a rudimentary branchwork cave consisting of three passages, A, B and C, and two entrances, Andarokh and Sky.

Keywords: Andarokh basin, Andarokh cave, geochemical analyses, karst evaluation

Procedia PDF Downloads 154
7603 Identification of Promising Infant Clusters to Obtain Improved Block Layout Designs

Authors: Mustahsan Mir, Ahmed Hassanin, Mohammed A. Al-Saleh

Abstract:

The layout optimization of building blocks of unequal areas has applications in many disciplines including VLSI floorplanning, macrocell placement, unequal-area facilities layout optimization, and plant or machine layout design. A number of heuristics and some analytical and hybrid techniques have been published to solve this problem. This paper presents an efficient high-quality building-block layout design technique especially suited for solving large-size problems. The higher efficiency and improved quality of optimized solutions are made possible by introducing the concept of Promising Infant Clusters in a constructive placement procedure. The results presented in the paper demonstrate the improved performance of the presented technique for benchmark problems in comparison with published heuristic, analytic, and hybrid techniques.

Keywords: block layout problem, building-block layout design, CAD, optimization, search techniques

Procedia PDF Downloads 386
7602 GIS for Simulating Air Traffic by Applying Different Multi-radar Positioning Techniques

Authors: Amara Rafik, Bougherara Maamar, Belhadj Aissa Mostefa

Abstract:

Radar data is one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit them to the ATM system. For greater reliability, these radars are positioned so that their coverage areas overlap. An aircraft will therefore be detected by at least one of these radars. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. The ATM system must therefore calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e. a geographical database on the one hand and geographical processing on the other. The objective of this work is to propose a GIS for traffic simulation which reconstructs the evolution of aircraft positions over time from a multi-source radar data set, applying these different multi-radar positioning techniques.
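
One simple way to compute a single radar track from overlapping plots is inverse-variance weighting of the positions reported by each radar. The sketch below illustrates only that one technique; the paper compares several, and the radar variances and coordinates here are assumed values in a projected coordinate system.

```python
from dataclasses import dataclass

@dataclass
class Plot:
    x: float        # position reported by one radar (e.g., metres in a projected CRS)
    y: float
    var: float      # assumed positional variance of that radar at this range

def fuse_plots(plots):
    """Combine overlapping radar plots of the same aircraft into one track point
    using inverse-variance weighting (one of several possible fusion techniques)."""
    wsum = sum(1.0 / p.var for p in plots)
    x = sum(p.x / p.var for p in plots) / wsum
    y = sum(p.y / p.var for p in plots) / wsum
    return x, y

# Example: the same aircraft seen by two radars with different accuracies.
track = fuse_plots([Plot(1000.0, 2000.0, 25.0), Plot(1010.0, 1995.0, 100.0)])
print(track)
```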

Keywords: ATM, GIS, radar data, air traffic simulation

Procedia PDF Downloads 85
7601 Bridging Urban Planning and Environmental Conservation: A Regional Analysis of Northern and Central Kolkata

Authors: Tanmay Bisen, Aastha Shayla

Abstract:

This study introduces an advanced approach to tree canopy detection in urban environments and a regional analysis of Northern and Central Kolkata that delves into the intricate relationship between urban development and environmental conservation. Leveraging high-resolution drone imagery from diverse urban green spaces in Kolkata, we fine-tuned the deep forest model to enhance its precision and accuracy. Our results, characterized by an impressive Intersection over Union (IoU) score of 0.90 and a mean average precision (mAP) of 0.87, underscore the model's robustness in detecting and classifying tree crowns amidst the complexities of aerial imagery. This research not only emphasizes the importance of model customization for specific datasets but also highlights the potential of drone-based remote sensing in urban forestry studies. The study investigates the spatial distribution, density, and environmental impact of trees in Northern and Central Kolkata. The findings underscore the significance of urban green spaces in metropolitan cities, emphasizing the need for sustainable urban planning that integrates green infrastructure for ecological balance and human well-being.
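
The IoU score of 0.90 reported above is computed per predicted tree-crown box against its ground-truth annotation; a minimal sketch of the metric for axis-aligned boxes follows (the example coordinates are arbitrary).

```python
def iou(box_a, box_b):
    """Intersection over Union for axis-aligned boxes given as (xmin, ymin, xmax, ymax)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Example: a predicted crown box vs. its annotated ground truth.
print(iou((10, 10, 60, 60), (15, 12, 62, 58)))
```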

Keywords: urban greenery, advanced spatial distribution analysis, drone imagery, deep learning, tree detection

Procedia PDF Downloads 56
7600 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking

Authors: Noga Bregman

Abstract:

Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
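
The weighted combination of binary cross-entropy losses mentioned above can be sketched as follows in PyTorch; the per-task weights and the 6000-sample window length are illustrative assumptions, not values published in the abstract.

```python
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()

def eqmamba_style_loss(det_logits, p_logits, s_logits,
                       det_target, p_target, s_target,
                       w_det=0.3, w_p=0.35, w_s=0.35):
    """Weighted sum of per-task binary cross-entropy losses for detection,
    P-picking and S-picking. The weights here are illustrative only."""
    return (w_det * bce(det_logits, det_target)
            + w_p * bce(p_logits, p_target)
            + w_s * bce(s_logits, s_target))

# Example with dummy per-sample logits/targets over a 6000-sample waveform window.
B, T = 8, 6000
loss = eqmamba_style_loss(torch.randn(B, T), torch.randn(B, T), torch.randn(B, T),
                          torch.rand(B, T).round(), torch.rand(B, T).round(),
                          torch.rand(B, T).round())
print(loss.item())
```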

Keywords: earthquake, detection, phase picking, s waves, p waves, transformer, deep learning, seismic waves

Procedia PDF Downloads 52
7599 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 259
7598 Vision-Based Collision Avoidance for Unmanned Aerial Vehicles by Recurrent Neural Networks

Authors: Yao-Hong Tsai

Abstract:

Advances in sensor technology have made video surveillance the main means of security control in every big city in the world. Surveillance is usually used by governments for intelligence gathering, the prevention of crime, the protection of a process, person, group or object, or the investigation of crime. Many surveillance systems based on computer vision technology have been developed in recent years. Moving target tracking is the most common task for an Unmanned Aerial Vehicle (UAV): finding and tracking objects of interest in mobile aerial surveillance for civilian applications. This paper focuses on vision-based collision avoidance for UAVs using recurrent neural networks. First, images from cameras on the UAV are fused with a deep convolutional neural network. Then, a recurrent neural network is constructed to obtain high-level image features for object tracking and to extract low-level image features for noise reduction. The system distributes the computation between local and cloud platforms to efficiently perform object detection, tracking and collision avoidance across multiple UAVs. Experiments on several challenging datasets showed that the proposed algorithm outperforms state-of-the-art methods.

Keywords: unmanned aerial vehicle, object tracking, deep learning, collision avoidance

Procedia PDF Downloads 160
7597 Predicting Costs in Construction Projects with Machine Learning: A Detailed Study Based on Activity-Level Data

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
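
A minimal sketch of the Random Forest part of such a pipeline is shown below: fit a regressor on activity-level features and read off feature importances to surface cost drivers. The feature names and the synthetic data are hypothetical stand-ins for the case-study dataset.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical activity-level dataset; column names are illustrative only.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "planned_duration_days": rng.integers(5, 120, 500),
    "scope_change_count": rng.integers(0, 6, 500),
    "material_delay_days": rng.integers(0, 30, 500),
    "crew_size": rng.integers(2, 25, 500),
})
# Synthetic target just to make the sketch runnable.
df["cost_overrun_pct"] = (2.5 * df["scope_change_count"]
                          + 0.8 * df["material_delay_days"]
                          + rng.normal(0, 3, 500))

X, y = df.drop(columns="cost_overrun_pct"), df["cost_overrun_pct"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out activities:", rf.score(X_te, y_te))
print(pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False))
```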

Keywords: cost prediction, machine learning, project management, random forest, neural networks

Procedia PDF Downloads 54
7596 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through the use of novel rapid processing techniques such as screen printing and near-infrared (NIR) radiative curing, the process time for the sintering of nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.

Keywords: batteries, energy, iron, nickel, storage

Procedia PDF Downloads 439
7595 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines

Authors: K. Shaji Mon, P. R. John Sreenidhi

Abstract:

In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This trend is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of times in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) should play the role of digital-to-analog converters in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. This paper addresses the glitches present in delay circuits along with area, power dissipation and signal integrity. The digitally controlled delay lines (DCDL) under study have been designed in a 90 nm CMOS technology with six copper metal layers, strained SiGe, and low-k dielectric. Simulation and synthesis results show that the novel circuits exhibit no glitches for the dual-output coarse DCDL, with less power dissipation and less area compared to the glitch-free NAND-based DCDL.

Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer

Procedia PDF Downloads 245
7594 Dynamic Distribution Calibration for Improved Few-Shot Image Classification

Authors: Majid Habib Khan, Jinwei Zhao, Xinhong Hei, Liu Jiedong, Rana Shahzad Noor, Muhammad Imran

Abstract:

Deep learning is increasingly employed in image classification, yet the scarcity and high cost of labeled data for training remain a challenge. Limited samples often lead to overfitting due to biased sample distribution. This paper introduces a dynamic distribution calibration method for few-shot learning. Initially, base and new class samples undergo normalization to mitigate disparate feature magnitudes. A pre-trained model then extracts feature vectors from both classes. The method dynamically selects distribution characteristics from base classes (both adjacent and remote) in the embedding space, using a threshold value approach for new class samples. Given the propensity of similar classes to share feature distributions like mean and variance, this research assumes a Gaussian distribution for feature vectors. Subsequently, distributional features of new class samples are calibrated using a corrected hyperparameter, derived from the distribution features of both adjacent and distant base classes. This calibration augments the new class sample set. The technique demonstrates significant improvements, with up to 4% accuracy gains in few-shot classification challenges, as evidenced by tests on miniImagenet and CUB datasets.
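
The calibration idea can be sketched as follows: borrow mean and covariance statistics from the base classes nearest to the new class in the embedding space, then sample synthetic features from the calibrated Gaussian. This NumPy sketch replaces the paper's adjacent/remote threshold rule and corrected hyperparameter with a plain k-nearest selection and a fixed spreading term, so it is a simplification.

```python
import numpy as np

def calibrate_and_sample(support_feat, base_means, base_covs, k=2, alpha=0.2,
                         n_samples=100, seed=0):
    """Calibrate a Gaussian for a new class from its few support features plus
    statistics borrowed from the k nearest base classes, then draw synthetic
    features to augment the new class sample set."""
    rng = np.random.default_rng(seed)
    query_mean = support_feat.mean(axis=0)
    dists = np.linalg.norm(base_means - query_mean, axis=1)
    nearest = np.argsort(dists)[:k]                       # closest base classes in embedding space
    mean = (base_means[nearest].sum(axis=0) + query_mean) / (k + 1)
    cov = base_covs[nearest].mean(axis=0) + alpha * np.eye(len(mean))
    return rng.multivariate_normal(mean, cov, n_samples)  # augmented feature set

# Toy example with 64-d features, 10 base classes and a 5-shot new class.
D, C = 64, 10
base_means = np.random.randn(C, D)
base_covs = np.stack([np.eye(D) for _ in range(C)])
support = np.random.randn(5, D)
print(calibrate_and_sample(support, base_means, base_covs).shape)  # (100, 64)
```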

Keywords: deep learning, computer vision, image classification, few-shot learning, threshold

Procedia PDF Downloads 66
7593 An Automatic Speech Recognition of Conversational Telephone Speech in Malay Language

Authors: M. Draman, S. Z. Muhamad Yassin, M. S. Alias, Z. Lambak, M. I. Zulkifli, S. N. Padhi, K. N. Baharim, F. Maskuriy, A. I. A. Rahim

Abstract:

The performance of a Malay automatic speech recognition (ASR) system for the call centre environment is presented. The system utilizes the Kaldi toolkit as the platform providing the libraries and algorithms used to perform the ASR task. The acoustic model implemented in this system uses a deep neural network (DNN) to model the acoustic signal, and a standard n-gram model for language modelling. With 80 hours of training data from the call centre recordings, the ASR system achieves 72% accuracy, corresponding to a 28% word error rate (WER). Testing was done using 20 hours of audio data. Despite the implementation of the DNN, the system shows low accuracy owing to the variety of noise, accents and dialects that typically occur in the Malaysian call centre environment. This significant variation between speakers is reflected by the large standard deviation of the average word error rate (WERav) (i.e., ~10%). The lowest WER (13.8%) was obtained from a recording sample of a native speaker with a standard Malay dialect (central Malaysia), compared with the highest WER of 49% for a sample containing conversation from a speaker using a non-standard Malay dialect.
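
The accuracy figures above follow directly from the word error rate (accuracy is roughly 100% minus WER). A minimal sketch of the standard word-level edit-distance computation of WER is given below; the example transcript is invented.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate = (substitutions + deletions + insertions) / reference length,
    computed with a standard word-level edit-distance dynamic program."""
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[len(ref)][len(hyp)] / max(1, len(ref))

# Example (invented Malay call-centre transcript snippet):
print(wer("saya nak semak baki akaun", "saya nak semak akaun"))  # 0.2 -> 20% WER
```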

Keywords: conversational speech recognition, deep neural network, Malay language, speech recognition

Procedia PDF Downloads 322
7592 Interactive Effects of Challenge-Hindrance Stressors and Core Self-Evaluations on In-Role and Extra-Role Performance

Authors: Khansa Hayat

Abstract:

Organizational stress is a vital phenomenon that has deep roots in management, psychology, and organizational behavior research. Meanwhile, positive psychology, which focuses on the positive strengths of humans rather than traditional negativity-oriented research, has emerged as a separate branch of organizational behavior. The current study investigates the interactive effects of challenge and hindrance stressors and core self-evaluations (CSEs) of the individual on job performance, including in-role and extra-role performance. The study also aims to investigate the supporting/buffering role of human dispositions (i.e., self-esteem, self-efficacy, locus of control and emotional stability). The results show that challenge stressors have a significant positive effect on the in-role and extra-role performance of the individual. The findings indicate that core self-evaluations strengthen the relationship between challenge stressors and the in-role performance of the individual. In the case of hindrance stressors, core self-evaluations lessen their negative impact and allow the individual to perform at a better and more normal level even when hindrance stressors are high. The relationship to, and implications of, conservation of resources theory are also discussed. The limitations, future research directions and implications of the study are also presented.

Keywords: challenge-hindrance stressors, core self evaluations, in-role performance, extra-role performance

Procedia PDF Downloads 277
7591 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment

Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha

Abstract:

When a company is considering purchasing software for commercial use, contract risk assessment is critical to identify risks and mitigate the potential adverse business impact, e.g., security, financial and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and time to evaluate the legal documents manually. Specifically, validating contracts for a software vendor requires the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate the process, we propose a framework to assist legal contract document risk identification, leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation problems, our framework utilizes pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Due to the limited labelled dataset for training, we leveraged transfer learning by fine-tuning the models on the CUAD dataset to enhance performance. On a dataset comprising 287 contract documents and 2000 labelled samples, our best model achieved an F1 score of 0.687.
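
A hedged sketch of the extractive question-answering step is shown below, using the Hugging Face pipeline API. The checkpoint is a generic SQuAD-tuned model standing in for the CUAD-fine-tuned model described above, and the contract file and risk questions are placeholders.

```python
from transformers import pipeline

# A generic extractive QA checkpoint; in practice a model fine-tuned on CUAD
# (as described in the abstract) would be swapped in here.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

contract_text = open("contract.txt").read()          # hypothetical contract document
risk_questions = [                                    # illustrative pre-defined risk questions
    "What is the limitation of liability?",
    "Under what conditions can the agreement be terminated?",
]

for q in risk_questions:
    result = qa(question=q, context=contract_text)
    # 'start'/'end' in result give the character span of the extracted clause.
    print(f"{q}\n  -> {result['answer']} (score={result['score']:.3f})")
```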

Keywords: contract risk assessment, NLP, transfer learning, question answering

Procedia PDF Downloads 129
7590 A Machine Learning Approach for Efficient Resource Management in Construction Projects

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.

Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management

Procedia PDF Downloads 39
7589 Exploring the Concept of Fashion Waste: Hanging by a Thread

Authors: Timothy Adam Boleratzky

Abstract:

The goal of this transformative endeavour lies in the repurposing of textile scraps, heralding a renaissance in the creation of wearable art. Through a judicious fusion of Life Cycle Assessment (LCA) methodologies and cutting-edge techniques, this research embarks upon a voyage of exploration, unraveling the intricate tapestry of environmental implications woven into the fabric of textile waste. Delving deep into the annals of empirical evidence and scholarly discourse, the study not only elucidates the urgent imperative for waste reduction strategies but also unveils the transformative potential inherent in embracing circular economy principles within the hallowed halls of fashion. As the research unfurls its sails, guided by the compass of sustainability, it traverses uncharted territories, charting a course toward a more enlightened and responsible fashion ecosystem. The canvas upon which this journey unfolds is richly adorned with insights gleaned from the crucible of experimentation, laying bare the myriad pathways toward waste minimisation and resource optimisation. From the adoption of recycling strategies to the cultivation of eco-friendly production techniques, the research endeavours to sculpt a blueprint for a more sustainable future, one stitch at a time. In this unfolding narrative, the role of wearable art emerges as a potent catalyst for change, transcending the boundaries of conventional fashion to embrace a more holistic ethos of sustainability. Through the alchemy of creativity and craftsmanship, discarded textile scraps are imbued with new life, morphing into exquisite creations that serve as both a testament to human ingenuity and a rallying cry for environmental preservation. Each thread, each stitch, becomes a silent harbinger of change, weaving together a tapestry of hope in a world besieged by ecological uncertainty. As the research journey culminates, its echoes resonate far beyond the confines of academia, reverberating through the corridors of industry and beyond. In its wake, it leaves a legacy of empowerment and enlightenment, inspiring a generation of designers, entrepreneurs, and consumers to embrace a more sustainable vision of fashion. For in the intricate interplay of threads and textiles lies the promise of a brighter, more resilient future, where beauty coexists harmoniously with responsibility and where fashion becomes not merely an expression of style but a celebration of sustainability.

Keywords: fabric-manipulation, sustainability, textiles, waste, wearable-art

Procedia PDF Downloads 43
7588 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
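
A minimal PyTorch sketch of one per-sensor LSTM autoencoder and its reconstruction-difference signal (the input to the downstream random forest) is given below; the window length, hidden size, and training data are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class SensorAE(nn.Module):
    """One LSTM autoencoder per sensor: encode a window of normal readings,
    reconstruct it, and use the reconstruction difference as the anomaly signal."""
    def __init__(self, n_features=1, hidden=32, seq_len=60):
        super().__init__()
        self.seq_len = seq_len
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        _, (h, _) = self.encoder(x)             # summary of the window
        z = h[-1].unsqueeze(1).repeat(1, self.seq_len, 1)
        dec, _ = self.decoder(z)
        return self.out(dec)

ae = SensorAE()
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

normal_windows = torch.randn(256, 60, 1)        # placeholder for normal sensor windows
for _ in range(5):                              # short illustrative training loop
    recon = ae(normal_windows)
    loss = loss_fn(recon, normal_windows)
    opt.zero_grad(); loss.backward(); opt.step()

# Difference signal used as input to the downstream feature extraction + random forest.
test_window = torch.randn(1, 60, 1)
residual = (ae(test_window) - test_window).abs()
print(residual.mean().item())
```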

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 194
7587 Monte Carlo Simulations of LSO/YSO for Dose Evaluation in Photon Beam Radiotherapy

Authors: H. Donya

Abstract:

Monte Carlo (MC) techniques play a fundamental role in radiotherapy. Two non-water-equivalent media were used to evaluate the dose in water. For this purpose, the Lu2SiO5 (LSO) and Y2SiO5 (YSO) orthosilicate scintillators were chosen for MC simulation using the Penelope code. To achieve higher efficiency in the dose calculation, variance reduction techniques are discussed. The overall results of this investigation indicate that the LSO/YSO bi-media combination is a good way to tackle the over-response issue in dynamic photon radiotherapy.

Keywords: Lu2SiO5 (LSO) and Y2SiO5 (YSO) orthosilicates, Monte Carlo, correlated sampling, radiotherapy

Procedia PDF Downloads 407
7586 Stabilizing Effects of Deep Eutectic Solvents on Alcohol Dehydrogenase Mediated Systems

Authors: Fatima Zohra Ibn Majdoub Hassani, Ivan Lavandera, Joseph Kreit

Abstract:

This study explored the effects of different organic solvents, temperature, and the amount of glycerol on the alcohol dehydrogenase (ADH)-catalysed stereoselective reduction of different ketones. The conversions were analyzed by gas chromatography. It was found that increasing the amount of deep eutectic solvents (DES) can improve the stereoselectivity of the enzyme, although it reduces its ability to convert the substrate into the corresponding alcohol. Moreover, glycerol was found to have a strong stabilizing effect on the ADH from Ralstonia sp. (E. coli/RasADH). In the case of organic solvents, the best conversions into the alcohols were achieved with DMSO and hexane. It was also observed that increasing the temperature decreased the ability of the enzyme to convert the substrates into the products and also affected the selectivity. In addition, recycling the DES up to three times gave good conversion and enantiomeric excess results, and glycerol showed a positive effect on the stability of various ADHs. Using RasADH, good conversion and enantiomeric excess towards the S-alcohol were obtained. It was found that an increase in temperature negated the stabilizing effect of glycerol and decreased the stereoselectivity of the enzyme. However, for other ADHs a temperature increase had an opposite, positive effect, especially with the ADH-T from Thermoanaerobium sp. One of the objectives of this study was to examine the effect of cofactors such as NAD(P) on the biocatalytic activities of ADHs.

Keywords: alcohol dehydrogenases, DES, gas chromatography, RasADH

Procedia PDF Downloads 193
7585 Optimization Techniques of Doubly-Fed Induction Generator Controller Design for Reliability Enhancement of Wind Energy Conversion Systems

Authors: Om Prakash Bharti, Aanchal Verma, R. K. Saket

Abstract:

The Doubly-Fed Induction Generator (DFIG) is suggested for Wind Energy Conversion Systems (WECS) to extract wind power. The DFIG is preferably employed due to its robustness towards variable wind and rotor speeds. It is adaptable because the system parameters, including real power, reactive power, DC-link voltage, and the transient and dynamic responses, can be handled smoothly, and these need to be analyzed constantly. The analysis becomes more important during any unusual condition in the electrical power system. Hence, the system parameters and transient response performance of the DFIG need to be studied and improved using suitable control techniques. To accomplish this, the present work implements and compares optimization methods for the design of the DFIG controller for WECS. Bio-inspired optimization techniques are applied to obtain the optimal controller design parameters for the DFIG-based WECS. The optimized DFIG controllers are then used to retrieve the transient response performance of the sixth-order DFIG model with a step input. Results obtained using MATLAB/Simulink show that the Firefly algorithm (FFA) outperforms the other controller design techniques.
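
For reference, a plain Firefly Algorithm can be sketched as below; it is shown tuning two controller gains against a placeholder quadratic cost, whereas the paper evaluates a transient-response cost on the sixth-order DFIG model.

```python
import numpy as np

def firefly_minimize(obj, bounds, n_fireflies=20, n_iter=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Plain Firefly Algorithm: brighter (lower-cost) fireflies attract dimmer ones,
    with attractiveness decaying with distance. A generic tuner, not the paper's
    exact formulation or cost function."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_fireflies, dim))
    cost = np.array([obj(xi) for xi in x])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:                       # j is brighter, so i moves toward j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    cost[i] = obj(x[i])
        alpha *= 0.97                                       # gradually reduce randomness
    best = np.argmin(cost)
    return x[best], cost[best]

# Illustrative use: tune two controller gains (Kp, Ki) against a placeholder cost;
# in the paper this would be a transient-response metric of the DFIG model.
placeholder_cost = lambda k: (k[0] - 1.2) ** 2 + (k[1] - 0.4) ** 2
print(firefly_minimize(placeholder_cost, bounds=[(0, 5), (0, 2)]))
```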

Keywords: doubly-fed induction generator, wind turbine, wind energy conversion system, induction generator, transfer function, proportional, integral, derivatives

Procedia PDF Downloads 93
7584 The Role of Sustainable Development in the Design and Planning of Smart Cities Using GIS Techniques: Models of Arab Cities

Authors: Ahmed M. Jihad

Abstract:

The paper presents the concept of sustainable development and the role of geographic techniques in the design, planning and presentation of maps of smart cities with a geographical vision, together with the identification of suitable programs, tools, and models of maps of Arab cities. The research problem lies in how to apply and operate these programs, and in the role of geographic techniques in planning and mapping the optimal locations for these cities. The paper proposes an addition to the designs of Iraqi cities, which can be developed in the future to serve as models for interactive smart cities by developing their services. The importance of this paper stems from the dynamic concept of sustainable development, which has become a mode of development imposed by the present era of rapid change in order to achieve social balance; the paper argues that sustainable development can be ensured through the use of information technology. The paper first presents the theoretical importance of the concept of development and of the design tools and programs, and then follows a systems analysis approach using the latest software. The results suggest that new Iraqi cities can be developed with smart technologies, like some of the Arab and European cities newly created through international investment, and that plans can therefore be made to select the best programs for producing maps and smart cities in the future.

Keywords: geographic techniques, planning the cities, smart cities, sustainable development

Procedia PDF Downloads 210
7583 The Application of Nuclear Energy for Sustainable Agriculture and Food Security: A Review

Authors: Gholamreza Farrokhi, Behzad Sani

Abstract:

The goals of sustainable agriculture are development, improved nutrition, and food security. Sustainable agriculture must be developed to meet today’s needs for food and other products while preserving the vital natural resource base that will allow future generations to meet their needs. Sustainable development requires international cooperation and the effective use of technology. Access to sustainable sources of food will remain a preeminent challenge in the decades to come. Based upon current practice and consumption, agricultural production will have to increase by about 70% by 2050 to meet demand. Nuclear techniques are used in developing countries to increase production sustainably by breeding improved crops, enhancing livestock reproduction and nutrition, and controlling animal and plant pests and diseases. Post-harvest losses can be reduced and safety increased with nuclear technology. Soil can be evaluated with nuclear techniques to conserve and improve soil productivity and water management.

Keywords: food safety, food security, nuclear techniques, sustainable agriculture, sustainable future

Procedia PDF Downloads 357
7582 Predicting Subsurface Abnormalities Growth Using Physics-Informed Neural Networks

Authors: Mehrdad Shafiei Dizaji, Hoda Azari

Abstract:

The research explores the pioneering integration of Physics-Informed Neural Networks (PINNs) into the domain of Ground-Penetrating Radar (GPR) data prediction, akin to advancements in medical imaging for tracking tumor progression in the human body. This research presents a detailed development framework for a specialized PINN model proficient at interpreting and forecasting GPR data, much like how medical imaging models predict tumor behavior. By harnessing the synergy between deep learning algorithms and the physical laws governing subsurface structures—or, in medical terms, human tissues—the model effectively embeds the physics of electromagnetic wave propagation into its architecture. This ensures that predictions not only align with fundamental physical principles but also mirror the precision needed in medical diagnostics for detecting and monitoring tumors. The suggested deep learning structure comprises three components: a CNN, a spatial feature channel attention (SFCA) mechanism, and ConvLSTM, along with temporal feature frame attention (TFFA) modules. The attention mechanism computes channel attention and temporal attention weights using self-adaptation, thereby fine-tuning the visual and temporal feature responses to extract the most pertinent and significant visual and temporal features. By integrating physics directly into the neural network, our model has shown enhanced accuracy in forecasting GPR data. This improvement is vital for conducting effective assessments of bridge deck conditions and other evaluations related to civil infrastructure. The use of Physics-Informed Neural Networks (PINNs) has demonstrated the potential to transform the field of Non-Destructive Evaluation (NDE) by enhancing the precision of infrastructure deterioration predictions. Moreover, it offers a deeper insight into the fundamental mechanisms of deterioration, viewed through the prism of physics-based models.
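
The core PINN idea described above, adding a physics residual to the data-fitting loss, can be sketched compactly. The example below uses a 1-D wave equation as an illustrative stand-in for the EM propagation physics and a small fully connected network; the paper's actual CNN/SFCA/ConvLSTM/TFFA architecture and physics terms are far richer than this sketch.

```python
import torch
import torch.nn as nn

# Minimal PINN: a network u(x, t) is penalised for mismatching data and for
# violating a governing PDE. A 1-D wave equation u_tt = c^2 u_xx stands in for
# the EM propagation physics here; c is an assumed normalised wave speed.
net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
c = 0.3

def physics_residual(x, t):
    x.requires_grad_(True); t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_tt = torch.autograd.grad(u_t, t, torch.ones_like(u_t), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_tt - c**2 * u_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x_data, t_data = torch.rand(128, 1), torch.rand(128, 1)
u_data = torch.sin(x_data) * torch.cos(t_data)          # placeholder "measured" data
x_col, t_col = torch.rand(256, 1), torch.rand(256, 1)   # collocation points for the PDE term

for step in range(200):
    loss = nn.functional.mse_loss(net(torch.cat([x_data, t_data], 1)), u_data) \
           + 0.1 * physics_residual(x_col, t_col).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
print(loss.item())
```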

Keywords: physics-informed neural networks, deep learning, ground-penetrating radar (GPR), NDE, ConvLSTM, physics, data driven

Procedia PDF Downloads 40
7581 Effective Supply Chain Coordination with Hybrid Demand Forecasting Techniques

Authors: Gurmail Singh

Abstract:

An effective supply chain is a main priority of every organization and is the outcome of strategic corporate investment with deliberate management action. A value-driven supply chain is defined through development, procurement, and the configuration of appropriate resources, metrics and processes. However, the responsiveness of the supply chain can be improved by proper coordination. The Bullwhip effect (BWE) and Net stock amplification (NSAmp) values were therefore anticipated and used for the control of inventory in organizations by both a discrete wavelet transform-artificial neural network (DWT-ANN) and an adaptive network-based fuzzy inference system (ANFIS). This work presents a comparative forecasting methodology for customer demand, which is non-linear in nature, for a multilevel supply chain structure, using hybrid techniques such as artificial intelligence techniques, including artificial neural networks (ANN) and the adaptive network-based fuzzy inference system (ANFIS), together with discrete wavelet theory (DWT). The effectiveness of these forecasting models is shown by computing the bullwhip effect and net stock amplification from real-world problem data. The results showed that these parameters were comparatively lower in the case of the discrete wavelet transform-artificial neural network (DWT-ANN) model and when using the adaptive network-based fuzzy inference system (ANFIS).
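
The bullwhip effect and net stock amplification referred to above are commonly computed as variance ratios of orders (or net stock) to customer demand. The sketch below simulates a single echelon with a base-stock policy and a simple moving-average forecast as a stand-in for the DWT-ANN and ANFIS forecasters; the lead time, forecast window, and demand statistics are assumed.

```python
import numpy as np

def bullwhip_metrics(demand, lead_time=2, window=4):
    """Simulate one echelon ordering with a base-stock policy and a moving-average
    demand forecast, then return the bullwhip effect (BWE) and net stock
    amplification (NSAmp) as variance ratios against customer demand."""
    orders, net_stock = [], []
    inventory = lead_time * np.mean(demand[:window])
    in_transit = [np.mean(demand[:window])] * lead_time   # orders already placed
    for t in range(window, len(demand)):
        forecast = np.mean(demand[t - window:t])          # moving-average forecast
        base_stock = forecast * (lead_time + 1)
        inventory += in_transit.pop(0) - demand[t]
        order = max(0.0, base_stock - inventory - sum(in_transit))
        in_transit.append(order)
        orders.append(order)
        net_stock.append(inventory)
    d = demand[window:]
    return np.var(orders) / np.var(d), np.var(net_stock) / np.var(d)

rng = np.random.default_rng(1)
demand = rng.normal(100, 10, 500)                         # stand-in for noisy customer demand
bwe, nsamp = bullwhip_metrics(demand)
print(f"BWE = {bwe:.2f}, NSAmp = {nsamp:.2f}")
```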

Keywords: bullwhip effect, hybrid techniques, net stock amplification, supply chain flexibility

Procedia PDF Downloads 127
7580 Prediction of Music Track Popularity: A Machine Learning Approach

Authors: Syed Atif Hassan, Luv Mehta, Syed Asif Hassan

Abstract:

Hit song science is a field of investigation wherein machine learning techniques are applied to music tracks in order to extract features from audio signals which can capture information that could explain the popularity of the respective tracks. Record companies invest huge amounts of money into recruiting fresh talent and churning out new music each year. Gaining insight into the basis of why a song becomes popular will result in tremendous benefits for the music industry. This paper aims to extract basic musical and more advanced acoustic features from songs while also taking into account external factors that play a role in making a particular song popular. We use a dataset derived from popular Spotify playlists divided by genre. We use ten genres (blues, classical, country, disco, hip-hop, jazz, metal, pop, reggae, rock), chosen on the basis of clear to ambiguous delineation in their typical sound. We feed these features into three different classifiers, namely an SVM with an RBF kernel, a deep neural network, and a recurrent neural network, to build separate predictive models, choosing the best-performing model at the end. Predicting song popularity is particularly important for the music industry as it would allow record companies to produce better content for the masses, resulting in a more competitive market.
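
A sketch of such a pipeline is shown below: per-track audio descriptors (a small subset of the musical and acoustic features mentioned above) and a comparison of two of the classifier families. The file paths are placeholders, and synthetic features stand in for the Spotify-derived dataset so that the classifier comparison runs as-is.

```python
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def track_features(path):
    """A few basic musical/acoustic descriptors for one track (a small subset
    of the feature set a full pipeline would use)."""
    y, sr = librosa.load(path, duration=60)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
    tempo = float(np.atleast_1d(tempo)[0])
    return np.concatenate([mfcc, [centroid, tempo]])

# In the real pipeline, X would stack track_features(...) over the playlist dataset,
# e.g. track_features("tracks/blues_001.mp3"). Synthetic features stand in here.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))
y = (X[:, 0] + 0.5 * X[:, 14] + rng.normal(0, 0.5, 200) > 0).astype(int)  # popular or not

for name, clf in [("SVM (RBF kernel)", SVC(kernel="rbf", gamma="scale")),
                  ("feed-forward NN", MLPClassifier(hidden_layer_sizes=(64, 32),
                                                    max_iter=1000))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```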

Keywords: classifier, machine learning, music tracks, popularity, prediction

Procedia PDF Downloads 663
7579 Inversely Designed Chipless Radio Frequency Identification (RFID) Tags Using Deep Learning

Authors: Madhawa Basnayaka, Jouni Paltakari

Abstract:

Fully passive backscattering chipless RFID tags are an emerging wireless technology with low cost, higher reading distance, and fast automatic identification without human interference, unlike already available technologies like optical barcodes. The design optimization of chipless RFID tags is crucial as it requires replacing integrated chips found in conventional RFID tags with printed geometric designs. These designs enable data encoding and decoding through backscattered electromagnetic (EM) signatures. The applications of chipless RFID tags have been limited due to the constraints of data encoding capacity and the ability to design accurate yet efficient configurations. The traditional approach to accomplishing design parameters for a desired EM response involves iterative adjustment of design parameters and simulating until the desired EM spectrum is achieved. However, traditional numerical simulation methods encounter limitations in optimizing design parameters efficiently due to the speed and resource consumption. In this work, a deep learning neural network (DNN) is utilized to establish a correlation between the EM spectrum and the dimensional parameters of nested centric rings, specifically square and octagonal. The proposed bi-directional DNN has two simultaneously running neural networks, namely spectrum prediction and design parameters prediction. First, spectrum prediction DNN was trained to minimize mean square error (MSE). After the training process was completed, the spectrum prediction DNN was able to accurately predict the EM spectrum according to the input design parameters within a few seconds. Then, the trained spectrum prediction DNN was connected to the design parameters prediction DNN and trained two networks simultaneously. For the first time in chipless tag design, design parameters were predicted accurately after training bi-directional DNN for a desired EM spectrum. The model was evaluated using a randomly generated spectrum and the tag was manufactured using the predicted geometrical parameters. The manufactured tags were successfully tested in the laboratory. The amount of iterative computer simulations has been significantly decreased by this approach. Therefore, highly efficient but ultrafast bi-directional DNN models allow rapid and complicated chipless RFID tag designs.
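
The forward/inverse pairing described above can be sketched with two small networks: a forward net mapping ring dimensions to a spectrum, and an inverse net trained through it. For brevity this sketch trains the two networks in sequence (a tandem variant) on synthetic data, whereas the paper trains its bi-directional DNN simultaneously on EM-simulation data; all dimensions and the toy mapping are assumptions.

```python
import torch
import torch.nn as nn

# Toy stand-in data: 8 geometric parameters of nested rings -> 64-point EM spectrum.
# In the paper, this mapping comes from full-wave EM simulations.
torch.manual_seed(0)
params = torch.rand(2000, 8)
basis = torch.randn(8, 64)
spectra = torch.sin(params @ basis)

forward_net = nn.Sequential(nn.Linear(8, 128), nn.ReLU(), nn.Linear(128, 64))   # params -> spectrum
inverse_net = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 8))   # spectrum -> params

# Stage 1: train the forward (spectrum-prediction) network alone.
opt_f = torch.optim.Adam(forward_net.parameters(), lr=1e-3)
for _ in range(500):
    loss = nn.functional.mse_loss(forward_net(params), spectra)
    opt_f.zero_grad(); loss.backward(); opt_f.step()

# Stage 2: train the inverse (design-parameter) network through the frozen forward
# net, so predicted geometries are judged by the spectrum they would produce.
for p in forward_net.parameters():
    p.requires_grad_(False)
opt_i = torch.optim.Adam(inverse_net.parameters(), lr=1e-3)
for _ in range(500):
    pred_params = inverse_net(spectra)
    loss = nn.functional.mse_loss(forward_net(pred_params), spectra)
    opt_i.zero_grad(); loss.backward(); opt_i.step()

# Inverse design for a target spectrum: predicted ring dimensions in one forward pass.
target = spectra[:1]
print(inverse_net(target))
```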

Keywords: artificial intelligence, chipless RFID, deep learning, machine learning

Procedia PDF Downloads 50
7578 Digital Watermarking Using Fractional Transform and (k,n) Halftone Visual Cryptography (HVC)

Authors: R. Rama Kishore, Sunesh Malik

Abstract:

The growing use of the internet for different purposes in recent times creates a great threat to the copyright protection of digital images. Digital watermarking is the best way to address this problem. This paper presents a detailed review of the different watermarking techniques and the latest trends in the field, categorized into spatial and transform domain, blind and non-blind methods, visible and invisible techniques, etc. It also discusses the different optimization techniques used in the field of watermarking in order to improve the robustness and imperceptibility of the methods. Different measures for evaluating the performance of a watermarking algorithm are discussed. Finally, this paper proposes a watermarking algorithm using (k,n) shares of halftone visual cryptography (HVC) instead of (2,2) share cryptography. (k,n)-share visual cryptography improves the security of the watermark. As halftoning is a reprographic method, it helps improve the visual quality of the watermark image. The proposed method uses a fractional transform to improve the robustness of the copyright protection.

Keywords: digital watermarking, fractional transform, halftone, visual cryptography

Procedia PDF Downloads 355
7577 Embedded Hw-Sw Reconfigurable Techniques For Wireless Sensor Network Applications

Authors: B. Kirubakaran, C. Rajasekaran

Abstract:

Reconfigurable techniques are used in many engineering and industrial applications for efficient data transmission through wireless sensor networks. Nowadays, most industrial applications try to minimize size and cost. At runtime, reconfigurable techniques avoid unwanted hangs and delays in system performance. The Field Programmable Gate Array (FPGA) is currently one of the most efficient reconfigurable devices and is widely used for hardware and software reconfiguration applications. The main objective of this paper is that any changes made to the hardware and software at runtime should not affect the currently running process; the changes are made in parallel, while at the same time addressing cost and power problems during data transmission and reception. An analog temperature sensor serves as the input to the controller (PIC), through which the FPGA digital sensors are controlled in a generalized manner.

Keywords: field programmable gate array, peripheral interrupt controller, runtime reconfigurable techniques, wireless sensor networks

Procedia PDF Downloads 407