Search results for: openings in deep beams
1817 A Smart Contract Project: Peer-to-Peer Energy Trading with Price Forecasting in Microgrid
Authors: Şakir Bingöl, Abdullah Emre Aydemir, Abdullah Saado, Ahmet Akıl, Elif Canbaz, Feyza Nur Bulgurcu, Gizem Uzun, Günsu Bilge Dal, Muhammedcan Pirinççi
Abstract:
Smart contracts, which can be applied in many different areas, from financial applications to the internet of things, come to the fore with their security, low cost, and self-executing features. This paper focuses on peer-to-peer (P2P) energy trading and the implementation of a smart contract on the Ethereum blockchain. A microgrid is assumed to consist of consumers and prosumers that can produce solar and wind energy. In the proposed architecture, the prosumer submits a purchase or sale request to the smart contract, while the maximum price is obtained from the distribution system operator (DSO) by forecasting. The aim is to forecast the hourly maximum unit price of energy using deep learning instead of fixed pricing, which makes the system more reliable through more dynamic and accurate pricing. For this purpose, Istanbul's energy generation, energy consumption, and market clearing price data were used. The consistency of the available data and the forecasting results is observed and discussed with graphs.
Keywords: energy trading smart contract, deep learning, microgrid, forecasting, Ethereum, peer to peer
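The abstract does not specify the forecasting model; as a minimal sketch of the idea, the hourly maximum clearing price could be forecast with a small LSTM over a univariate hourly price series. The CSV name, column name, and window length below are assumptions, not values from the paper.

```python
# Minimal sketch (not from the paper): forecasting the hourly maximum unit price with an LSTM.
import numpy as np
import pandas as pd
import tensorflow as tf

WINDOW = 24  # use the previous 24 hourly prices to predict the next hour (assumed)

prices = pd.read_csv("market_clearing_price.csv")["max_price"].to_numpy(dtype="float32")
scale = prices.max()
prices /= scale  # simple normalization

X = np.stack([prices[i:i + WINDOW] for i in range(len(prices) - WINDOW)])[..., None]
y = prices[WINDOW:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(WINDOW, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, validation_split=0.1)

next_hour_price = model.predict(prices[-WINDOW:][None, :, None])[0, 0] * scale
print(f"Forecast hourly maximum price: {next_hour_price:.2f}")
```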
Procedia PDF Downloads 138
1816 Traditional Values and Their Adaptation in Social Housing Design: Towards a New Typology and Establishment of 'Airhouse' Standard in Malaysia
Authors: Mohd Firrdhaus Mohd Sahabuddin, Cristina Gonzalez-Longo
Abstract:
Large-scale migration from rural areas to urban areas such as Kuala Lumpur has had implications for economic, social, and cultural development. This growing population has placed enormous demand on the existing housing stock, especially for low-income groups. However, several issues arise, one of which is overheated indoor air temperature. This problem contributes to high energy usage, forcing large sums of money to be spent on cooling houses with mechanical equipment. This study therefore focuses on thermal comfort in social housing and incorporates traditional values into its design to achieve a measurable level of natural ventilation in a house. According to the study, the carbon emissions and energy consumption of an air-conditioned house are 67% and 66% higher, respectively, than those of a naturally ventilated house. The research therefore proposes a new typology design with a large exposed wall area and full-length openings on opposite walls to increase cross ventilation. Finally, a thermal comfort standard for a naturally ventilated building, called 'AirHouse', has been identified.
Keywords: tropical architecture, natural ventilation, passive design, AirHouse, social housing design
Procedia PDF Downloads 676
1815 Numerical Prediction of Width Crack of Concrete Dapped-End Beams
Authors: Jatziri Y. Moreno-Martinez, Arturo Galvan, Xavier Chavez Cardenas, Hiram Arroyo
Abstract:
Several methods have been used to predict cracking of concrete structures under loading. Finite element analysis is an alternative that shows good results. The aim of this work was the numerical study of crack width in reinforced concrete beams with dapped ends, which are frequently found in bridge girders and precast concrete construction. Properly restricting cracking is an important aspect of the design of dapped ends; it has been observed that cracks exceeding the allowable widths are unacceptable in an environment aggressive to reinforcing steel. To simulate the crack width, the discrete crack approach was considered by means of a Cohesive Zone Model (CZM) using a function to represent the crack opening. Two dapped-end cases were constructed and tested in the Structures and Materials Laboratory of the Engineering Institute of UNAM. The first case considers reinforcement based on hangers as well as vertical and horizontal rings; in the second case, 50% of the vertical stirrups in the dapped end to the main part of the beam were replaced by an equivalent (vertically projected) area of diagonal bars. The loading protocol consisted of applying symmetrical loading up to the service load. The models were built using the software package ANSYS v. 16.2. The concrete was modeled using three-dimensional solid elements (SOLID65), capable of cracking in tension and crushing in compression. A Drucker-Prager yield surface was used to include plastic deformations. The reinforcement was introduced with a smeared approach. Interface delamination was modeled by traditional fracture mechanics methods such as the nodal release technique, adopting softening relationships between tractions and separations, which in turn introduce a critical fracture energy that is also the energy required to break apart the interface surfaces; this technique is the CZM. The interface surfaces of the materials are represented by surface-to-surface contact elements (CONTA173) with initially bonded contact. The Mode I dominated bilinear CZM model assumes that the separation of the material interface is dominated by the displacement jump normal to the interface. Furthermore, crack opening was characterized by the maximum normal contact stress, the contact gap at the completion of debonding, and the maximum equivalent tangential contact stress. The contact elements were placed at the re-entrant corner of the crack. To validate the proposed approach, the results obtained with this procedure were compared with the experimental tests. A good correlation between the experimental and numerical load-displacement curves was obtained, and the numerical models also made it possible to obtain load-crack width curves. In these two cases, the proposed model confirms its capability of predicting the maximum crack width, with an error of ±30%. Finally, the orientation of the crack is fundamental for the prediction of crack width. The results regarding crack width can be considered good from a practical point of view, and the load-displacement curve of the test and the location of the crack were reproduced with favorable results.
Keywords: cohesive zone model, dapped-end beams, discrete crack approach, finite element analysis
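As an illustration of the bilinear cohesive-zone idea described above (a sketch only, not the ANSYS CONTA173 implementation; the maximum normal contact stress and the gap at complete debonding are placeholder values), the normal traction can be evaluated as a function of the crack opening:

```python
# Minimal sketch of a Mode I bilinear traction-separation law (values are illustrative,
# not those used in the paper's ANSYS model).
import numpy as np

SIGMA_MAX = 3.0e6     # maximum normal contact stress [Pa], assumed
DELTA_C = 0.15e-3     # contact gap at completion of debonding [m], assumed
DELTA_0 = 0.01e-3     # opening at which the peak traction is reached [m], assumed

def normal_traction(delta):
    """Normal traction vs. crack opening for a bilinear cohesive law."""
    delta = np.asarray(delta, dtype=float)
    rising = SIGMA_MAX * delta / DELTA_0                              # elastic branch
    softening = SIGMA_MAX * (DELTA_C - delta) / (DELTA_C - DELTA_0)   # linear softening branch
    t = np.where(delta <= DELTA_0, rising, softening)
    return np.clip(t, 0.0, None)                                      # zero traction after full debonding

# Critical fracture energy = area under the curve (= 0.5 * SIGMA_MAX * DELTA_C for this law).
openings = np.linspace(0.0, DELTA_C, 500)
g_c = np.trapz(normal_traction(openings), openings)
print(f"Critical fracture energy: {g_c:.1f} J/m^2")
```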
Procedia PDF Downloads 167
1814 A Comparative Study of Twin Delayed Deep Deterministic Policy Gradient and Soft Actor-Critic Algorithms for Robot Exploration and Navigation in Unseen Environments
Authors: Romisaa Ali
Abstract:
This paper presents a comparison between the Twin Delayed Deep Deterministic Policy Gradient (TD3) and Soft Actor-Critic (SAC) reinforcement learning algorithms in the context of training robust navigation policies for Jackal robots. By leveraging an open-source framework and custom motion control environments, the study evaluates the performance, robustness, and transferability of the trained policies across a range of scenarios. The primary focus of the experiments is to assess the training process, the adaptability of the algorithms, and the robot's ability to navigate in previously unseen environments. Moreover, the paper examines the influence of varying environmental complexities on the learning process and the generalization capabilities of the resulting policies. The results of this study aim to inform and guide the development of more efficient and practical reinforcement learning-based navigation policies for Jackal robots in real-world scenarios.
Keywords: Jackal robot environments, reinforcement learning, TD3, SAC, robust navigation, transferability, custom environment
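The abstract does not name the training library; as a minimal sketch of the kind of comparison described, both algorithms could be trained under identical settings with Stable-Baselines3. The custom environment ID "JackalNav-v0" and its "difficulty" keyword are placeholders, not part of the paper's framework.

```python
# Minimal sketch (not the paper's open-source framework): train TD3 and SAC on the same
# custom environment and compare mean episodic return on an unseen variant.
import gymnasium as gym
from stable_baselines3 import SAC, TD3
from stable_baselines3.common.evaluation import evaluate_policy

ENV_ID = "JackalNav-v0"   # assumed custom motion-control environment, registered elsewhere
TIMESTEPS = 200_000

results = {}
for name, algo in [("TD3", TD3), ("SAC", SAC)]:
    env = gym.make(ENV_ID)
    model = algo("MlpPolicy", env, verbose=0, seed=0)
    model.learn(total_timesteps=TIMESTEPS)
    # Evaluate transferability on a previously unseen variant of the environment.
    eval_env = gym.make(ENV_ID, difficulty="unseen")   # keyword is illustrative
    mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=20)
    results[name] = (mean_reward, std_reward)

for name, (mean_reward, std_reward) in results.items():
    print(f"{name}: {mean_reward:.1f} +/- {std_reward:.1f}")
```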
Procedia PDF Downloads 102
1813 End-to-End Pyramid Based Method for Magnetic Resonance Imaging Reconstruction
Authors: Omer Cahana, Ofer Levi, Maya Herman
Abstract:
Magnetic Resonance Imaging (MRI) is a lengthy medical scan, mainly because of its long acquisition time. This length is largely due to the traditional sampling theorem, which defines a lower bound for sampling. However, it is still possible to accelerate the scan by using a different approach such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have led to tremendous successes in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method achieves state-of-the-art results on the fastMRI benchmark, which is the largest and most diverse benchmark for MRI reconstruction.
Keywords: magnetic resonance imaging, image reconstruction, pyramid network, deep learning
Procedia PDF Downloads 91
1812 Implementation of Data Science in Field of Homologation
Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande
Abstract:
For the use and import of keys and ID transmitters as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. Across the field of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract the relevant data, such as the expiry date, approval date, etc. Obtaining accurate data from the certificates is critical, as inaccuracy may lead to missed re-homologation of certificates and thus to non-compliance. There is therefore scope for automating the reading of certificate data in the field of homologation. We use deep learning as a tool for this automation. We first trained a model by providing basic data for every country; this model was trained only once, by feeding PDF and JPG files through an ETL process, and the trained model yields increasingly accurate results over time. As an outcome, the expiry date and approval date of a certificate are obtained with a single click. This will eventually help to implement automation features on a broader level in the database where certificates are stored and will reduce human error to an almost negligible level.
Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)
Procedia PDF Downloads 163
1811 F-VarNet: Fast Variational Network for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic resonance imaging (MRI) is a long medical scan, owing to its long acquisition time. This length is mainly due to the traditional sampling theorem, which defines a lower bound for sampling. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. In order to achieve that, two properties have to hold: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in the deep learning (DL) field, which has demonstrated tremendous successes in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet by using dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplified the sensitivity map estimation (SME), since it contains many layers that are unnecessary for this task. These improvements yield significant decreases in computational cost as well as higher accuracy.
Keywords: MRI, deep learning, variational network, computer vision, compressed sensing
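The multi-scale dilated-convolution idea mentioned above can be sketched as follows. This is a minimal PyTorch illustration, not the actual F-VarNet code; channel counts and dilation rates are assumptions.

```python
# Minimal sketch (not the F-VarNet implementation): parallel dilated convolutions at
# several rates, concatenated to enlarge the receptive field.
import torch
import torch.nn as nn

class DilatedBlock(nn.Module):
    def __init__(self, in_ch=32, out_ch=32, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r)
            for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Each branch sees a different receptive field; concatenation mixes the scales.
        feats = torch.cat([self.act(branch(x)) for branch in self.branches], dim=1)
        return self.act(self.fuse(feats))

block = DilatedBlock()
x = torch.randn(1, 32, 128, 128)   # e.g., an intermediate feature map of an MRI slice
print(block(x).shape)              # torch.Size([1, 32, 128, 128])
```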
Procedia PDF Downloads 161
1810 Evaluation of the Sustainability of Greek Vernacular Architecture in Different Climate Zones: Architectural Typology and Building Physics
Authors: Christina Kalogirou
Abstract:
Investigating the integration of bioclimatic design into vernacular architecture could lead to interesting results regarding the preservation of cultural heritage while enhancing the energy efficiency of historic buildings. Furthermore, these recognized principles and systems of bioclimatic design in vernacular settlements could be applied to modern architecture and thus to new buildings in such areas. This study introduces an approach to categorizing distinct technologies and design principles of bioclimatic design based on a careful consideration of the various climatic zones and environments in Greece (mountainous areas, islands, and lowlands). For this purpose, various types of dwellings are evaluated for their response to climate, regarding the layout of the buildings (orientation, shape of the floor plans, semi-open spaces), the site planning, the openings (size, position, protection), the building envelope (walls: construction materials and thickness; roof construction detailing), and the migratory living pattern according to seasonal needs. As a result, various passive design principles are proposed that could be adapted to current architectural practice in such areas in order to optimize the relationship between site, building, climate, and energy efficiency.
Keywords: bioclimatic design, building physics, climatic zones, energy efficiency, vernacular architecture
Procedia PDF Downloads 387
1809 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models
Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi
Abstract:
This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and an IoT sensor network. Our approach involves addressing the critical environmental factors essential for preserving a plant's well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients such as nitrogen, phosphorus, and potassium. Central to our methodology is the use of computer vision technology, particularly a night vision camera. The captured data is compared against a reference database containing different health statuses. This comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By combining this AI-based decision-making approach with the IoT sensing data, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.
Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control
Procedia PDF Downloads 54
1808 Shoring System Selection for Deep Excavation
Authors: Faouzi Ahtchi-Ali, Marcus Vitiello
Abstract:
A study was conducted in the eastern region of the Middle East to assess the constructability of a shoring system for a 12-meter-deep excavation. Several shoring systems were considered in this study, including secant concrete piling, contiguous concrete piling, and sheet piling. The excavation was carried out in very dense sand with the groundwater level located 3 meters below the ground surface. The study included a pilot test for each shoring system listed above. The secant concrete piling consisted of overlapping concrete piles to a depth of 16 meters. A drilling method with full steel casing was used to install the concrete piles; the verticality of the piles was a concern for the overlap. The contiguous concrete piling required the installation of micro-piles to seal the gaps between the concrete piles. This method revealed that the gaps between the piles were not fully sealed, as observed from groundwater penetration into the excavation. The sheet-piling method required pre-drilling due to the high blow count of the penetrated layer of saturated sand. The study concluded that sheet piling with pre-drilling was the most cost-effective method and recommended it for the shoring system.
Keywords: excavation, shoring system, Middle East, drilling method
Procedia PDF Downloads 468
1807 Analysis of Biomarkers Intractable Epileptogenic Brain Networks with Independent Component Analysis and Deep Learning Algorithms: A Comprehensive Framework for Scalable Seizure Prediction with Unimodal Neuroimaging Data in Pediatric Patients
Authors: Bliss Singhal
Abstract:
Epilepsy is a prevalent neurological disorder affecting approximately 50 million individuals worldwide and 1.2 million Americans. There are millions of pediatric patients with intractable epilepsy, a condition in which seizures fail to come under control. The occurrence of seizures can result in physical injury, disorientation, unconsciousness, and additional symptoms that could impede children's ability to participate in everyday tasks. Predicting seizures can help parents and healthcare providers take precautions, prevent risky situations, and mentally prepare children to minimize the anxiety and nervousness associated with the uncertainty of a seizure. This research proposes a comprehensive framework to predict seizures in pediatric patients by evaluating machine learning algorithms on unimodal neuroimaging data consisting of electroencephalogram signals. Bandpass filtering and independent component analysis proved effective in reducing noise and artifacts in the dataset. The performance of various machine learning algorithms is evaluated using important metrics such as accuracy, precision, specificity, sensitivity, F1 score, and MCC. The results show that the deep learning algorithms are more successful in predicting seizures than logistic regression and k-nearest neighbors. The recurrent neural network (RNN) gave the highest precision and F1 score, long short-term memory (LSTM) outperformed the RNN in accuracy, and the convolutional neural network (CNN) achieved the highest specificity. This research has significant implications for healthcare providers in proactively managing seizure occurrence in pediatric patients, potentially transforming clinical practices and improving pediatric care.
Keywords: intractable epilepsy, seizure, deep learning, prediction, electroencephalogram channels
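The preprocessing steps described above (bandpass filtering followed by independent component analysis) can be sketched as follows; the sampling rate, band limits, and number of components are assumptions, not values reported in the abstract.

```python
# Minimal sketch (not the paper's pipeline): bandpass-filter multichannel EEG and
# remove artifact components with ICA.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

FS = 256.0             # sampling rate [Hz], assumed
LOW, HIGH = 0.5, 40.0  # band edges [Hz], assumed

def bandpass(eeg, fs=FS, low=LOW, high=HIGH, order=4):
    """eeg: array of shape (n_channels, n_samples)."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=1)

def ica_clean(eeg, n_components=8, reject=()):
    """Decompose into independent components, zero out rejected ones, reconstruct."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg.T)      # (n_samples, n_components)
    sources[:, list(reject)] = 0.0          # e.g., eye-blink components chosen by inspection
    return ica.inverse_transform(sources).T

rng = np.random.default_rng(0)
eeg = rng.standard_normal((23, 10 * int(FS)))   # 23 channels, 10 s of synthetic data
cleaned = ica_clean(bandpass(eeg), reject=(0,))
print(cleaned.shape)
```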
Procedia PDF Downloads 84
1806 Explainable Graph Attention Networks
Authors: David Pham, Yongfeng Zhang
Abstract:
Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs such as Graph Neural Networks (GNN) on various data mining and machine learning tasks. However, most of the deep learning models on graphs cannot easily explain their predictions and are thus often labelled as "black boxes." For example, Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and explainability of problem spaces and show that in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
Keywords: explainable AI, graph attention network, graph neural network, node classification
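As a minimal illustration of the attention mechanism discussed above (a single-head graph attention layer in PyTorch, not the XGAT code), the per-edge attention coefficients that an explainable model would expose are computed like this:

```python
# Minimal sketch of one GAT-style attention head; the coefficient alpha_ij indicates
# how much neighbor j contributes to node i and can serve as an explanation weight.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionHead(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        """x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops."""
        h = self.W(x)                                            # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs).squeeze(-1), 0.2)         # raw attention scores
        e = e.masked_fill(adj == 0, float("-inf"))               # keep real neighbors only
        alpha = torch.softmax(e, dim=-1)                         # attention coefficients
        return alpha @ h, alpha                                  # new features + explanation weights

x = torch.randn(5, 8)
adj = torch.eye(5) + (torch.rand(5, 5) > 0.5).float()
out, alpha = GraphAttentionHead(8, 4)(x, adj)
print(out.shape, alpha.shape)   # torch.Size([5, 4]) torch.Size([5, 5])
```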
Procedia PDF Downloads 198
1805 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). If datasets become imbalanced, which is the usual case for communications networks, the performance tends to become worse. The complexity of reducing the dimensionality of the feature sets to increase performance is also a major problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that the high-dimensional information in intermediate features, which is not utilized as much for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, MD offers uniform results for precision, recall, and f₁-score for the unbalanced and sparse NSL-KDD datasets.
Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
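A minimal sketch of a Mahalanobis-distance binary classifier of the kind investigated above (not the paper's implementation; the decision threshold and synthetic data are purely illustrative):

```python
# Minimal sketch (not the paper's code): flag a sample as "attack" when its Mahalanobis
# distance from the benign-traffic distribution exceeds a threshold.
import numpy as np

class MahalanobisDetector:
    def __init__(self, threshold):
        self.threshold = threshold

    def fit(self, X_benign):
        self.mean = X_benign.mean(axis=0)
        cov = np.cov(X_benign, rowvar=False)
        self.cov_inv = np.linalg.pinv(cov)        # pseudo-inverse tolerates sparse features
        return self

    def distance(self, X):
        d = X - self.mean
        return np.sqrt(np.einsum("ij,jk,ik->i", d, self.cov_inv, d))

    def predict(self, X):
        return (self.distance(X) > self.threshold).astype(int)   # 1 = attack

rng = np.random.default_rng(0)
benign = rng.normal(0.0, 1.0, size=(1000, 20))       # stand-in for normal traffic features
attacks = rng.normal(3.0, 1.0, size=(50, 20))        # stand-in for attack traffic features
det = MahalanobisDetector(threshold=6.0).fit(benign)
print("fraction of attacks detected:", det.predict(attacks).mean())
```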
Procedia PDF Downloads 78
1804 Blocking of Random Chat Apps at Home Routers for Juvenile Protection in South Korea
Authors: Min Jin Kwon, Seung Won Kim, Eui Yeon Kim, Haeyoung Lee
Abstract:
Numerous anonymous chat apps that help people connect with random strangers have been released in South Korea. However, they have become a serious problem for young people, since they are often used as channels for prostitution or sexual violence. Although ISPs in South Korea are responsible for making inappropriate content inaccessible on their networks, they do not block the traffic of random chat apps because 1) the use of random chat apps is entirely legal, and 2) they reportedly use HTTP proxy blocking, so non-HTTP traffic cannot be blocked. In this paper, we propose a service model that can block random chat apps at home routers. A service provider manages a blacklist that contains the blocked apps' information. Home routers that subscribe to the service filter out the traffic of these apps using deep packet inspection. We have implemented a prototype of the proposed model, including a centralized server providing the blacklist, a Raspberry Pi-based home router that can filter out the apps' traffic, and an Android app used by the router's administrator to locally customize the blacklist.
Keywords: deep packet inspection, internet filtering, juvenile protection, technical blocking
Procedia PDF Downloads 349
1803 Deep Learning for Image Correction in Sparse-View Computed Tomography
Authors: Shubham Gogri, Lucia Florescu
Abstract:
Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose exposure to patients and minimizing scanning time. A solution to this is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem: the incomplete projection data results in lower quality of the reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of the sparse-view CT images. A first approach involved employing Mir-Net, a dedicated deep neural network designed for image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks, along with the incorporation of the Charbonnier loss. However, this approach was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based generator and a discriminator based on convolutional neural networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of the perceptual loss, calculated based on feature vectors extracted from the VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data was conducted, exploring various GAN loss functions, including Wasserstein, Charbonnier, and perceptual loss. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as the Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM) between the corrected images and the ground truth. Furthermore, learning curves and qualitative comparisons added evidence of the enhanced image quality and the network's increased stability, while preserving pixel value intensity. The experiments underscored the potential of deep learning frameworks in enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net
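The Charbonnier loss mentioned above, often combined with adversarial and perceptual terms in this kind of GAN training, can be sketched as follows. This is a minimal PyTorch illustration with an assumed epsilon and assumed loss weights, not the study's code.

```python
# Minimal sketch (not the study's implementation) of the Charbonnier loss, a smooth
# L1-like penalty used to capture fine detail.
import torch

def charbonnier_loss(pred, target, eps=1e-3):
    """sqrt((pred - target)^2 + eps^2), averaged over all pixels."""
    return torch.sqrt((pred - target) ** 2 + eps ** 2).mean()

# Example: penalty between a corrected sparse-view slice and the full-view ground truth.
pred = torch.rand(1, 1, 256, 256)
target = torch.rand(1, 1, 256, 256)
print(charbonnier_loss(pred, target).item())

# In a Pix2Pix-style setup, the generator objective would combine this with adversarial
# (e.g., Wasserstein) and perceptual terms, roughly:
#   loss_G = w_adv * adv_loss + w_char * charbonnier_loss(fake, real) + w_perc * perceptual_loss
# where the weights w_* are hyperparameters (assumed, not reported in the abstract).
```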
Procedia PDF Downloads 161
1802 Metal Extraction into Ionic Liquids and Hydrophobic Deep Eutectic Mixtures
Authors: E. E. Tereshatov, M. Yu. Boltoeva, V. Mazan, M. F. Volia, C. M. Folden III
Abstract:
Room temperature ionic liquids (RTILs) are a class of liquid organic salts with melting points below 20 °C that are considered to be environmentally friendly 'designer' solvents. Pure hydrophobic ILs are known to extract metallic species from aqueous solutions. The closest analogues of ionic liquids are deep eutectic solvents (DESs), which are eutectic mixtures of at least two compounds with a melting point lower than that of each individual component. DESs are acknowledged to be attractive for organic synthesis and metal processing. Thus, these non-volatile and less toxic compounds are of interest for critical metal extraction. The US Department of Energy and the European Commission consider indium a key metal. Its chemical homologue, thallium, is also an important material for some applications and for environmental safety. The aim of this work is to systematically investigate In and Tl extraction from aqueous solutions into pure fluorinated ILs and hydrophobic DESs. The dependence of the Tl extraction efficiency on the structure and composition of the ionic liquid ions, the metal oxidation state, and the initial metal and aqueous acid concentrations has been studied. The extraction efficiency of the TlX_z^(3-z) anionic species (where X = Cl⁻ and/or Br⁻) is greater for ionic liquids with more hydrophobic cations. Unexpectedly high distribution ratios (> 10³) of Tl(III) were determined even when a pure ionic liquid was applied as the receiving phase. An improved mathematical model based on ion exchange and ion pair formation mechanisms has been developed to describe the co-extraction of two different anionic species, and the relative contributions of each mechanism have been determined. The first evidence of indium extraction into new quaternary ammonium- and menthol-based hydrophobic DESs from hydrochloric and oxalic acid solutions, with distribution ratios up to 10³, will be provided. The data obtained allow us to interpret the mechanism of thallium and indium extraction into IL and DES media. Understanding the chemical behavior of Tl and In in these new media is imperative for the further improvement of the separation and purification of these elements.
Keywords: deep eutectic solvents, indium, ionic liquids, thallium
Procedia PDF Downloads 241
1801 A Study on the Application of Machine Learning and Deep Learning Techniques for Skin Cancer Detection
Authors: Hritwik Ghosh, Irfan Sadiq Rahat, Sachi Nandan Mohanty, J. V. R. Ravindra
Abstract:
In the rapidly evolving landscape of medical diagnostics, the early detection and accurate classification of skin cancer remain paramount for effective treatment outcomes. This research delves into the transformative potential of Artificial Intelligence (AI), specifically Deep Learning (DL), as a tool for discerning and categorizing various skin conditions. Utilizing a diverse dataset of 3,000 images representing nine distinct skin conditions, we confront the inherent challenge of class imbalance. This imbalance, where conditions like melanomas are over-represented, is addressed by incorporating class weights during the model training phase, ensuring an equitable representation of all conditions in the learning process. Our pioneering approach introduces a hybrid model, amalgamating the strengths of two renowned Convolutional Neural Networks (CNNs), VGG16 and ResNet50. These networks, pre-trained on the ImageNet dataset, are adept at extracting intricate features from images. By synergizing these models, our research aims to capture a holistic set of features, thereby bolstering classification performance. Preliminary findings underscore the hybrid model's superiority over individual models, showcasing its prowess in feature extraction and classification. Moreover, the research emphasizes the significance of rigorous data pre-processing, including image resizing, color normalization, and segmentation, in ensuring data quality and model reliability. In essence, this study illuminates the promising role of AI and DL in revolutionizing skin cancer diagnostics, offering insights into its potential applications in broader medical domains.
Keywords: artificial intelligence, machine learning, deep learning, skin cancer, dermatology, convolutional neural networks, image classification, computer vision, healthcare technology, cancer detection, medical imaging
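A minimal sketch of the kind of hybrid feature extractor and class weighting described above (illustrative Keras code, not the authors' implementation; the input size, number of classes, head architecture, and placeholder labels are assumptions):

```python
# Minimal sketch (not the paper's code): concatenate ImageNet-pretrained VGG16 and
# ResNet50 features and train a small classification head with class weights.
import numpy as np
import tensorflow as tf
from sklearn.utils.class_weight import compute_class_weight

NUM_CLASSES = 9
inputs = tf.keras.Input(shape=(224, 224, 3))

vgg = tf.keras.applications.VGG16(include_top=False, weights="imagenet", input_tensor=inputs)
resnet = tf.keras.applications.ResNet50(include_top=False, weights="imagenet", input_tensor=inputs)
vgg.trainable = False
resnet.trainable = False

features = tf.keras.layers.Concatenate()([
    tf.keras.layers.GlobalAveragePooling2D()(vgg.output),
    tf.keras.layers.GlobalAveragePooling2D()(resnet.output),
])
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(
    tf.keras.layers.Dropout(0.3)(features))
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Class weights counteract over-represented conditions such as melanoma.
y_train = np.random.randint(0, NUM_CLASSES, size=3000)          # placeholder labels
weights = compute_class_weight("balanced", classes=np.arange(NUM_CLASSES), y=y_train)
class_weight = dict(enumerate(weights))
# model.fit(x_train, y_train, epochs=10, class_weight=class_weight)  # x_train: preprocessed images
```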
Procedia PDF Downloads 86
1800 The Layout Analysis of Handwriting Characters and the Fusion of Multi-style Ancient Books' Background
Authors: Yaolin Tian, Shanxiong Chen, Fujia Zhao, Xiaoyu Lin, Hailing Xiong
Abstract:
Ancient books are significant carriers of culture, and their background textures convey potential historical information. However, multi-style texture recovery of ancient books has received little attention. Restricted by insufficient ancient textures and a complex handling process, the generation of ancient textures confronts new challenges. For instance, training without sufficient data usually brings about overfitting or mode collapse, so some of the outputs are prone to be fake. Recently, image generation and style transfer based on deep learning have been widely applied in computer vision. Breakthroughs within the field make it possible to conduct research on multi-style texture recovery of ancient books. Under these circumstances, we propose a layout analysis and image fusion system. Firstly, we trained models by using Deep Convolutional Generative Adversarial Networks (DCGAN) to synthesize multi-style ancient textures; then, we analyzed layouts based on the Position Rearrangement (PR) algorithm that we propose to adjust the layout structure of the foreground content; at last, we realized our goal by fusing the rearranged foreground texts and the generated background. In experiments, diversified samples such as ancient Yi, Jurchen, and Seal scripts were selected as our training sets. Then, the performance of different fine-tuned models was gradually improved by adjusting the DCGAN model's parameters as well as its structure. In order to evaluate the results scientifically, the cross-entropy loss function and the Fréchet Inception Distance (FID) were selected as our assessment criteria. Eventually, we obtained model M8 with the lowest FID score. Compared with the DCGAN model proposed by Radford et al., the FID score of M8 improved by 19.26%, profoundly enhancing the quality of the synthetic images.
Keywords: deep learning, image fusion, image generation, layout analysis
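The Fréchet Inception Distance used as an assessment criterion above can be sketched as follows; this is a minimal illustration (not the paper's evaluation code), and the Inception feature vectors are assumed to be precomputed elsewhere.

```python
# Minimal sketch: FID between two sets of Inception feature vectors (real vs. generated textures).
import numpy as np
from scipy import linalg

def fid(feats_real, feats_fake):
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov1 = np.cov(feats_real, rowvar=False)
    cov2 = np.cov(feats_fake, rowvar=False)
    covmean = linalg.sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real        # discard tiny imaginary parts from numerical error
    return float(np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2.0 * covmean))

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(500, 64))   # stand-ins for Inception activations of real textures
fake = rng.normal(0.2, 1.1, size=(500, 64))   # stand-ins for activations of generated textures
print(f"FID: {fid(real, fake):.2f}")
```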
Procedia PDF Downloads 157
1799 Colored Image Classification Using Quantum Convolutional Neural Networks Approach
Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins
Abstract:
Recently, quantum machine learning has received significant attention. For various types of data, including text and images, numerous quantum machine learning (QML) models have been created and are being tested. Images are exceedingly complex data components that demand more processing power. Despite being mature, classical machine learning still has difficulties with big data applications. Furthermore, quantum technology has revolutionized how machine learning is thought of, by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking the production of inaccurate results. To discover the advantages of quantum versus classical approaches, this research has concentrated on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still at a very early stage. Black-and-white benchmark image datasets like MNIST and Fashion-MNIST have been used in recent research. MNIST and CIFAR-10 were compared for binary classification, but the comparison showed that MNIST performed more accurately than colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance QML's real-time applicability. However, quantum deep learning classification models, such as the Quantum Convolutional Neural Network (QCNN), have not yet been developed and compared on colored images to determine how much better they are than classical ones; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were converted to greyscale, and 28 × 28-pixel images comprising 10,000 test and 50,000 training images were used. The objective of this work is to determine how much the quantum approach can outperform a classical approach on a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model adopted a hybrid method and encoded the images into a quantum simulator for feature extraction using quantum gate rotations. The measurements were carried out on the classical computer after the rotations were applied. According to the results, the QCNN approach is ~12% more effective than traditional classical CNN approaches, and it is possible that applying data augmentation may increase the accuracy. This study has demonstrated that quantum machine and deep learning models can be relatively superior to classical machine learning approaches in terms of processing speed and accuracy when used to perform classification on colored image classes.
Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning
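A minimal sketch of the hybrid encode-rotate-measure step described above, using the PennyLane simulator; the wire count, embedding, and circuit template are assumptions, not the study's actual circuit.

```python
# Minimal sketch (not the study's circuit): encode a small block of pixel values into qubit
# rotations, apply trainable rotations, and measure expectation values on a classical simulator.
import numpy as np
import pennylane as qml

N_WIRES = 4
dev = qml.device("default.qubit", wires=N_WIRES)

@qml.qnode(dev)
def quantum_feature_map(pixels, weights):
    # Angle-encode 4 normalized pixel values, one per qubit.
    qml.AngleEmbedding(pixels, wires=range(N_WIRES))
    # Trainable entangling rotations acting as the "quantum convolution".
    qml.StronglyEntanglingLayers(weights, wires=range(N_WIRES))
    # Classical measurement: one expectation value per qubit, fed to a classical head.
    return [qml.expval(qml.PauliZ(w)) for w in range(N_WIRES)]

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=N_WIRES)
weights = np.random.default_rng(0).uniform(0, np.pi, size=shape)
pixels = np.array([0.1, 0.5, 0.8, 0.3]) * np.pi    # a 2x2 greyscale patch, rescaled to angles
print(quantum_feature_map(pixels, weights))
```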
Procedia PDF Downloads 129
1798 Depth of Penetration and Nature of Interferential Current in Cutaneous, Subcutaneous and Muscle Tissues
Authors: A. Beatti, L. Chipchase, A. Rayner, T. Souvlis
Abstract:
The aims of this study were to investigate the depth of interferential current (IFC) penetration through soft tissue and to investigate the area over which IFC spreads during clinical application. Premodulated IFC and 'true' IFC at beat frequencies of 4, 40 and 90 Hz were applied via four electrodes to the distal medial thigh of 15 healthy subjects. The current was measured via three Teflon-coated fine needle electrodes that were inserted into the superficial layer of skin, then into the subcutaneous tissue (≈1 cm deep), and then into muscle tissue (≈2 cm deep). The needle electrodes were placed in the middle of the four IFC electrodes, between two channels, and outside the four electrodes. Readings were taken at each tissue depth from each electrode during each treatment frequency, then digitized and stored for analysis. All voltages were greater than baseline at all depths and locations (p < 0.01), and voltages decreased with depth (p = 0.039). Lower voltages of all currents were recorded in the middle of the four electrodes, with the highest voltage recorded outside the four electrodes at all depths (p = 0.000). For each frequency of 'true' IFC, the voltage was higher in the superficial layer outside the electrodes (p ≤ 0.01). Premodulated IFC had higher voltages along the line of one circuit (p ≤ 0.01). Clinically, IFC appears to pass through skin layers to depth and is more efficient than premodulated IFC when targeting muscle tissue.
Keywords: electrotherapy, interferential current, interferential therapy, medium frequency current
Procedia PDF Downloads 346
1797 An Ecological Grandeur: Environmental Ethics in Buddhist Perspective
Authors: Merina Islam
Abstract:
There are many environmental problems, and various countermeasures have been taken to address them. Philosophy is an important contributor to environmental studies, as it takes a deep interest in analyzing the meaning of the concept of environment and other related concepts. The Buddhist frame, which is virtue-ethical, remains a better alternative to the traditional environmental outlook. While granting the unique role of humans in moral deliberation, the Buddhist approach maintains a holistic concept of ecological harmony. Buddhist environmental ethics is more concerned with the complete moral community, the total ecosystem, than with any particular species within the community. The moral reorientation proposed here resembles the concept of 'deep ecology'. Given the present-day prominence of virtue ethics, we need to explore Buddhist virtue theory further, so that a better framework for treating the natural world can be ensured. The environment has turned out to be one of the most widely discussed issues in recent times. Buddhist concepts such as Pratityasamutpadavada, Samvrit Satya, Paramartha Satya, Shunyata, Sanghatvada, Bodhisattva, Santanvada, and others deal with interdependence in terms of both internal and external ecology. Internal ecology aims at mental well-being, whereas external ecology deals with physical well-being. The fundamental Buddhist concepts for dealing with environmental problems are that the environment has the same value as humans, following from the two Buddhist doctrines of the Non-duality of Life and its Environment and Origination in Dependence, and that environmental problems can inevitably be overcome through the practice of the way of the Bodhisattva, because environmental problems are an evil for both people and nature. Buddhism establishes that there is a relationship among all the constituents of the world. There is nothing in the world that is independent of any other thing; everything is dependent on others. The realization that everything in the universe is mutually interdependent also shows that humanity cannot keep itself unaffected by ecology. This paper focuses on how the Buddhist identification of nature with the Dhamma can contribute toward transforming our understanding, attitudes, and actions regarding the care of the earth. Environmental Ethics in Buddhism presents a logical and thorough examination of the metaphysical and ethical dimensions of early Buddhist literature. From the Buddhist viewpoint, humans are not in a category that is distinct and separate from other sentient beings, nor are they intrinsically superior. All sentient beings are considered to have the Buddha-nature, that is, the potential to become fully enlightened. Buddhists do not believe in treating non-human sentient beings as objects for human consumption. The significance of the Buddhist theory of interdependence can be understood from the fact that it shows that one's happiness or suffering originates from one's realization or non-realization, respectively, of the dependent nature of everything. It is obvious, even without emphasis, that in the context of today's deep ecological crisis there is a need to instill a consciousness of interdependence.
Keywords: Buddhism, deep ecology, environmental problems, Pratityasamutpadavada
Procedia PDF Downloads 315
1796 Evaluation of Greenhouse Covering Materials
Authors: Mouustafa A. Fadel, Ahmed Bani Hammad, Faisal Al Hosany, Osama Iwaimer
Abstract:
The covering material is the component of a greenhouse that most strongly governs two major parameters: the amount of light and the amount of heat diffused from the surrounding environment into the internal space. In hot areas, balancing conditions inside and outside the greenhouse consumes most of the energy spent in production systems. In this research, a special testing apparatus was fabricated to simulate the structure of a greenhouse, provided with a 400 W full-spectrum light. Tests were carried out to investigate the effectiveness of different commercial covering materials in light and heat diffusion. Twenty-one combinations of fiberglass, polyethylene, polycarbonate, Plexiglass, and Agril (PP nonwoven fabric) were tested. It was concluded that Plexiglass had the highest light transparency, at 87.4%, while the lowest was 33%, and polycarbonate sheets reached 86.8%. The enthalpy of the air moving through the testing rig was calculated according to the air temperature difference between the inlet and outlet openings. The highest enthalpy value, 0.81 kJ/kg air, was obtained for one layer of fiberglass, while both Plexiglass and blocked fiberglass gave a value of 0.5 kJ/kg air. It is concluded that, although Plexiglass has a high level of transparency, which is indeed very helpful under low levels of solar flux, it is not recommended under hot arid conditions where solar flux is available most of the year. On the other hand, it might be a disadvantage to use Plexiglass, especially in summer, where it helps to accumulate more heat inside the greenhouse.
Keywords: greenhouse, covering materials, arid lands, environmental control
Procedia PDF Downloads 477
1795 Winkler Springs for Embedded Beams Subjected to S-Waves
Authors: Franco Primo Soffietti, Diego Fernando Turello, Federico Pinto
Abstract:
Shear waves that propagate through the ground impose deformations that must be taken into account in the design and assessment of buried longitudinal structures such as tunnels, pipelines, and piles. Conventional engineering approaches for seismic evaluation often rely on Euler-Bernoulli beam models supported by a Winkler foundation. This approach, however, falls short in capturing the distortions induced when the structure is subjected to shear waves. To overcome these limitations, the present work proposes an analytical solution that considers a Timoshenko beam and includes transverse and rotational springs. The present research proposes ground springs derived as closed-form analytical solutions of the equations of elasticity, including the seismic wavelength. These proposed springs extend the applicability of previous plane-strain models. By considering variations in displacements along the longitudinal direction, the presented approach ensures the springs do not approach zero at low frequencies. This characteristic makes them suitable for assessing pseudo-static cases, which typically govern structural forces in kinematic interaction analyses. The results obtained, validated against the existing literature and a 3D finite element model, reveal several key insights: i) the cutoff frequency significantly influences the transverse and rotational springs; ii) neglecting displacement variations along the structure axis (i.e., assuming plane-strain deformation) results in unrealistically low transverse springs, particularly for wavelengths shorter than the structure length; iii) disregarding the lateral displacement components in the rotational springs and neglecting variations along the structure axis leads to inaccurately low spring values, misrepresenting interaction phenomena; iv) transverse springs exhibit a notable drop at the resonance frequency, followed by increasing damping as the frequency rises; v) rotational springs show minor frequency-dependent variations, with radiation damping occurring beyond the resonance frequencies, starting from negative values. This comprehensive analysis sheds light on the complex behavior of embedded longitudinal structures subjected to shear waves and provides valuable insights for seismic assessment.
Keywords: shear waves, Timoshenko beams, Winkler springs, soil-structure interaction
Procedia PDF Downloads 61
1794 First Occurrence of Histopathological Assessment in Gadoid Deep-Fish Phycis blennoides from the Southwestern Mediterranean Sea
Authors: Zakia Alioua, Amira Soumia, Zerouali-Khodja Fatiha
Abstract:
Despite a wide variety of contaminants, such as heavy metals and organic compounds, and the extent of pollution, the deep sea and its species are not a haven and are affected by contaminant exposure. This investigation was performed in order to provide data on the presence of pathological changes in the liver and gonads of the greater forkbeard. A total of 998 specimens of the teleost fish Phycis blennoides Brünnich, 1768, ranging from 5.7 to 62.7 cm in total length, were obtained from the commercial fisheries of Algerian ports. Sampling was carried out monthly from December 2013 to June 2015 and from January to June 2016, with fish caught by trawlers and longlines between 75 and 600 fathoms off the coast of Algeria. Individuals were sexed, and their gonads and livers were removed and processed for light microscopy; one case of atresia was identified. Overall, 0.002% of the specimens presented some degree of liver steatosis. For the gastric section, 442 selected stomach contents were examined for parasitic infestation, and 212 nematodes were enumerated. A prospecting survey for metal contaminants was performed on the liver by atomic absorption spectrophotometry analysis.
Keywords: atresia, coast of Algeria, histopathology, nematode, Phycis blennoides, steatosis
Procedia PDF Downloads 231
1793 Quantitative Analysis of Carcinoembryonic Antigen (CEA) Using Micromechanical Piezoresistive Cantilever
Authors: Meisam Omidi, M. Mirijalili, Mohammadmehdi Choolaei, Z. Sharifi, F. Haghiralsadat, F. Yazdian
Abstract:
In this work, we have used arrays of micromechanical piezoresistive cantilevers with different geometries to detect carcinoembryonic antigen (CEA), an important biomarker associated with various cancers such as colorectal, lung, breast, pancreatic, and bladder cancer. The sensing principle is based on the surface stress changes induced by antigen-antibody interaction on the microcantilever surfaces. Different concentrations of CEA in a human serum albumin (HSA) solution were detected as a function of the deflection of the beams. According to the experiments, the microcantilevers have surface stress sensitivities on the order of 8 mJ/m. This allows them to detect CEA concentrations as low as 3 ng/mL, or 18 pM, which indicates that the self-sensing microcantilever approach is beneficial for pathological tests.
Keywords: micromechanical biosensors, carcinoembryonic antigen (CEA), surface stress
Procedia PDF Downloads 472
1792 A Metallography Study of Secondary A226 Aluminium Alloy Used in Automotive Industries
Authors: Lenka Hurtalová, Eva Tillová, Mária Chalupová, Juraj Belan, Milan Uhríčik
Abstract:
The secondary alloy A226 is used for many automotive castings produced by mould casting and high-pressure die-casting. This alloy has excellent castability, good mechanical properties, and cost-effectiveness. The production of primary aluminium alloys is a heavy source of environmental pollution. The European Union calls for emission reductions and reduced energy consumption, and therefore for increased production of recycled (secondary) aluminium cast alloys. This contribution deals with the influence of recycling on the quality of castings made from A226 in the automotive industry. The properties of castings made from secondary aluminium alloys were compared with the required properties of primary aluminium alloys. The effect of recycling on the microstructure was observed using a combination of different analytical techniques (light microscopy after black-white etching, scanning electron microscopy (SEM) after deep etching, and energy-dispersive X-ray analysis (EDX)). These techniques were used to identify the various structural parameters by which the secondary alloy microstructure was compared with the primary alloy microstructure.
Keywords: A226 secondary aluminium alloy, deep etching, mechanical properties, recycling foundry aluminium alloy
Procedia PDF Downloads 541
1791 Image Processing-Based Maize Disease Detection Using Mobile Application
Authors: Nathenal Thomas
Abstract:
In the food chain and in many other agricultural products, corn, also known as maize (scientific name Zea mays subsp.), is a widely produced agricultural product. Corn is highly adaptable: it comes in many different types, is employed in many different industrial processes, and adjusts well to different agro-climatic situations. In Ethiopia, maize is among the most widely grown crops, and small-scale corn farming may be a household's only source of food. These facts demonstrate that the country's requirement for this crop is extremely high while, conversely, the crop's productivity is very low for a variety of reasons. The most damaging factor that greatly contributes to this imbalance between the crop's supply and demand is corn disease. The failure to diagnose diseases in maize plants until it is too late is one of the most important factors limiting crop output in Ethiopia. This study will aid in the early detection of such diseases and support farmers during the cultivation process, directly affecting the amount of maize produced. Diseases in maize plants, such as northern leaf blight and cercospora leaf spot, have distinct, visible symptoms. This study aims to detect the most frequent and damaging maize diseases using image processing, one of the most effectively applied areas of deep learning, itself a subset of machine learning. Deep learning uses networks that can be trained from unlabeled data without supervision (unsupervised), a capability that mimics how the human brain processes data. Its applications include speech recognition, language translation, object classification, and decision-making. The Convolutional Neural Network (CNN), also known as a ConvNet, is a deep learning class that is widely used for image classification, image detection, face recognition, and other problems. This study uses this algorithm as the state of the art to detect maize diseases from photographs of maize leaves taken with a mobile phone.
Keywords: CNN, Zea mays subsp., leaf blight, cercospora leaf spot
Procedia PDF Downloads 74
1790 Improving Axial-Attention Network via Cross-Channel Weight Sharing
Authors: Nazmul Shahadat, Anthony S. Maida
Abstract:
In recent years, hypercomplex-inspired neural networks have improved deep CNN architectures due to their ability to share weights across input channels and thus improve the cohesiveness of representations within the layers. The work described herein studies the effect on image classification of replacing existing layers in an Axial Attention ResNet with quaternion variants that use cross-channel weight sharing. We expect the quaternion enhancements to produce improved feature maps with more interlinked representations. We experiment with the stem of the network, the bottleneck layer, and the fully connected backend by replacing them with quaternion versions. These modifications lead to novel architectures which yield improved accuracy performance on the ImageNet300k classification dataset. Our baseline networks for comparison were the original real-valued ResNet, the original quaternion-valued ResNet, and the Axial Attention ResNet. Since improvement was observed regardless of which part of the network was modified, there is promise that this technique may be generally useful in improving classification accuracy for a large class of networks.
Keywords: axial attention, representational networks, weight sharing, cross-channel correlations, quaternion-enhanced axial attention, deep networks
Procedia PDF Downloads 83
1789 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation
Authors: Muhammad Zubair Khan, Yugyung Lee
Abstract:
Deep learning has recently achieved enormous success in semantic image segmentation. Previously developed U-Net-inspired architectures operate with repeated striding and pooling operations, leading to spatial information loss. These methods also lack a mechanism for establishing long-range pixel connections to preserve context knowledge and reduce spatial loss in prediction. This article develops an encoder-decoder architecture with a bi-directional LSTM embedded in long skip connections and densely connected convolution blocks. The network non-linearly combines the feature maps across encoder-decoder paths to find dependencies and correlations between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applies batch normalization to reduce internal covariate shift in the data distributions. The empirical evidence shows a promising response for our method compared with other semantic segmentation techniques.
Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network
Procedia PDF Downloads 102
1788 Bacterial Community Diversity in Soil under Two Tillage Systems
Authors: Dalia Ambrazaitienė, Monika Vilkienė, Danute Karcauskienė, Gintaras Siaudinis
Abstract:
The soil is a complex ecosystem that is part of our biosphere, and its ability to provide ecosystem services depends on microbial diversity. Tillage is one of the major factors that affect soil properties. No-till systems and shallow ploughless tillage are the opposite of traditional deep ploughing: no-tillage systems, for instance, increase soil organic matter by reducing mineralization rates and stimulating litter concentrations in the top soil layer, whereas deep ploughing increases the biological activity of the arable soil layer and reduces the incidence of weeds. The role of soil organisms is central to soil processes. Although the number of microbial species in soil is still being debated, the metagenomic approach to estimating microbial diversity predicts about 2,000 to 18,000 bacterial genomes in 1 g of soil. Despite the key role of bacteria in soil processes, there is still a lack of information about the bacterial diversity of soils as affected by tillage practices. This study focused on metagenomic analysis of bacterial diversity in long-term experimental plots of Dystric Epihypogleyic Albeluvisols in the western part of Lithuania. The experiment was set up in 2013 and had a split-plot design, where the whole-plot treatments were laid out in a randomized design with three replicates. The whole-plot treatments consisted of two tillage methods: deep ploughing (22-25 cm) (DP) and ploughless tillage (7-10 cm) (PT). Three subsamples (0-20 cm) were collected on October 22, 2015 for each of the three replicates. Subsamples from the DP and PT systems were pooled treatment-wise to make two composite samples, one representing deep ploughing (DP) and the other ploughless tillage (PT). Genomic DNA was extracted from approximately 200 mg of field-moist soil per sample using the D6005 Fungal/Bacterial Miniprep set (Zymo Research®) following the manufacturer's instructions. To determine bacterial diversity and community composition, we employed a culture-independent approach of high-throughput sequencing of the 16S rRNA gene. Metagenomic sequencing was performed on the Illumina MiSeq platform at the Base Clear company. The microbial component of soil plays a crucial role in the cycling of nutrients in the biosphere. Our study was a preliminary attempt at observing bacterial diversity in soil under two common but contrasting tillage practices. The number of sequenced reads obtained for PT (161,917) was higher than for DP (131,194). The 10 most abundant taxa in the soil samples were the same (Arthrobacter, Candidatus Saccharibacteria, Actinobacteria, Acidobacterium, Mycobacterium, Bacillus, Alphaproteobacteria, Longilinea, Gemmatimonas, Solirubrobacter); only their shares of the community differed. In DP, Arthrobacter and Acidobacterium accounted for 8.4% and 2.5% of the community, respectively, compared with 5.8% and 2.1% in PT. Nocardioides and Terrabacter were observed only in PT. This work was supported by the project VP1-3.1-ŠMM-01-V-03-001 NKPDOKT and the National Science Program: The effect of long-term, different-intensity management of resources on the soils of different genesis and on other components of the agro-ecosystems [grant number SIT-9/2015] funded by the Research Council of Lithuania.
Keywords: deep ploughing, metagenomics, ploughless tillage, soil community analysis
Procedia PDF Downloads 246