Search results for: accurate tagging algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5810

1550 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest to many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDE) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter step is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its most computationally complex part is the Fast Fourier Transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the Quantum Fourier Transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
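
For reference, the textbook QFT circuit that would replace the classical FFT step can be sketched in a few lines; this is a generic construction (assuming Qiskit is available), not the authors' homogenization implementation:

```python
import numpy as np
from qiskit import QuantumCircuit

def qft_circuit(n: int) -> QuantumCircuit:
    """Standard quantum Fourier transform on n qubits."""
    qc = QuantumCircuit(n)
    for j in range(n):
        qc.h(j)                                   # Hadamard on qubit j
        for k in range(j + 1, n):
            qc.cp(np.pi / 2 ** (k - j), k, j)     # controlled phase rotation
    for i in range(n // 2):                       # reverse qubit order at the end
        qc.swap(i, n - 1 - i)
    return qc
```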

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transform (QFT), NISQ devices

Procedia PDF Downloads 75
1549 Detecting Manipulated Media Using Deep Capsule Network

Authors: Joseph Uzuazomaro Oju

Abstract:

The ease with which manipulated media can be created, combined with the increasing difficulty of identifying fake media, makes it a serious threat. Most of the applications used to create these high-quality fake videos and images are built with deep learning; hence, the value of deep learning in building a detection mechanism cannot be overemphasized. Any fake media detected before it reaches the populace saves people from doubting whether a piece of content is genuine or fake and helps ensure the credibility of videos and images. The methodology introduced in this paper approaches the manipulated media detection challenge using a combination of VGG-19 and a deep capsule network. Videos are converted into frames, which are then resized and cropped to the face region. These preprocessed images are fed to the VGG-19 network to extract latent features. The extracted latent features are fed into a deep capsule network enhanced with a 3D-convolution dynamic routing agreement. The 3D-convolution dynamic routing agreement algorithm helps to reduce the linkages between capsule networks, thereby limiting the poor learning behaviour of stacked capsule network layers. The resulting output from the deep capsule network indicates whether a media item is genuine or fake.
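
As an illustration of the feature-extraction stage, a minimal sketch of using a truncated VGG-19 as a frozen latent-feature extractor is shown below (PyTorch/torchvision assumed); face detection, pretrained weights and the capsule network itself are omitted:

```python
import torch
import torchvision.models as models
import torchvision.transforms as T

# Convolutional part of VGG-19 used as a frozen feature extractor.
vgg_features = models.vgg19().features.eval()   # in practice, load pretrained weights
preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

@torch.no_grad()
def extract_latent(face_crop_pil):
    """Return the VGG-19 feature map for one preprocessed face crop (PIL image)."""
    x = preprocess(face_crop_pil).unsqueeze(0)   # add batch dimension
    return vgg_features(x)                       # features to feed the capsule network
```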

Keywords: deep capsule network, dynamic routing, fake media detection, manipulated media

Procedia PDF Downloads 131
1548 A Novel Study Contrasting Traditional Autopsy with Post-Mortem Computed Tomography in Falls Leading to Death

Authors: Balaji Devanathan, Gokul G., Abilash S., Abhishek Yadav, Sudhir K. Gupta

Abstract:

Background: As an alternative to the traditional autopsy, a virtual autopsy is carried out using scanning and imaging technologies, mainly post-mortem computed tomography (PMCT). This facility aims to supplement traditional autopsy results and to reduce or eliminate internal dissection in subsequent autopsies. For emotional and religious reasons, the deceased's relatives have historically disapproved of such internal dissection; families often prefer the non-invasive, objective, and preservative PMCT to a traditional autopsy. The study also compares the two technologies and the benefits and drawbacks of each, demonstrating the significance of contemporary imaging in forensic medicine. Results: One hundred fatal falls were analysed by the authors. Before autopsy, each case underwent a PMCT examination using a 16-slice multi-slice spiral CT scanner. MPR and VR reconstructions were carried out with specialised software after the raw images were captured. Fractures of the skull, facial bones, clavicle, scapula, and vertebrae were detected more accurately than at routine autopsy. PMCT also improves the interpretation of pneumothorax, pneumoperitoneum, pneumocephalus, and haemosinus compared with traditional autopsy. Conclusion: A PMCT-based virtual autopsy is useful for visualising skeletal damage in fall-from-height cases, making it an ideal adjunct in trauma cases. When assessing trauma victims, PMCT should be viewed as an additional helpful tool alongside traditional autopsy, because it can identify additional bone fractures in body parts that are challenging to examine during autopsy, such as posterior regions, which helps the pathologist reconstruct the events and determine the cause of death.

Keywords: PMCT, fall from height, autopsy, fracture

Procedia PDF Downloads 37
1547 An Efficient FPGA Realization of FIR Filter Using Distributed Arithmetic

Authors: M. Iruleswari, A. Jeyapaul Murugan

Abstract:

The Finite Impulse Response (FIR) filter is one of the most fundamental building blocks in Digital Signal Processing (DSP) applications because of its linear phase, stability and regular structure. Designing a high-speed and hardware-efficient FIR filter is challenging because complexity increases with the filter order. Most applications require higher-order filters, but the memory usage of the filter grows exponentially with its order. Multipliers occupy a large chip area and require long computation times. Multiplier-less, memory-based techniques have gained popularity over the past two decades due to their high-throughput processing capability and reduced dynamic power consumption. This paper describes the design and implementation of a highly efficient Look-Up Table (LUT) based circuit for an FIR filter using the distributed arithmetic algorithm, yielding a multiplier-less FIR filter. The LUT can be subdivided into several smaller LUTs to reduce memory usage for higher-order filters. The performance of various filter orders with different address lengths is analysed using the Xilinx 14.5 synthesis tool. The proposed design provides lower latency, less memory usage and high throughput.
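
To make the LUT idea concrete, here is a minimal software sketch of bit-serial distributed arithmetic for an unsigned-input FIR filter; signed (two's-complement) handling and the LUT partitioning described above are omitted, and the coefficient values in the usage comment are hypothetical:

```python
def build_lut(coeffs):
    """Pre-compute the sum of coefficients for every possible N-bit address."""
    n = len(coeffs)
    return [sum(c for i, c in enumerate(coeffs) if (addr >> i) & 1)
            for addr in range(1 << n)]

def da_fir_output(window, coeffs, lut, width=8):
    """One FIR output y[n] = sum_i coeffs[i]*window[i], computed bit-serially.
    window[i] holds the unsigned 'width'-bit delayed sample x[n-i]."""
    acc = 0
    for k in range(width):                     # one LUT access per input bit plane
        addr = 0
        for i, x in enumerate(window):
            addr |= ((x >> k) & 1) << i        # bit k of each delayed sample forms the address
        acc += lut[addr] << k                  # shift-and-accumulate
    return acc

# Illustrative 4-tap filter with hypothetical integer coefficients:
# coeffs = [1, 3, 3, 1]; lut = build_lut(coeffs); y = da_fir_output([10, 0, 5, 255], coeffs, lut)
```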

Keywords: finite impulse response, distributed arithmetic, field programmable gate array, look-up table

Procedia PDF Downloads 455
1546 Software Development to Empower Digital Libraries with Effortless Digital Cataloging and Access

Authors: Abdul Basit Kiani

Abstract:

The software for the digital library system is a cutting-edge solution designed to revolutionize the way libraries manage and provide access to their vast collections of digital content. This advanced software leverages the power of technology to offer a seamless and user-friendly experience for both library staff and patrons. By implementing this software, libraries can efficiently organize, store, and retrieve digital resources, including e-books, audiobooks, journals, articles, and multimedia content. Its intuitive interface allows library staff to effortlessly manage cataloging, metadata extraction, and content enrichment, ensuring accurate and comprehensive access to digital materials. For patrons, the software offers a personalized and immersive digital library experience. They can easily browse the digital catalog, search for specific items, and explore related content through intelligent recommendation algorithms. The software also facilitates seamless borrowing, lending, and preservation of digital items, enabling users to access their favorite resources anytime, anywhere, on multiple devices. With robust security features, the software ensures the protection of intellectual property rights and enforces access controls to safeguard sensitive content. Integration with external authentication systems and user management tools streamlines the library's administration processes, while advanced analytics provide valuable insights into patron behavior and content usage. Overall, this software for the digital library system empowers libraries to embrace the digital era, offering enhanced access, convenience, and discoverability of their vast collections. It paves the way for a more inclusive and engaging library experience, catering to the evolving needs of tech-savvy patrons.

Keywords: software development, empowering digital libraries, digital cataloging and access, management system

Procedia PDF Downloads 79
1545 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access

Authors: T. Wanyama, B. Far

Abstract:

Automatic irrigation systems conveniently protect landscape investments. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, a new generation of irrigation systems is smart in the sense that they monitor the weather, soil conditions, evaporation and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel and the KEPServerEX 5 OPC server, together with custom-written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Communications (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
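
A toy sketch of the kind of fuzzy inference an irrigation scheduler might perform is shown below; the membership breakpoints, rules and output durations are illustrative assumptions, not the paper's actual rule base:

```python
def ramp_down(x, lo, hi):
    """Membership that is 1 below lo, 0 above hi, linear in between."""
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def ramp_up(x, lo, hi):
    """Membership that is 0 below lo, 1 above hi, linear in between."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def irrigation_minutes(soil_moisture_pct, temp_c):
    """Toy Mamdani-style rules defuzzified by a weighted average of output singletons."""
    dry, wet = ramp_down(soil_moisture_pct, 20, 60), ramp_up(soil_moisture_pct, 20, 60)
    hot, mild = ramp_up(temp_c, 20, 35), ramp_down(temp_c, 20, 35)
    rules = [(min(dry, hot), 40.0),   # dry AND hot  -> irrigate long
             (min(dry, mild), 25.0),  # dry AND mild -> irrigate medium
             (min(wet, hot), 10.0),   # wet AND hot  -> irrigate short
             (min(wet, mild), 0.0)]   # wet AND mild -> no irrigation
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0
```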

Keywords: community water usage, fuzzy logic, irrigation, multi-agent system

Procedia PDF Downloads 297
1544 An Improved Method on Static Binary Analysis to Enhance the Context-Sensitive CFI

Authors: Qintao Shen, Lei Luo, Jun Ma, Jie Yu, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Control Flow Integrity (CFI) is one of the most promising techniques to defend against Code-Reuse Attacks (CRAs). Traditional CFI systems and recent context-sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control-flow hijack occurs, leaving vast space for attackers at indirect call-sites. Coarse CFGs make it difficult to decide which target may execute at indirect control-flow transfers, and actually weaken existing CFI systems. Extracting CFGs precisely and completely from binaries remains an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call-sites and at functions. By comparing the number of parameters prepared before call-sites with the number consumed by functions, the set of targets of indirect calls is reduced, so the control flow is more constrained at indirect call-sites at runtime. We implement our policy in combination with context-sensitive CFI (CCFI). Experimental results on some popular programs show that our approach is efficient. Further analysis shows that it can mitigate COOP and other advanced attacks.
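
The core filtering rule can be pictured as follows; this is a schematic illustration with a hypothetical Function record, not the paper's binary-analysis code:

```python
from dataclasses import dataclass

@dataclass
class Function:
    name: str
    param_count: int     # parameters consumed, as recovered by static analysis

def allowed_targets(args_prepared: int, functions: list[Function]) -> list[Function]:
    """Keep only functions that consume no more arguments than the call-site prepares."""
    return [f for f in functions if f.param_count <= args_prepared]

# e.g. a call-site that sets up 2 argument registers cannot legally reach a 5-parameter function
```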

Keywords: context-sensitive, CFI, binary analysis, code reuse attack

Procedia PDF Downloads 321
1543 F-VarNet: Fast Variational Network for MRI Reconstruction

Authors: Omer Cahana, Maya Herman, Ofer Levi

Abstract:

Magnetic resonance imaging (MRI) scans are lengthy, mainly because of the long acquisition time dictated by the traditional sampling theorem and its lower bound on sampling. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two properties must hold: i) the signal must be sparse in a known transform domain, and ii) the sampling must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in deep learning (DL), which has demonstrated tremendous success in various computer vision tasks, the field of DL-based MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art MRI reconstruction model, VarNet. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplify the sensitivity map estimation (SME), since it contains many layers that are unnecessary for this task. These improvements yield significant decreases in computation cost as well as higher accuracy.
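
A minimal sketch of the multi-scale dilated-convolution idea is shown below (PyTorch assumed); the channel counts and dilation rates are illustrative, not the exact F-VarNet configuration:

```python
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    """Parallel 3x3 convolutions with increasing dilation rates, concatenated and fused by a 1x1 conv."""
    def __init__(self, channels: int, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=r, dilation=r) for r in rates])
        self.fuse = nn.Conv2d(channels * len(rates), channels, 1)

    def forward(self, x):
        # each branch sees a different receptive field; fusion mixes the scales
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))
```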

Keywords: MRI, deep learning, variational network, computer vision, compressed sensing

Procedia PDF Downloads 158
1542 Case Study of Sexual Violence Victim Assessment in Semarang Regency

Authors: Sujana T, Kurniasari MD, Ayakeding AM

Abstract:

Background: Sexual violence is one of the forms of violence with a high incidence in Indonesia. Purpose: This research aims to describe the implementation of sexual violence victim assessment in Semarang Regency. Method: This is qualitative research with an embedded single case study design, analysed through two units of analysis. The first unit of analysis is the victim's examiner with at least one year of work experience; a semi-structured interview method is used to obtain these data. The second unit of analysis is the related documents; these data are taken by observing the pathway and description of every document and how it supports each step of the assessment. Results: The study resulted in three themes. First, assessments of sexual violence in Semarang Regency have been standardized. The laws of the Republic of Indonesia regulate the handling of victims of sexual violence in outline. Victims of sexual violence can be dealt with by the police, the Integrated Service Center for Women and Children Empowerment, and the Regional General Hospital, and each examination site has its own standard operating procedures for dealing with victims. Cooperation with family and witnesses is also required in the review process to obtain accurate results and evidence. Second, there are inhibiting factors in the assessment process: victims sometimes feel embarrassed and reluctant to recount the chronological events during reporting, so the examining officer must be able to approach the victim and build trust to gain their cooperation. Third, there are other things to consider in the process of assessing victims of sexual violence: ensuring implementation in accordance with the applicable standard operating procedures, providing dedicated examination rooms, offering counseling, and safeguarding the privacy of victims are all important considerations in the assessment.

Keywords: assessment, case study, Semarang regency, sexual violence

Procedia PDF Downloads 138
1541 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for “low” and “high” are precisely derived using a max-min operation on Bayes' error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the buy-low and sell-high algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
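
To illustrate the adage in its simplest form, a toy threshold-based backtest is sketched below; the rolling-quantile thresholds, window length and quantile levels are arbitrary stand-ins for the paper's Bayes-error-derived thresholds:

```python
import numpy as np

def backtest_thresholds(prices, low_q=0.2, high_q=0.8, window=60):
    """Toy buy-low/sell-high rule: buy below a rolling low quantile, sell above a high quantile."""
    cash, shares = 1.0, 0.0
    for t in range(window, len(prices)):
        hist = prices[t - window:t]
        lo, hi = np.quantile(hist, low_q), np.quantile(hist, high_q)
        p = prices[t]
        if shares == 0 and p <= lo:          # "buy low"
            shares, cash = cash / p, 0.0
        elif shares > 0 and p >= hi:         # "sell high"
            cash, shares = shares * p, 0.0
    return cash + shares * prices[-1]        # final portfolio value per unit invested
```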

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 71
1540 Designing Information Systems in Education as Prerequisite for Successful Management Results

Authors: Vladimir Simovic, Matija Varga, Tonco Marusic

Abstract:

This research paper presents matrix technology models and examples of information systems in education (in the Republic of Croatia and in Germany) that support business, education (learning and teaching) and e-learning. We researched and described the aims and objectives of the main processes in education and technology, together with the main matrix classes of data. The paper gives an example of matrix technology with a detailed description of the processes related to specific data classes in education, and an example module that supports the processes 'Filling in the directory and the diary of work' and 'Evaluation'. At the lower level, we also researched and described all activities that take place within the lower-level processes in education, as well as the characteristics and functioning of the modules 'Filling in the directory and the diary of work' and 'Evaluation'. For the analysis of the affinity between the aforementioned processes and/or sub-processes, we used our own application model created in Visual Basic, based on an algorithm for analyzing the affinity between the observed processes and/or sub-processes.

Keywords: designing, education management, information systems, matrix technology, process affinity

Procedia PDF Downloads 437
1539 Visual and Chemical Servoing of a Hexapod Robot in a Confined Environment Using Jacobian Estimator

Authors: Guillaume Morin-Duponchelle, Ahmed Nait Chabane, Benoit Zerr, Pierre Schoesetters

Abstract:

Industrial inspection can be achieved with robotic systems that combine visual and chemical servoing. A popular scheme for visual servo-controlled robots is image-based servoing. In this paper, an approach to visual and chemical servoing of a hexapod robot using visual and chemical Jacobian matrices is proposed. The basic idea behind the visual Jacobian matrix is to model the differential relationship between the camera system and the robot control system in order to accurately detect and track points of interest in confined environments. This approach allows the robot to detect and navigate to a QR code, or to localize a gas source using the surge-cast algorithm. To track the QR code target, visual servoing based on the Jacobian matrix is used. For chemical servoing, three gas sensors are embedded on the hexapod, and a Jacobian matrix applied to the gas concentration measurements allows the direction of the main gas source to be estimated. The effectiveness of the proposed scheme is first demonstrated in simulation. Finally, a hexapod prototype is designed and built, and the experimental validation of the approach is presented and discussed.
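
For context, the classical image-based visual servoing law that such schemes build on can be written in one line; this generic form, with an assumed interaction (Jacobian) matrix and feature error, is not the paper's hexapod-specific estimator:

```python
import numpy as np

def ibvs_camera_velocity(interaction_matrix, feature_error, gain=0.5):
    """Classical image-based visual servoing law: v = -lambda * L^+ * e."""
    return -gain * np.linalg.pinv(interaction_matrix) @ feature_error

# e.g. with a 2x6 interaction matrix for one tracked image point and a 2-vector pixel error,
# the result is the 6-DOF camera velocity command driving the error to zero.
```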

Keywords: chemical servoing, hexapod robot, Jacobian matrix, visual servoing, navigation

Procedia PDF Downloads 124
1538 Content Based Video Retrieval System Using Principal Object Analysis

Authors: Van Thinh Bui, Anh Tuan Tran, Quoc Viet Ngo, The Bao Pham

Abstract:

Video retrieval is the problem of searching for videos or clips whose content is close to an input image or video. Applications include selecting a video from a folder or recognizing a person in security-camera footage. However, the problem remains challenging due to the diversity of video types, frame transitions and camera positions; selecting an appropriate similarity measure is also an open question. To overcome these obstacles, we propose a content-based video retrieval system consisting of a few main steps that together yield good performance. From an input video, we extract keyframes and principal objects using the Segmentation of Aggregating Superpixels (SAS) algorithm. Speeded Up Robust Features (SURF) are then extracted from those principal objects. Finally, a bag-of-words model combined with SVM classification is applied to obtain the retrieval result. Our system is evaluated on over 300 videos ranging from music, history, movies, sports and natural scenes to TV programs, and its performance compares promisingly with other approaches.
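
The bag-of-visual-words step can be sketched as follows; this minimal version uses k-means and an SVM (scikit-learn assumed) on pre-extracted local descriptors, with the vocabulary size chosen arbitrarily rather than taken from the paper:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def bow_histograms(descriptor_sets, k=200):
    """Cluster local descriptors into k visual words and histogram each item over the vocabulary."""
    vocab = KMeans(n_clusters=k, n_init=10).fit(np.vstack(descriptor_sets))
    hists = []
    for descriptors in descriptor_sets:
        words = vocab.predict(descriptors)
        hist, _ = np.histogram(words, bins=np.arange(k + 1))
        hists.append(hist / max(hist.sum(), 1))            # L1-normalised word histogram
    return np.array(hists), vocab

# histograms, vocab = bow_histograms(train_descriptor_sets)
# classifier = SVC(kernel="rbf").fit(histograms, labels)
```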

Keywords: video retrieval, principal objects, keyframe, segmentation of aggregating superpixels, speeded up robust features, bag-of-words, SVM

Procedia PDF Downloads 300
1537 A Study of ZY3 Satellite Digital Elevation Model Verification and Refinement with Shuttle Radar Topography Mission

Authors: Bo Wang

Abstract:

As China's first high-resolution civilian optical satellite, ZY-3 is able to obtain high-resolution multi-view images with its three linear-array sensors. The images can be used to generate Digital Elevation Models (DEM) through dense matching of stereo images. However, due to clouds, forest, water and buildings covering the images, the dense matching results contain problems such as outliers and areas that fail to match (matching holes). This paper introduces an algorithm to verify the accuracy of DEMs generated by the ZY-3 satellite against the Shuttle Radar Topography Mission (SRTM). Since the accuracy of SRTM (internal accuracy: 5 m; external accuracy: 15 m) is relatively uniform worldwide, it may be used to improve the accuracy of the ZY-3 DEM. Based on the analysis of large volumes of DEM and SRTM data, the processing can be divided into two aspects. First, the ZY-3 DEM and SRTM are registered using conjugate line features and area features matched between the two datasets. Then the ZY-3 DEM is refined by eliminating the matching outliers and filling the matching holes. The matching outliers are eliminated based on statistics from Local Vector Binning (LVB), and the matching holes are filled with elevations interpolated from SRTM. Accuracy statistics for the ZY-3 DEM are also reported.
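
A simplified sketch of the refinement step (hole filling plus outlier replacement against a co-registered reference) is shown below; the threshold-based outlier test stands in for the paper's LVB statistics, and the numbers are illustrative:

```python
import numpy as np

def refine_dem(zy3_dem, srtm, hole_value=np.nan, outlier_threshold=50.0):
    """Fill matching holes from SRTM and replace cells deviating strongly from the reference."""
    refined = zy3_dem.astype(float).copy()
    holes = np.isnan(refined) if np.isnan(hole_value) else (refined == hole_value)
    refined[holes] = srtm[holes]                       # fill failed-matching areas
    outliers = np.abs(refined - srtm) > outlier_threshold
    refined[outliers] = srtm[outliers]                 # replace gross matching outliers
    return refined
```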

Keywords: ZY-3 satellite imagery, DEM, SRTM, refinement

Procedia PDF Downloads 340
1536 Role of Microplastics on Reducing Heavy Metal Pollution from Wastewater

Authors: Derin Ureten

Abstract:

Plastic pollution does not disappear; it breaks down into ever smaller pieces through photolysis (caused mainly by solar radiation), thermal oxidation, thermal degradation, and biodegradation, the action of organisms digesting larger plastics. All plastic pollutants have exceedingly harmful effects on the environment. With the COVID-19 pandemic, the number of plastic products such as masks and gloves flowing into the environment has increased more than ever. However, microplastics are not the only pollutants in water; heavy metals are among the most tenacious and toxic. Heavy metal solutions can cause a variety of health problems in organisms, such as cancer, organ damage, nervous system damage, and even death. The aim of this research is to show that microplastics could be used in wastewater treatment systems by demonstrating that they can adsorb heavy metals from solution. The experiment includes two heavy-metal solutions: one containing microplastics in heavy-metal-contaminated water, and one containing only the heavy-metal solution. After sieving, the absorbance of both media will be measured with a spectrometer. Iron (III) chloride (FeCl3) will be used as the heavy metal because its solution darkens as its concentration increases. The experiment will be supported by pure Nile Red powder in order to observe any visible differences under the microscope; Nile Red is a dye that binds to hydrophobic materials such as plastics and lipids. If adsorption is confirmed by the final absorbance values of the solutions and by the Nile Red visualisation, the experiment will be repeated at different temperature levels in order to identify the most suitable temperature for removing heavy metals from water. New wastewater treatment systems could then be developed, with the help of microplastics, for water contaminated with heavy metals.

Keywords: microplastics, heavy metal, pollution, adsorption, wastewater treatment

Procedia PDF Downloads 85
1535 Unknown Groundwater Pollution Source Characterization in Contaminated Mine Sites Using Optimal Monitoring Network Design

Authors: H. K. Esfahani, B. Datta

Abstract:

Groundwater is one of the most important natural resources in many parts of the world; however, it is widely polluted due to human activities. Currently, effective and reliable groundwater management and remediation strategies are obtained using characterization of groundwater pollution sources, where the measured data at monitoring locations are utilized to estimate the unknown pollutant source location and magnitude. However, accurately identifying the characteristics of contaminant sources is a challenging task due to uncertainties in predicting source flux injection, hydro-geological and geo-chemical parameters, and the concentration field measurements. Reactive transport of chemical species in contaminated groundwater systems, especially with multiple species, is a complex and highly non-linear geochemical process. Although sufficient concentration measurement data are essential to accurately identify source characteristics, available data are often sparse and limited in quantity. Therefore, this inverse problem of characterizing unknown groundwater pollution sources is often considered ill-posed, complex and non-unique. Different methods have been utilized to identify pollution sources; the linked simulation-optimization approach is one effective method for obtaining acceptable results under uncertainties in complex real-life scenarios. In this approach, the numerical flow and contaminant transport simulation models are externally linked to an optimization algorithm, with the objective of minimizing the difference between measured concentrations and estimated pollutant concentrations at observation locations. Concentration measurement data are very important to accurately estimate pollution source properties; therefore, optimal design of the monitoring network is essential to gather adequate measured data at desired times and locations. Due to budget and physical restrictions, an efficient and effective approach for groundwater pollutant source characterization is to design an optimal monitoring network, especially when only inadequate and arbitrary concentration measurement data are initially available. In this approach, preliminary concentration observation data are utilized for preliminary identification of the source location, magnitude and duration of source activity, and these results are then used for monitoring network design. Further, feedback information from the monitoring network is used as input for sequential monitoring network design, to improve the identification of unknown source characteristics. To design an effective monitoring network of observation wells, optimization and interpolation techniques are used. A simulation model should be utilized to accurately describe the aquifer properties in terms of hydro-geochemical parameters and boundary conditions; however, the simulation of the transport processes becomes complex when the pollutants are chemically reactive. A three-dimensional transient flow and reactive contaminant transport process is considered. The proposed methodology uses HYDROGEOCHEM 5.0 (HGCH) as the simulation model for flow and transport processes with multiple chemically reactive species, and Adaptive Simulated Annealing (ASA) as the optimization algorithm in the linked simulation-optimization methodology to identify the unknown source characteristics. Therefore, the aim of the present study is to develop a methodology for optimally designing an effective monitoring network for pollution source characterization with reactive species in polluted aquifers. The performance of the developed methodology will be evaluated for an illustrative polluted aquifer site, for example, an abandoned mine site in Queensland, Australia.
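
The linked simulation-optimization loop can be sketched generically as follows; the parameterization, bounds and the use of SciPy's dual_annealing are illustrative stand-ins for the ASA/HYDROGEOCHEM coupling described above:

```python
import numpy as np
from scipy.optimize import dual_annealing

def source_misfit(source_params, simulate_transport, observed_conc):
    """Objective: squared mismatch between simulated and measured concentrations."""
    return float(np.sum((simulate_transport(source_params) - observed_conc) ** 2))

# Hypothetical bounds on (x, y, flux, start_time) of a candidate source:
# result = dual_annealing(source_misfit,
#                         bounds=[(0, 500), (0, 500), (0, 10), (0, 365)],
#                         args=(transport_model, measured_concentrations))
# result.x then holds the estimated source characteristics.
```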

Keywords: monitoring network design, source characterization, chemical reactive transport process, contaminated mine site

Procedia PDF Downloads 230
1534 DMBR-Net: Deep Multiple-Resolution Bilateral Networks for Real-Time and Accurate Semantic Segmentation

Authors: Pengfei Meng, Shuangcheng Jia, Qian Li

Abstract:

We propose a real-time, high-precision semantic segmentation network built from a multi-resolution feature fusion module, an auxiliary feature extraction module, an upsampling module, and an atrous spatial pyramid pooling (ASPP) module. We designed a feature fusion structure that integrates features of different resolutions. We also studied the effect of the side-branch structure on the network and, based on these findings, added a side-branch auxiliary feature extraction layer to improve the network's effectiveness. We further designed an upsampling module that performs better than the original one. In addition, we reconsidered the locations and number of atrous spatial pyramid pooling (ASPP) modules and modified the network structure according to the experimental results to further improve effectiveness. The network takes the BiSeNetV2 backbone as its basis and builds the above improvements on top of it. We name this network the Deep Multiple-Resolution Bilateral Network for real-time segmentation, referred to as DMBR-Net. In experiments, our proposed DMBR-Net achieved 81.2% mIoU at 119 FPS on the Cityscapes validation dataset, 80.7% mIoU at 109 FPS on the CamVid test dataset, and 29.9% mIoU at 78 FPS on the COCO-Stuff test dataset. Compared with other lightweight real-time semantic segmentation networks, our network achieves the highest accuracy at an appropriate speed.
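
For reference, the reported accuracy metric (mean intersection-over-union) can be computed as in the short sketch below; this is the standard definition, with averaging over the classes that actually appear treated as an assumption:

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union over classes present in prediction or ground truth."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union:
            ious.append(inter / union)
    return float(np.mean(ious)) if ious else 0.0
```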

Keywords: multi-resolution feature fusion, atrous convolution, bilateral networks, pyramid pooling

Procedia PDF Downloads 149
1533 Assessing Denitrification-Disintegration Model’s Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context

Authors: Mohamed Boullouz, Mohamed Louay Metougui

Abstract:

Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial given escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, consistent with the intricate features of Morocco's agricultural environment. To verify this hypothesis, a large body of field data had to be gathered, covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties. These experimental datasets will serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. The prospective research findings add to the global conversation on climate-resilient agricultural practices while promoting sustainable agricultural models in Morocco. Recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy architects and agricultural actors to make informed decisions that advance both food security and environmental stability.

Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems

Procedia PDF Downloads 63
1532 Spectral Mixture Model Applied to Cannabis Parcel Determination

Authors: Levent Basayigit, Sinan Demir, Yusuf Ucar, Burhan Kara

Abstract:

Many research projects require accurate delineation of the different land cover types of an agricultural area. This is especially critical for identifying specific plants such as cannabis. However, the complexity of vegetation stand structure, the abundance of vegetation species, and the smooth transition between different secondary succession stages make vegetation classification difficult when using traditional approaches such as the maximum likelihood classifier. Most of the time, classification distinguishes only between trees, annuals, or grain, and it has been difficult to accurately identify cannabis mixed with other plants. In this paper, a mixed distribution model approach is applied to classify pure and mixed cannabis parcels using WorldView-2 imagery in the Lakes region of Turkey. Five different land use types (e.g., sunflower, maize, bare soil, and cannabis) were identified in the image. A constrained Gaussian mixture discriminant analysis (GMDA) was used to unmix the image. In the study, 255 reflectance ratios derived from spectral signatures of seven bands (Blue-Green-Yellow-Red-RedEdge-NIR1-NIR2) were randomly split into 80% training and 20% test data. The Gaussian mixture distribution model approach proved to be an effective and convenient way to exploit very high spatial resolution imagery for distinguishing cannabis vegetation. Based on the overall classification accuracies, the Gaussian mixture distribution model was found to be very successful at image classification tasks. The approach is sensitive enough to capture illegal cannabis planting areas in large plains and can also be used for monitoring and delineating illegal cannabis planting areas from their spectral reflectance.
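
A minimal per-class Gaussian-mixture classifier in the spirit of GMDA is sketched below (scikit-learn assumed); the component count and the absence of the paper's constraints are simplifying assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

class GMMClassifier:
    """Fit one Gaussian mixture per class; predict by maximum class-conditional likelihood."""
    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models, self.classes_ = {}, None

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        for c in self.classes_:
            self.models[c] = GaussianMixture(self.n_components).fit(X[y == c])
        return self

    def predict(self, X):
        # log-likelihood of each sample under each class mixture
        scores = np.column_stack([self.models[c].score_samples(X) for c in self.classes_])
        return self.classes_[scores.argmax(axis=1)]

# clf = GMMClassifier().fit(train_ratios, train_labels); pred = clf.predict(test_ratios)
```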

Keywords: Gaussian mixture discriminant analysis, spectral mixture model, Worldview-2, land parcels

Procedia PDF Downloads 194
1531 Helicobacter Pylori Detection by Invasive and Noninvasive Diagnostic Tests from Dyspepsia Patients

Authors: Muhammad Suhail Ibrahim, Ahmad Mujtaba

Abstract:

Background: The accuracy of the most frequently used tests for diagnosing Helicobacter pylori is always under consideration in clinical settings, and a reliable diagnosis is crucial to confirm the success of therapy. Objective: The aim of this research was to study the isolation frequency of H. pylori from patients with findings compatible with gastritis or gastric ulcer and to compare some feasible non-invasive and invasive methods for the diagnosis of infection. Materials and Methods: Ninety-six gastric biopsy and blood samples were obtained from patients with various gastroduodenal symptoms after informed consent. The biopsies were analyzed and compared using culture, microscopic examination, histopathology, rapid urease test (RUT), serology, biochemical tests, antibiotic susceptibility testing and a molecular method. Results: Forty samples (41.67%) were considered H. pylori positive by both histopathology and RUT. In addition, 46 patients were positive for anti-H. pylori IgA and IgG by ELISA. Eighteen biopsies were positive according to the culture test; this was further confirmed by endoscopic examination and urease, catalase and oxidase tests. A high percentage of resistance to polymyxin B, amoxicillin, and kanamycin was observed (100, 88.89, and 77.78%, respectively). The cagA gene was also detected by a molecular technique and appeared positive in 16 patients. The sensitivity/specificity (%) of the diagnostic methods was 95/77 for histology, 100/83.5 for rapid urease, 85.7/90 for Gram staining, 100/66.6 for IgG serology, 100/79.5 for IgA serology, 100/75.0 for PCR, 100/79.04 for the combination of RUT and IgG serology, and 100/92.4 for the combination of RUT, Gram staining and IgG serology. Conclusion: In view of the results obtained, PCR appeared to be the most reliable test, although high sensitivity and specificity were also recorded for other tests. For more accurate results, it is therefore advisable not to rely solely on a single method for detection.
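
The reported figures follow the standard definitions, sketched below for a 2x2 confusion table; the example counts in the comment are hypothetical and are not the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# e.g. sensitivity_specificity(tp=19, fn=1, tn=60, fp=16) -> (0.95, ~0.79)
```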

Keywords: helicobacter pylori, isolation, detection, culture, urease, polymerase chain reaction, antibiotic susceptibility test, dyspeptic patients

Procedia PDF Downloads 66
1530 Comparison of Agree Method and Shortest Path Method for Determining the Flow Direction in Basin Morphometric Analysis: Case Study of Lower Tapi Basin, Western India

Authors: Jaypalsinh Parmar, Pintu Nakrani, Bhaumik Shah

Abstract:

A Digital Elevation Model (DEM) stores the elevation of a virtual grid over the ground. DEMs are used in GIS applications such as hydrological modelling, flood forecasting, morphometric analysis and surveying. For morphometric analysis, the stream flow network plays a very important role, but DEMs often lack the accuracy needed to match field data and thus to give reliable morphometric results. The present study compares the Agree method and the conventional shortest path method for deriving morphometric parameters in the flat region of the Lower Tapi Basin, located in western India. For the present study, open-source SRTM data (Shuttle Radar Topography Mission, 1 arc-second resolution) and toposheets issued by the Survey of India (SOI) were used to determine the linear aspects (stream order, number of streams, stream length, bifurcation ratio, mean stream length, mean bifurcation ratio, stream length ratio, length of overland flow, constant of channel maintenance), the areal aspects (drainage density, stream frequency, drainage texture, form factor, circularity ratio, elongation ratio, shape factor) and the relief aspects (relief ratio, gradient ratio and basin relief) for 53 catchments of the Lower Tapi Basin. The stream network was digitized from the available toposheets, and the Agree DEM was created from the SRTM data and the stream network from the toposheets. The results obtained are used to compare the two methods in the flat areas.
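
Both workflows ultimately assign a flow direction to each DEM cell; the classic D8 steepest-descent rule is sketched below as a generic illustration (it is not claimed to be either method's exact implementation):

```python
import numpy as np

def d8_flow_direction(dem):
    """D8 rule: each interior cell drains to its steepest-descent neighbour (8-connected)."""
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)    # -1 marks sinks and border cells
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            # elevation drop per unit distance to each neighbour (diagonals scaled by sqrt(2))
            drops = [(dem[r, c] - dem[r + dr, c + dc]) / (2 ** 0.5 if dr and dc else 1.0)
                     for dr, dc in offsets]
            k = int(np.argmax(drops))
            if drops[k] > 0:
                direction[r, c] = k
    return direction
```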

Keywords: agree method, morphometric analysis, lower Tapi basin, shortest path method

Procedia PDF Downloads 237
1529 Comparison of the Anthropometric Obesity Indices in Prediction of Cardiovascular Disease Risk: Systematic Review and Meta-analysis

Authors: Saeed Pourhassan, Nastaran Maghbouli

Abstract:

Statement of the problem: The relationship between obesity and cardiovascular diseases has been studied widely (1), and the distribution of fat tissue has gained attention in relation to cardiovascular risk factors in long-term research (2). The American College of Cardiology/American Heart Association (ACC/AHA) score is the most widely used and reliable cardiovascular risk (CVR) assessment tool (3). This study aimed to determine which anthropometric index best discriminates high-CVR patients from low-risk ones using the ACC/AHA score, and to find the best CVR predictor for both genders across races and countries. Methodology and theoretical orientation: The literature in PubMed, Scopus, Embase, Web of Science, and Google Scholar was searched by two independent investigators using the keywords "anthropometric indices," "cardiovascular risk," and "obesity." The search was limited to studies published before January 2022 as full texts in English. Studies using the ACC/AHA risk assessment tool for CVR and reporting at least two anthropometric indices (classical and novel) were included. Study characteristics and data were extracted, the relative risks were pooled using a random-effects model, and the analysis was repeated in subgroups. Findings: The pooled relative risks for 7 studies with 16,348 participants were 1.56 (1.35-1.72) for BMI, 1.67 (1.36-1.83) for WC [waist circumference], 1.72 (1.54-1.89) for WHR [waist-to-hip ratio], 1.60 (1.44-1.78) for WHtR [waist-to-height ratio], 1.61 (1.37-1.82) for ABSI [a body shape index] and 1.63 (1.32-1.89) for CI [conicity index]. Considering gender, WC among females and WHR among males had the highest RR. The heterogeneity of studies was moderate (I²: 56%) and was not decreased by subgroup analysis. Some indices, such as VAI and LAP, were evaluated in only one study. Conclusion and significance: This meta-analysis showed that WHR predicts CVR better than BMI or WHtR. Some new indices, such as CI and ABSI, are less accurate than WHR and WC. Among women, WC seems to be the better choice for predicting cardiovascular disease risk.
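
The random-effects pooling step can be sketched as below; this is the standard DerSimonian-Laird estimator applied to log relative risks and their variances, and is only a generic illustration of the kind of analysis reported:

```python
import numpy as np

def dersimonian_laird(log_rr, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) for log relative risks."""
    y, v = np.asarray(log_rr, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)                 # fixed-effect estimate
    q = np.sum(w * (y - fixed) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c) if c > 0 else 0.0
    w_star = 1.0 / (v + tau2)                         # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2                           # exponentiate pooled for the RR scale
```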

Keywords: obesity, cardiovascular disease, risk assessment, anthropometric indices

Procedia PDF Downloads 101
1528 An Audit on the Quality of Pre-Operative Intra-Oral Digital Radiographs Taken for Dental Extractions in a General Practice Setting

Authors: Gabrielle O'Donoghue

Abstract:

Background: Pre-operative radiographs facilitate assessment and treatment planning in minor oral surgery. Quality assurance for dental radiography advocates the As Low As Reasonably Achievable (ALARA) principle when collecting accurate diagnostic information. Aims: To audit the quality of digital intraoral periapicals (IOPAs) taken prior to dental extractions in a metropolitan general dental practice setting. Standards: The National Radiological Protection Board (NRPB) guidance outlines three grades of radiograph quality: excellent (Grade 1, >70% of total exposures), diagnostically acceptable (Grade 2, <20%), and unacceptable (Grade 3, <10%). Methodology: A study of pre-operative radiographs taken prior to dental extractions by 44 practitioners across 12 private general dental practices in a large metropolitan area. A total of 725 extractions were assessed, allowing 258 IOPAs to be reviewed in one audit cycle. Results: First cycle: of 258 IOPAs, 223 (86.4%) scored Grade 1, 27 (10.5%) Grade 2, and 8 (3.1%) Grade 3. The standard was met. Thirty-five dental extractions were performed without an available pre-operative radiograph. Action Plan and Recommendations: Results were distributed to all staff and a continuing professional development evening was organised to outline recommendations for improving image quality. A second audit cycle is proposed at a six-month interval to review the recommendations and appraise results. Conclusion: The overall standard of radiographs met the published guidelines. A significant improvement in the number of procedures undertaken without pre-operative imaging is expected at the six-month interval. An investigation into non-diagnostic imaging and associated adverse patient outcomes is being considered. Maintenance of the standards achieved is expected in the second audit cycle to ensure consistently high-quality imaging.

Keywords: audit, oral radiology, oral surgery, periapical radiographs, quality assurance

Procedia PDF Downloads 163
1527 A Case Study of Deep Learning for Disease Detection in Crops

Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell

Abstract:

In precision agriculture, one of the main tasks is the automated detection of diseases in crops. Machine learning algorithms have been studied for such tasks in recent decades in view of the economic benefits that automated disease detection can bring to crop fields. The latest generation of deep convolutional neural networks has produced significant results in image classification. Accordingly, this work tested a deep convolutional neural network architecture for the detection of diseases in different types of crops. A data augmentation strategy was used to meet the requirements of the algorithm implemented with a deep learning framework. Two test scenarios were deployed: the first trained the network on images extracted from a controlled environment, while the second used images from both the field and the controlled environment. The results evaluate the generalisation capacity of the neural networks with respect to the two types of images. The networks yielded a general classification accuracy of 59% in scenario 1 and 96% in scenario 2.
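
A typical data augmentation pipeline of the kind referred to above might look as follows (torchvision assumed); the specific transforms and parameters are illustrative, not those used in the study:

```python
import torchvision.transforms as T

# Illustrative augmentation pipeline for crop/leaf images.
train_transforms = T.Compose([
    T.RandomResizedCrop(224, scale=(0.7, 1.0)),                   # random crop and rescale
    T.RandomHorizontalFlip(),
    T.RandomRotation(20),                                          # small rotations
    T.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),   # lighting variation
    T.ToTensor(),
])
```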

Keywords: convolutional neural networks, deep learning, disease detection, precision agriculture

Procedia PDF Downloads 257
1526 Improved Blood Glucose-Insulin Monitoring with Dual-Layer Predictive Control Design

Authors: Vahid Nademi

Abstract:

Although wearable medical devices equipped with a continuous glucose monitor (CGM) and an insulin pump are widely used, advanced control methods are still needed to obtain the full benefit of these devices. Compared with costly clinical trials, implementing effective insulin-glucose control strategies can contribute significantly to patients suffering from chronic diseases such as diabetes. This study addresses a two-layer insulin-glucose regulator based on a model predictive control (MPC) scheme, so that the patient's predicted glucose profile is matched by the insulin automatically injected through the pump. This is achieved by an iterative optimization algorithm, an integrated perturbation analysis and sequential quadratic programming (IPA-SQP) solver, which handles uncertainties due to unexpected variations in glucose-insulin values and in the body's characteristics. The feasibility of the proposed control approach is also studied by means of numerical simulations of two case scenarios using measured data. The results are presented to verify the superior and reliable performance of the proposed control scheme, with no negative impact on patient safety.
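
The receding-horizon idea behind MPC can be illustrated with the toy sketch below; the glucose dynamics, gains and bounds are made-up assumptions for illustration only, a generic SciPy optimizer stands in for the paper's IPA-SQP solver, and nothing here is a clinical dosing rule:

```python
import numpy as np
from scipy.optimize import minimize

def mpc_insulin_dose(glucose_now, target, horizon=6, dt=5.0):
    """Toy receding-horizon dose: minimise predicted glucose deviation plus insulin effort."""
    def predict(doses):
        g, traj = glucose_now, []
        for u in doses:
            g = g - 2.0 * u * dt + 0.5 * dt        # crude, assumed linear glucose response
            traj.append(g)
        return np.array(traj)

    def cost(doses):
        return np.sum((predict(doses) - target) ** 2) + 0.1 * np.sum(np.asarray(doses) ** 2)

    res = minimize(cost, np.zeros(horizon), bounds=[(0.0, 5.0)] * horizon)
    return res.x[0]    # apply only the first move, then re-optimise at the next CGM sample
```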

Keywords: blood glucose monitoring, insulin pump, predictive control, optimization

Procedia PDF Downloads 134
1525 Truck Scheduling Problem in a Cross-Dock Centre with Fixed Due Dates

Authors: Mohsen S. Sajadieh, Danyar Molavi

Abstract:

In this paper, a truck scheduling problem is investigated for a two-touch cross-docking center with due dates for outbound trucks treated as a hard constraint. The objective is to minimize the total cost, comprising the penalty and delivery costs of delayed shipments. The unloading sequence of shipments is considered; it is assumed that shipments are sent to the shipping dock doors immediately after unloading, and a First-In-First-Out (FIFO) policy is applied for loading the shipments. A mixed integer programming model is developed for the problem. Two meta-heuristic algorithms, a genetic algorithm (GA) and variable neighborhood search (VNS), are developed to solve medium- and large-scale instances. The numerical results show that extending the due dates for outbound trucks has a crucial impact on reducing the penalty costs of delayed shipments. In addition, as the due dates are extended, the objective function improves on average in comparison with the multi-touch situation in which shipments are sent to the shipping dock doors only after the whole inbound truck has been unloaded.

Keywords: cross-docking, truck scheduling, fixed due date, door assignment

Procedia PDF Downloads 402
1524 Assessment of Kinetic Trajectory of the Median Nerve from Wrist Ultrasound Images Using Two Dimensional Bayesian Speckle Tracking Technique

Authors: Li-Kai Kuo, Shyh-Hau Wang

Abstract:

The kinetic trajectory of the median nerve (MN) in the wrist has been shown to be applicable to the assessment of carpal tunnel syndrome (CTS) and can be detected from high-frequency ultrasound images via motion tracking techniques. However, a previous study could not perform the measurement quickly because a single-element transducer was used for ultrasound image scanning, making that system unsuitable for clinical application. In the present study, B-mode ultrasound images of the wrist corresponding to finger movements from flexion to extension were acquired with a clinically applicable real-time scanner. The kinetic trajectories of the MN were estimated off-line using a two-dimensional Bayesian speckle tracking (TDBST) technique. The experiments were carried out on ten volunteers with an ultrasound scanner operating at 12 MHz. Phantom experiments verified that the TDBST technique is able to detect the movement of the MN based on past and present signal information and thereby reduce the computational complications associated with image-quality effects such as resolution and contrast variations. Moreover, the TDBST technique tended to be more accurate than the normalized cross-correlation tracking (NCCT) technique used in the previous study for detecting movements of the MN in the wrist. In response to finger flexion, the kinetic trajectory of the MN moved toward the ulnar-palmar direction, and then toward the radial-dorsal direction during extension. The TDBST technique and the employed ultrasound scanner were verified to be feasible for sensitively detecting the kinetic trajectory and displacement of the MN. They could thus be further applied to diagnose CTS clinically and to improve measurements for assessing the 3D trajectory of the MN.
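
For reference, the normalized cross-correlation similarity underlying the NCCT baseline mentioned above can be written in a few lines; the Bayesian tracker itself is not reproduced here:

```python
import numpy as np

def normalized_cross_correlation(template, patch):
    """NCC similarity between two equally sized image patches (zero-mean, unit-normalised)."""
    t = template - template.mean()
    p = patch - patch.mean()
    denom = np.sqrt((t ** 2).sum() * (p ** 2).sum())
    return float((t * p).sum() / denom) if denom else 0.0
```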

Keywords: Bayesian speckle tracking, carpal tunnel syndrome, median nerve, motion tracking

Procedia PDF Downloads 494
1523 Evaluation of Feature Extraction Algorithms for a Real-Time Isolated Word Recognition System

Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas

Abstract:

This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on an FPGA. Mel-frequency cepstral, linear frequency cepstral, linear predictive and linear predictive cepstral coefficients were implemented in a hardware/software design. The proposed system was investigated in speaker-dependent mode for 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing speech records at different signal-to-noise ratios. The experiments on clean records show the highest accuracy for Mel-frequency cepstral and linear frequency cepstral coefficients, while for records with a 15 dB signal-to-noise ratio the linear predictive cepstral coefficients give the best results. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For classification, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies the real-time requirements and is suitable for applications in embedded systems.
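
The classification back-end is dynamic time warping; the classic (non-pipelined) form of the DTW distance is sketched below as a software reference for what the hardware core computes:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two feature-vector sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])          # local frame distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# The word with the smallest DTW distance to the input utterance is the recognition result.
```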

Keywords: isolated word recognition, feature extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW

Procedia PDF Downloads 493
1522 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall

Authors: Sanjib Kr Pal, S. Bhattacharyya

Abstract:

Mixed convection of a Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A coordinate transformation method is used to map the computational domain onto an orthogonal coordinate system. The governing equations in the computational domain are solved through a pressure-correction based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson numbers (0.1 ≤ Ri ≤ 5), nanoparticle volume concentrations (0.0 ≤ ϕ ≤ 0.2), amplitudes (0.0 ≤ α ≤ 0.1) of the wavy thick bottom wall and wave numbers (ω) at a fixed Reynolds number. The results show that the heat transfer rate increases remarkably when nanoparticles are added. The heat transfer rate depends on the wavy wall amplitude and wave number and decreases with increasing Richardson number for fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.

Keywords: conjugate heat transfer, mixed convection, nanofluid, wall waviness

Procedia PDF Downloads 253
1521 Vehicle Activity Characterization Approach to Quantify On-Road Mobile Source Emissions

Authors: Hatem Abou-Senna, Essam Radwan

Abstract:

In the past, transportation agencies and researchers estimated emissions using a single average speed and volume on a long stretch of roadway; other methods provided better accuracy using annual average estimates, and travel demand models provided an intermediate level of detail through average daily volumes. Currently, higher accuracy can be achieved with microscopic analyses by splitting the network links into sub-links and using second-by-second trajectories to calculate emissions. Accurately quantifying transportation-related emissions from vehicles is essential. This paper presents an examination of four different approaches to capturing the environmental impacts of vehicular operations on a 10-mile stretch of Interstate 4 (I-4), an urban limited-access highway in Orlando, Florida. First, at the most basic level, emissions were estimated for the entire 10-mile section 'by hand' using one average traffic volume and average speed. Then, three more detailed levels were studied using VISSIM/MOVES to analyze smaller links: average speeds and volumes (AVG), second-by-second link drive schedules (LDS), and second-by-second operating mode distributions (OPMODE). This paper analyzes how the various approaches affect predicted emissions of CO, NOx, PM2.5, PM10, and CO2. The results demonstrate that obtaining precise and comprehensive operating mode distributions on a second-by-second basis provides more accurate emission estimates. Specifically, emission rates are highly sensitive to stop-and-go traffic and the associated driving cycles of acceleration, deceleration, and idling. Using the AVG or LDS approach may overestimate or underestimate emissions, respectively, compared to an operating mode distribution approach.
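
Operating modes in MOVES are binned largely by vehicle specific power (VSP); one commonly cited light-duty approximation is sketched below, with the coefficients given as indicative literature values rather than the study's exact parameters:

```python
def vsp_light_duty(speed_mps, accel_mps2, grade=0.0):
    """Approximate light-duty vehicle specific power (kW/tonne); coefficients are indicative only."""
    return speed_mps * (1.1 * accel_mps2 + 9.81 * grade + 0.132) + 0.000302 * speed_mps ** 3

# e.g. vsp_light_duty(20.0, 0.5) evaluates one second of a trajectory; repeated second-by-second,
# the VSP values are binned into the operating mode distribution fed to the emission rates.
```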

Keywords: limited access highways, MOVES, operating mode distribution (OPMODE), transportation emissions, vehicle specific power (VSP)

Procedia PDF Downloads 338