Search results for: end-user trained information extraction
12417 Pharmacognostic, Phytochemical and Antibacterial Activity of Beaumontia Grandiflora
Authors: Narmeen Mehmood
Abstract:
The current study was conducted to evaluate the pharmacognostic parameters, phytochemical profile and antibacterial activity of the plant. Microscopic studies were carried out to determine various pharmacognostic parameters, and section cutting of the leaf was also done. The study of the aerial parts of Beaumontia grandiflora resulted in the identification of a fatty acid mixture and unsaponifiable matter. Material and Methods: The study was carried out with three extracts of Beaumontia grandiflora, i.e., petroleum ether, chloroform and methanol. For the separation of the various constituents of the plant, successive solvent extraction was carried out in the laboratory. Raw data containing the measured zones of inhibition in mm were tabulated. Results: The microscopic studies showed the presence of the upper epidermis in surface view, part of the lamina in section view, cortical parenchyma in longitudinal view, parenchyma with collapsed tissues, parenchyma cells, epidermal cells with part of a covering trichome, starch granules and reticulated thickened vessels. The transverse section of the leaf of Beaumontia grandiflora showed the upper epidermis, lower epidermis, hairs, vascular bundles and parenchyma. Phytochemical analysis of the leaves of Beaumontia grandiflora indicates that alkaloids are present. Bioactive components in the crude extracts may account for the strong activity observed; the petroleum ether extract shows a greater zone of inhibition at low concentrations. Conclusion: Alkaloids possess good antibacterial activity, so their presence may be responsible for the antibacterial activity observed in the crude organic extract of Beaumontia grandiflora.
Keywords: successive solvent extraction, zone of inhibition, microscopy, phytochemical analysis
Procedia PDF Downloads 21
12416 High-Capacity Image Steganography using Wavelet-based Fusion on Deep Convolutional Neural Networks
Authors: Amal Khalifa, Nicolas Vana Santos
Abstract:
Steganography has been known for centuries as an efficient approach to covert communication. Due to its popularity and ease of access, image steganography has attracted researchers seeking secure techniques for hiding information within an innocent-looking cover image. In this research, we propose a novel deep-learning approach to digital image steganography. The proposed method, DeepWaveletFusion, uses convolutional neural networks (CNN) to hide a secret image inside a cover image of the same size. Two CNNs are trained back-to-back to merge the Discrete Wavelet Transform (DWT) of both colored images and eventually to blindly extract the hidden image. Based on two different image similarity metrics, a weighted gain function is used to guide the learning process and maximize the quality of the retrieved secret image while maintaining acceptable imperceptibility. Experimental results verified the high recoverability of DeepWaveletFusion, which outperformed similar deep-learning-based methods.
Keywords: deep learning, steganography, image, discrete wavelet transform, fusion
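The mechanics of wavelet-based embedding can be sketched without the CNNs. The following is a minimal, non-blind illustration (hand-rolled single-level Haar transform; the `alpha` blending factor is an assumed illustrative value): the secret image's approximation band is blended into the cover image's finest detail band. The paper's DeepWaveletFusion instead learns the merge and performs blind extraction.

```python
import numpy as np

def haar2d(x):
    # Single-level 2D Haar DWT: average/difference along rows, then columns.
    a, d = (x[0::2] + x[1::2]) / 2, (x[0::2] - x[1::2]) / 2
    return ((a[:, 0::2] + a[:, 1::2]) / 2,   # LL (approximation)
            (a[:, 0::2] - a[:, 1::2]) / 2,   # LH
            (d[:, 0::2] + d[:, 1::2]) / 2,   # HL
            (d[:, 0::2] - d[:, 1::2]) / 2)   # HH (finest detail)

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

rng = np.random.default_rng(1)
cover, secret = rng.random((8, 8)), rng.random((8, 8))
alpha = 0.1                                  # embedding strength (assumed)

cLL, cLH, cHL, cHH = haar2d(cover)
sLL = haar2d(secret)[0]
stego = ihaar2d(cLL, cLH, cHL, cHH + alpha * sLL)   # hide secret's LL in cover's HH
recovered = (haar2d(stego)[3] - cHH) / alpha        # extraction (needs the cover here)
```

Because the forward and inverse transforms are exact, the secret's approximation band is recovered up to floating-point error; what the CNN pair adds is recovering it from the stego image alone.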
Procedia PDF Downloads 90
12415 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
Eyes are considered the most sensitive and important organ of the human body; thus, any eye disorder affects the patient in all aspects of life. Cataract is one such disorder and leads to blindness if not treated correctly and quickly. This paper demonstrates a model for automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique for a dataset of 120 eye images, followed by a classification process that sorted the image set into three classes: normal, early, and advanced stage. A comparison between the two classifiers, the support vector machine (SVM) and the artificial neural network (ANN), was performed on the same dataset of 120 eye images. It was concluded that SVM gave better results than ANN: SVM achieved 96.8% accuracy, whereas ANN achieved 92.3% accuracy.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
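As a rough illustration of the feature-extraction stage, a radial 2D log-Gabor filter can be built directly in the frequency domain: its transfer function is a Gaussian on a log-frequency axis, exp(-(ln(f/f0))^2 / (2 (ln s)^2)), with no DC component. The center frequency `f0` and bandwidth ratio `sigma_ratio` below are placeholder values, not the paper's settings.

```python
import numpy as np

def log_gabor_2d(size, f0=0.25, sigma_ratio=0.65):
    # Radial log-Gabor transfer function sampled on a size x size frequency grid.
    fy, fx = np.meshgrid(np.fft.fftfreq(size), np.fft.fftfreq(size), indexing="ij")
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                      # placeholder to avoid log(0)
    g = np.exp(-np.log(radius / f0) ** 2 / (2 * np.log(sigma_ratio) ** 2))
    g[0, 0] = 0.0                           # log-Gabor has no DC response
    return g

def log_gabor_response(img):
    # Filter an image in the frequency domain; the magnitude map is a
    # typical input feature vector for an SVM or ANN classifier.
    g = log_gabor_2d(img.shape[0])
    return np.abs(np.fft.ifft2(np.fft.fft2(img) * g))
```

A constant image produces a near-zero response, since the filter suppresses DC and keeps only mid-frequency texture such as lens opacity patterns.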
Procedia PDF Downloads 332
12414 Mitigating the Unwillingness of e-Forums Members to Engage in Information Exchange
Authors: Dora Triki, Irena Vida, Claude Obadia
Abstract:
Social networks such as e-Forums or dating sites often face the reluctance of key members to participate. Relying on conation theory, this study investigates this phenomenon and proposes solutions to mitigate the issue. We show that highly experienced e-Forum members refuse to share business information in peer-to-peer information exchange forums. However, forum managers can mitigate this behavior by developing a sense of belonging to the network. Furthermore, by selecting only elite forum participants with ample experience, they can reduce the reluctance of key information providers to engage in information exchange. Our hypotheses are tested with PLS structural equation modeling using survey data from members of a French e-Forum dedicated to the exchange of business information about exporting.
Keywords: conation, e-Forum, information exchange, members participation
Procedia PDF Downloads 158
12413 The Design of a Mixed Matrix Model for Activity Levels Extraction and Sub Processes Classification of a Work Project (Case: Great Tehran Electrical Distribution Company)
Authors: Elham Allahmoradi, Bahman Allahmoradi, Ali Bonyadi Naeini
Abstract:
Complex systems have many aspects, and a variety of methods have been developed to analyze them. The most efficient of these methods should not only be simple but should also provide useful and comprehensive information about many aspects of the system. Matrix methods are among the most commonly used methods to analyze and design systems. Each matrix method can examine a particular aspect of the system; if these methods are combined, managers can access more comprehensive and broader information about the system. This study was conducted in four steps. In the first step, a process model of a real project was extracted through IDEF3. In the second step, activity levels were attained by writing the process model in the form of a design structure matrix (DSM) and sorting it through a triangulation algorithm (TA). In the third step, sub-processes were obtained by writing the process model in the form of an interface structure matrix (ISM) and clustering it through a cluster identification algorithm (CIA). In the fourth step, a mixed model was developed to provide a unified picture of the project structure through the simultaneous presentation of activities and sub-processes. Finally, the paper closes with a conclusion.
Keywords: integrated definition for process description capture (IDEF3) method, design structure matrix (DSM), interface structure matrix (ISM), mixed matrix model, activity level, sub-process
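The activity-level extraction step can be sketched in miniature: given a binary DSM where `dsm[i][j] = 1` means activity i receives input from activity j, levels are obtained by repeatedly peeling off activities whose remaining dependencies have all been scheduled. This is a simplified stand-in for the triangulation pass, not the study's specific TA implementation.

```python
def dsm_levels(dsm):
    # dsm[i][j] == 1: activity i receives input from activity j.
    # Peel off activities with no unscheduled dependencies; each peel is one
    # activity level (activities within a level can run in parallel).
    remaining = set(range(len(dsm)))
    levels = []
    while remaining:
        ready = sorted(i for i in remaining
                       if not any(dsm[i][j] for j in remaining if j != i))
        if not ready:              # a coupled block (cycle) remains: needs clustering
            levels.append(sorted(remaining))
            break
        levels.append(ready)
        remaining -= set(ready)
    return levels

# Toy project: activities 1 and 2 depend on 0; activity 3 depends on 1 and 2.
example = [[0, 0, 0, 0],
           [1, 0, 0, 0],
           [1, 0, 0, 0],
           [0, 1, 1, 0]]
```

On this example the routine yields three levels, [0], [1, 2], [3], i.e., activities 1 and 2 sit at the same level; coupled blocks that cannot be peeled are what the CIA clustering step is for.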
Procedia PDF Downloads 494
12412 Automated Weight Painting: Using Deep Neural Networks to Adjust 3D Mesh Skeletal Weights
Authors: John Gibbs, Benjamin Flanders, Dylan Pozorski, Weixuan Liu
Abstract:
Weight painting (adjusting the influence a skeletal joint has on a given vertex in a character mesh) is an arduous and time-consuming part of the 3D animation pipeline. This process generally requires a trained technical animator and many hours of work to complete. Our skiNNer plug-in, which works within Autodesk’s Maya 3D animation software, uses machine learning and data processing techniques to create a deep neural network model that can accomplish the weight painting task in seconds rather than hours for bipedal quasi-humanoid character meshes. In order to create a properly trained network, a number of challenges were overcome, including curating an appropriately large data library, managing an arbitrary 3D mesh size, handling arbitrary skeletal architectures, accounting for extreme numeric values (most data points are near 0 or 1 for weight maps), and constructing an appropriate neural network model that can properly capture the high-frequency alternation between high weight values (near 1.0) and low weight values (near 0.0). The resulting neural network model is a cross between a traditional CNN, a deep residual network, and a fully dense network. It captures the unusually hard-edged features of a weight map matrix and produces excellent results on many bipedal models.
Keywords: 3d animation, animation, character, rigging, skinning, weight painting, machine learning, artificial intelligence, neural network, deep neural network
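The weight map being predicted drives linear blend skinning: each deformed vertex is the weight-blended sum of the joint transforms applied to the rest-pose vertex. A compact numpy sketch of that consumer of the weights (array shapes are illustrative, not skiNNer's internals):

```python
import numpy as np

def skin_vertices(rest, weights, joint_mats):
    # rest: (V, 3) rest-pose vertices; weights: (V, J) joint weights (rows sum to 1);
    # joint_mats: (J, 4, 4) joint transforms. Returns deformed (V, 3) positions.
    V = rest.shape[0]
    homo = np.hstack([rest, np.ones((V, 1))])               # homogeneous coords (V, 4)
    per_joint = np.einsum("jab,vb->vja", joint_mats, homo)  # each joint moves each vertex
    blended = np.einsum("vj,vja->va", weights, per_joint)   # blend by the weight map
    return blended[:, :3]

# Two vertices, two joints: vertex 0 fully bound to a static joint,
# vertex 1 fully bound to a joint translated by +2 on x.
rest = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
static, moved = np.eye(4), np.eye(4)
moved[0, 3] = 2.0
mats = np.stack([static, moved])
w = np.array([[1.0, 0.0], [0.0, 1.0]])
```

The hard-edged 0/1 weight structure the abstract mentions is visible here: flipping a single weight between 0 and 1 snaps a vertex between two rigid motions, which is what makes the map hard for a smooth regressor to fit.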
Procedia PDF Downloads 273
12411 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images
Authors: Masood Varshosaz, Kamyar Hasanpour
Abstract:
In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep the data acquisition cost low, augmentation techniques can be used to create additional data from existing images; many such techniques can generate variations of an original image and improve the performance of deep learning algorithms. While data augmentation is assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or implementation complexity, so the impact of data augmentation on model performance must be evaluated. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained to recognize humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combination. The results showed that the augmented models perform 1-3% better than a base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. Thus, we suggest a new method that employs simulated 3D human models to generate new data for training the network.
Keywords: human recognition, deep learning, drones, disaster mitigation
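The label-preserving 2D transforms listed above are cheap array operations. A minimal sketch of a random augmentation pipeline (scaling and cropping are omitted to keep the output shape fixed; the shift range and probabilities are illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    # Apply a random combination of simple 2D augmentations to an H x W image.
    if rng.random() < 0.5:
        img = np.fliplr(img)                     # horizontal flip
    img = np.rot90(img, k=int(rng.integers(0, 4)))  # random 90-degree rotation
    dy, dx = rng.integers(-2, 3, size=2)
    img = np.roll(img, (int(dy), int(dx)), axis=(0, 1))  # wrap-around shift
    return img
```

Every operation here is a permutation of pixels, so the augmented image keeps exactly the original pixel values in a new arrangement, which is why these transforms cannot reveal body parts absent from the source image, the limitation the abstract's 3D-model proposal addresses.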
Procedia PDF Downloads 94
12410 Developing a Systems Dynamics Model for Security Management
Authors: Kuan-Chou Chen
Abstract:
This paper demonstrates a simulation model of an information security system built with the system dynamics approach. The relationships in the system model are designed to be simple and functional and do not necessarily represent any particular information security environment. The paper aims to develop a generic system dynamics model of an information security system, with implications for information security research. The interrelated and interdependent relationships of five primary sectors in the system dynamics model are presented. The integrated information security systems model includes (1) information security characteristics, (2) users, (3) technology, (4) business functions, and (5) policy and management. Environments, attacks, government, and social culture are defined as the external sector. The interactions within each of these sectors are depicted by system loop maps as well. The proposed system dynamics model will not only provide a conceptual framework for information security analysts and designers but will also allow information security managers to remove the incongruity between the management of risk incidents and the management of knowledge, and will further give information security managers and decision makers a foundation for managerial actions and policy decisions.
Keywords: system thinking, information security systems, security management, simulation
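At its core, a system dynamics model is a set of stocks integrated over flows. A toy single-stock example in the security setting (the stock, flows, and parameter values below are hypothetical, not the paper's five-sector model): open incidents accumulate at an attack rate and drain with a mean resolution time, so the stock settles toward attack_rate x resolution_time.

```python
def simulate(steps=100, dt=0.1, attack_rate=5.0, resolution_time=2.0):
    # Minimal stock-and-flow model integrated with Euler steps:
    # d(stock)/dt = inflow - outflow, outflow proportional to the stock.
    stock = 0.0
    history = []
    for _ in range(steps):
        inflow = attack_rate                 # incidents arriving per unit time
        outflow = stock / resolution_time    # incidents resolved per unit time
        stock += dt * (inflow - outflow)
        history.append(stock)
    return history

trace = simulate()                           # approaches 5.0 * 2.0 = 10 open incidents
```

Feedback loops in a full model would make attack_rate and resolution_time themselves depend on other stocks (budget, awareness, technology), which is where the loop maps described above come in.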
Procedia PDF Downloads 429
12409 Utilization of Chrysanthemum Flowers in Textile Dyeing: Chemical and Phenolic Analysis of Dyes and Fabrics
Authors: Muhammad Ahmad
Abstract:
In this research, Chrysanthemum morifolium flowers are used as a natural dye to reduce reliance on synthetic dyes and take a step toward sustainability in the fashion industry. The aqueous extraction method is used for natural dye extraction, and the dye is then applied to silk and cotton fabric samples. The dye extracted from dried chrysanthemum flowers is originally a shade of rich green, but after being washed with detergent, it turns a shade of yellow. Traditional salt and vinegar are used as natural mordants to fix the dye color. This study also includes a phenolic and chemical analysis of the natural dye (chrysanthemum flowers) and the textiles (cotton and silk). Compared to cotton fabric, silk fabric has far superior chemical qualities for use in natural dyeing. The results of this study show that the chrysanthemum flower offers a variety of colors when treated with detergent, without detergent, and with mordants. Chrysanthemum flowers have long been used in other fields, such as medicine; it is therefore time to start using them in the fashion industry as a natural dye to lessen the harm that synthetic dyes cause.
Keywords: natural dyes, Chrysanthemum flower, sustainability, textile fabrics, chemical and phenolic analysis
Procedia PDF Downloads 20
12408 Game Structure and Spatio-Temporal Action Detection in Soccer Using Graphs and 3D Convolutional Networks
Authors: Jérémie Ochin
Abstract:
Soccer analytics are built on two data sources: the frame-by-frame position of each player on the pitch and the sequences of events, such as ball drive, pass, cross, shot, and throw-in. With more than 2000 ball events per soccer game, their precise and exhaustive annotation, based on a monocular video stream such as a TV broadcast, remains a tedious and costly manual task. State-of-the-art methods for spatio-temporal action detection from a monocular video stream, often based on 3D convolutional neural networks, are close to reaching levels of performance in mean Average Precision (mAP) compatible with the automation of such a task. Nevertheless, to meet the expectation of exhaustiveness in the context of data analytics, such methods must be applied in a regime of high recall and low precision, using low confidence score thresholds. This setting unavoidably leads to false positives that are the product of the well-documented overconfidence of neural networks and, in this case, of their limited access to contextual information and understanding of the game: their predictions are highly unstructured. Based on the assumption that professional soccer players’ behaviour, pose, positions, and velocity are highly interrelated and locally driven by the player performing a ball action, it is hypothesized that adding information about surrounding players’ appearance, positions, and velocity to the prediction methods can improve their metrics. Several methods are compared to build a proper representation of the game surrounding a player, from handcrafted features of the local graph, based on domain knowledge, to the use of Graph Neural Networks trained end-to-end with existing state-of-the-art 3D convolutional neural networks. It is shown that including information about surrounding players helps reach higher metrics.
Keywords: fine-grained action recognition, human action recognition, convolutional neural networks, graph neural networks, spatio-temporal action recognition
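One simple way to represent the game around a player is a k-nearest-neighbour graph over player positions, with position and velocity concatenated as node features, which a GNN can then message-pass over. A sketch under assumptions (the value of k and the feature layout are illustrative, not the paper's design):

```python
import numpy as np

def build_player_graph(positions, velocities, k=3):
    # positions, velocities: (N, 2) arrays in pitch coordinates.
    # Returns directed k-NN edges and per-node feature vectors.
    n = positions.shape[0]
    dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)              # no self-edges
    nbrs = np.argsort(dists, axis=1)[:, :k]      # k closest players for each node
    edges = [(i, int(j)) for i in range(n) for j in nbrs[i]]
    feats = np.hstack([positions, velocities])   # node features: [x, y, vx, vy]
    return edges, feats

# Two pairs of players far apart on the pitch, k=1: each links to its pair mate.
pos = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 0.0], [11.0, 0.0]])
vel = np.zeros((4, 2))
```

Appearance features (e.g., a CNN embedding of each player crop) would be appended to `feats` in the same way; the graph structure itself carries the "locally driven" prior stated in the abstract.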
Procedia PDF Downloads 23
12407 Characterizing and Developing the Clinical Grade Microbiome Assay with a Robust Bioinformatics Pipeline for Supporting Precision Medicine Driven Clinical Development
Authors: Danyi Wang, Andrew Schriefer, Dennis O'Rourke, Brajendra Kumar, Yang Liu, Fei Zhong, Juergen Scheuenpflug, Zheng Feng
Abstract:
Purpose: It has been recognized that the microbiome plays critical roles in disease pathogenesis, including cancer, autoimmune disease, and multiple sclerosis. To develop a clinical-grade assay for exploring microbiome-derived clinical biomarkers across disease areas, a two-phase approach is implemented: 1) identification of the optimal sample preparation reagents using pre-mixed bacteria and healthy donor stool samples, coupled with the proprietary Sigma-Aldrich® bioinformatics solution; 2) exploratory analysis of patient samples to enable precision medicine. Study Procedure: In the phase 1 study, we first compared the 16S sequencing results of two ATCC® microbiome standards (MSA 2002 and MSA 2003) across five different extraction kits (kits A, B, C, D, and E). Both microbiome standards were extracted in triplicate across all extraction kits. Following isolation, DNA quantity was determined by Qubit assay. DNA quality was assessed to determine purity and to confirm that the extracted DNA is of high molecular weight. Bacterial 16S ribosomal ribonucleic acid (rRNA) amplicons were generated via amplification of the V3/V4 hypervariable region of the 16S rRNA. Sequencing was performed using a 2x300 bp paired-end configuration on the Illumina MiSeq. Fastq files were analyzed using the Sigma-Aldrich® Microbiome Platform, a cloud-based service that offers best-in-class 16S-seq and WGS analysis pipelines and databases. The Platform and its methods have been extensively benchmarked using microbiome standards generated internally by MilliporeSigma and by other external providers. Data Summary: The DNA yield using extraction kits D and E is below the limit of detection (100 pg/µl) of the Qubit assay, as both extraction kits are intended for samples with low bacterial counts; the pre-mixed bacterial pellets at high concentrations, with an input of 2 x 10^6 cells for MSA-2002 and 1 x 10^6 cells for MSA-2003, were not compatible with these kits.
Among the remaining three extraction kits, kit A produced the greatest yield, whereas kit B provided the least (kit A/MSA-2002: 174.25 ± 34.98; kit A/MSA-2003: 179.89 ± 30.18; kit B/MSA-2002: 27.86 ± 9.35; kit B/MSA-2003: 23.14 ± 6.39; kit C/MSA-2002: 55.19 ± 10.18; kit C/MSA-2003: 35.80 ± 11.41 (mean ± SD)). The PCoA 3D visualization of the weighted UniFrac beta diversity shows that kits A and C cluster closely together, while kit B appears as an outlier; the kit A sequencing samples cluster more closely together than those of the other kits. The taxonomic profiles of kit B have lower recall when compared to the known mixture profiles, indicating that kit B was inefficient at detecting some of the bacteria. Conclusion: Our data demonstrate that the DNA extraction method impacts DNA concentration, purity, and the microbial communities detected by next-generation sequencing analysis. A further comparison of microbiome analysis performance using healthy stool samples is underway, and colorectal cancer patients' samples will be acquired to further explore clinical utility. Collectively, our comprehensive qualification approach, including the evaluation of optimal DNA extraction conditions, the inclusion of positive controls, and the implementation of a robust, qualified bioinformatics pipeline, assures accurate characterization of the microbiota in a complex matrix for deciphering the deep biology and enabling precision medicine.
Keywords: 16S rRNA sequencing, analytical validation, bioinformatics pipeline, metagenomics
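The beta-diversity comparison in the abstract used weighted UniFrac, which requires a phylogenetic tree. As a tree-free stand-in that conveys the same idea of comparing abundance profiles between kits, Bray-Curtis dissimilarity can be sketched in a few lines (the profiles below are made-up examples, not the study's data):

```python
def bray_curtis(p, q):
    # Bray-Curtis dissimilarity between two abundance profiles over the same
    # taxa: 0 means identical composition, 1 means no taxa shared.
    num = sum(abs(a - b) for a, b in zip(p, q))
    den = sum(a + b for a, b in zip(p, q))
    return num / den
```

Pairwise dissimilarities like these are what a PCoA ordination embeds in 2D or 3D, so an outlier kit (such as kit B above) shows up as a point far from the main cluster.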
Procedia PDF Downloads 170
12406 Research on the United Navigation Mechanism of Land, Sea and Air Targets under Multi-Sources Information Fusion
Authors: Rui Liu, Klaus Greve
Abstract:
Navigation information is a kind of dynamic geographic information, and a navigation information system is a kind of special geographic information system. At present, there is much research on the centralized management and cross-integrated application of basic geographic information; however, the idea of information integration and sharing has not been deeply applied to research on navigation information services, and the imperfection of navigation target coordination and navigation information sharing mechanisms under certain navigation tasks has greatly affected the reliability and scientific rigor of navigation services such as path planning. Considering this, the project intends to study multi-source information fusion and a multi-objective united navigation information interaction mechanism. First, it investigates the actual needs of navigation users in different areas and establishes a preliminary model of navigation information classification and importance levels. It then analyzes the characteristics of remote sensing and GIS vector data and designs the fusion algorithm with a view to improving positioning accuracy and extracting navigation environment data. Finally, the project intends to analyze the features of the navigation information of land, sea, and air navigation targets, design a united navigation data standard and a navigation information sharing model under certain navigation tasks, and establish a test navigation system for a united navigation simulation experiment. The aim of this study is to explore the theory of united navigation services and optimize the navigation information service model, which will lay the theoretical and technological foundation for the united navigation of land, sea, and air targets.
Keywords: information fusion, united navigation, dynamic path planning, navigation information visualization
Procedia PDF Downloads 288
12405 A Study on the Application of Machine Learning and Deep Learning Techniques for Skin Cancer Detection
Authors: Hritwik Ghosh, Irfan Sadiq Rahat, Sachi Nandan Mohanty, J. V. R. Ravindra
Abstract:
In the rapidly evolving landscape of medical diagnostics, the early detection and accurate classification of skin cancer remain paramount for effective treatment outcomes. This research delves into the transformative potential of Artificial Intelligence (AI), specifically Deep Learning (DL), as a tool for discerning and categorizing various skin conditions. Utilizing a diverse dataset of 3,000 images representing nine distinct skin conditions, we confront the inherent challenge of class imbalance. This imbalance, where conditions like melanomas are over-represented, is addressed by incorporating class weights during the model training phase, ensuring an equitable representation of all conditions in the learning process. Our pioneering approach introduces a hybrid model, amalgamating the strengths of two renowned Convolutional Neural Networks (CNNs), VGG16 and ResNet50. These networks, pre-trained on the ImageNet dataset, are adept at extracting intricate features from images. By synergizing these models, our research aims to capture a holistic set of features, thereby bolstering classification performance. Preliminary findings underscore the hybrid model's superiority over individual models, showcasing its prowess in feature extraction and classification. Moreover, the research emphasizes the significance of rigorous data pre-processing, including image resizing, color normalization, and segmentation, in ensuring data quality and model reliability. In essence, this study illuminates the promising role of AI and DL in revolutionizing skin cancer diagnostics, offering insights into its potential applications in broader medical domains.
Keywords: artificial intelligence, machine learning, deep learning, skin cancer, dermatology, convolutional neural networks, image classification, computer vision, healthcare technology, cancer detection, medical imaging
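Class weights of the kind described are typically set inversely proportional to class frequency, normalised so that a perfectly balanced dataset gives every class a weight of 1.0 (the common "balanced" heuristic; the abstract does not state the exact formula used). A sketch:

```python
from collections import Counter

def class_weights(labels):
    # Inverse-frequency weights: weight(c) = n_samples / (n_classes * count(c)).
    # Rare classes get weights > 1, over-represented classes get weights < 1,
    # so each class contributes comparably to the training loss.
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * cnt) for c, cnt in counts.items()}
```

The resulting dictionary is the shape expected by most training loops that accept per-class loss weights; e.g., with three melanoma labels for every nevus label, the nevus class is weighted 2.0 and melanoma 0.667.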
Procedia PDF Downloads 86
12404 The Role of Information Technology in the Supply Chain Management
Authors: Azar Alizadeh, Mohammad Reza Naserkhaki
Abstract:
The application of IT systems for collecting and analyzing data can have a significant effect on the performance of any company. In the recent decade, advancements and achievements in the field of information technology have changed the industry compared to the previous decade. The adoption and application of information technology are one way for companies and their supply chains to achieve a distinctive competitive identity. The acceptance of IT and its proper implementation can reinforce and improve cooperation between different parts of the supply chain through the rapid transfer and distribution of precise information, and the application of informational systems leads to an increase in supply chain efficiency. The main objective of this research is to study the effects and applications of information technology on and in supply chain management and to introduce the factors affecting the acceptance of information technology in companies. Moreover, in order to understand the subject, we investigate the role and importance of information and electronic commerce in the supply chain and the characteristics of the supply chain from an information flow approach.
Keywords: electronic commerce, industry, information technology, management, supply chain, system
Procedia PDF Downloads 485
12403 Valorization of Waste and By-products for Protein Extraction and Functional Properties
Authors: Lorena Coelho, David Ramada, Catarina Nobre, Joaquim Gaião, Juliana Duarte
Abstract:
The development of processes that allow the valorization of waste and by-products generated by industries is crucial to promoting symbiotic relationships between different sectors and is mandatory to “close the loop” in the circular economy paradigm. In recent years, by-products and waste from the agro-food and forestry sectors have attracted attention due to their potential applications and technical characteristics. The extraction of bio-based active compounds for reuse is in line with circular bioeconomy trends, combining the use of renewable resources with process circularity, aiming at waste reduction and encouraging reuse and recycling. Among the different types of bio-based materials that are being explored and can be extracted, protein fractions are becoming an attractive new raw material. Within this context, the BioTrace4Leather project, a collaboration between two technological centres, CeNTI and CTIC, and a leather tanning and finishing company, Curtumes Aveneda, aims to develop innovative and biologically sustainable solutions for the leather industry and to meet the market's circularity trends. Specifically, it aims at the valorisation of waste and by-products from the tannery industry through protein extraction and the development of innovative and biologically sustainable materials. The results achieved show that keratin, gelatine, and collagen fractions can be successfully extracted from bovine hair and leather waste. These products could be reintegrated into the industrial manufacturing process to attain innovative and functional textile and leather substrates.
Acknowledgement: This work has been developed under the scope of BioTrace4Leather, a project co-funded by the Operational Program for Competitiveness and Internationalization (COMPETE) of PORTUGAL2020, through the European Regional Development Fund (ERDF), under grant agreement Nº POCI-01-0247-FEDER-039867.
Keywords: leather by-products, circular economy, sustainability, protein fractions
Procedia PDF Downloads 158
12402 Antioxidant Properties of Rice Bran Oil Using Various Heat Treatments
Authors: Supakan Rattanakon, Jakkrapan Boonpimon, Akkaragiat Bhuangsaeng, Aphiwat Ratriphruek
Abstract:
Rice bran oil (RBO) has been found to lower the level of serum cholesterol, to have antioxidant and anti-carcinogenic properties, and to attenuate allergic inflammation. These properties of RBO are due to its antioxidant constituents, especially phenolic compounds; the higher the amount of these active compounds in RBO, the greater its value. Thermal processing of rice bran before solvent extraction of RBO has been found to give higher phenolic contents. Therefore, the purpose of this study is to apply different heating methods to rice bran before solvent extraction. The % yield of RBO, total phenolic content (TPC), and antioxidant property of two white Thai rice varieties, KDML105 and RD6, were then determined. The Folin-Ciocalteu colorimetric assay was used to determine TPC, and scavenging of free radicals (DPPH) was used to determine the antioxidant property, expressed as EC50. The results showed that the thermal process did not increase the % yield of RBO but increased the TPC to 1.41 mg gallic acid equivalent (GAE mg-1). The highest TPC was found in KDML105 using a sonicator, and the highest antioxidant activity was found in RD6 using an autoclave; the EC50 of the RBO was 0.04 mg/mL. Further study should be performed on different pretreatments to increase the TPC and antioxidant property.
Keywords: antioxidant, rice bran oil, total phenol content, white rice
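EC50 in a DPPH assay is commonly read off the inhibition-versus-concentration curve as the concentration giving 50% radical scavenging. A minimal interpolation sketch (the data points below are invented for illustration and are not the study's measurements):

```python
import numpy as np

def ec50(conc, inhibition):
    # Linear interpolation of the concentration (mg/mL, ascending) at which
    # DPPH inhibition (%) crosses 50; assumes inhibition increases with dose.
    return float(np.interp(50.0, inhibition, conc))
```

Lower EC50 means stronger antioxidant activity, since less extract is needed to quench half the DPPH radicals; a full analysis would usually fit a dose-response model rather than interpolate linearly.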
Procedia PDF Downloads 253
12401 The Learning Styles Approach to Math Instruction: Improving Math Achievement and Motivation among Low Achievers in Kuwaiti Elementary Schools
Authors: Eisa M. Al-Balhan, Mamdouh M. Soliman
Abstract:
This study introduced learning styles techniques into mathematics teaching to improve mathematics achievement and motivation among Kuwaiti fourth- and fifth-grade low achievers. The study consisted of two groups: the control group (N = 212) received traditional math tutoring based on a textbook and the tutor’s knowledge of math, while the experimental group (N = 209) received math tutoring from instructors trained in the Learning Style™ approach. Three instruments were used: the Motivation Scale towards Mathematics; the Achievement in Mathematics Test; and the manual of the learning style approach indicating the individual’s preferred learning style: AKV, AVK, KAV, KVA, VAK, or VKA. The participating teachers taught to the detected learning style of each student or group. The findings show significant improvement in achievement and motivation towards mathematics in the experimental group. The outcome offers information on variables affecting achievement and motivation towards mathematics and demonstrates the leading role of Kuwait in education within the region.
Keywords: elementary school, learning style, math low achievers, SmartWired™, math instruction, motivation
Procedia PDF Downloads 110
12400 Application of Groundwater Level Data Mining in Aquifer Identification
Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen
Abstract:
Investigation and research are keys for conjunctive use of surface and groundwater resources. The hydrogeological structure is an important base for groundwater analysis and simulation. Traditionally, the hydrogeological structure is artificially determined based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, groundwater observation network has been built and a large amount of groundwater-level observation data are available. The groundwater level is the state variable of the groundwater system, which reflects the system response combining hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and decision tree. The developed methodology is then applied to groundwater layer identification of two groundwater systems: Zhuoshui River alluvial fan and Pingtung Plain. The abovementioned frequency analysis uses Fourier Transform processing time-series groundwater level observation data and analyzing daily frequency amplitude of groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the above mentioned analytical tools and optimizes the best estimation of the hydrogeological structure. 
The developed method achieves a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is an effective tool for identifying hydrogeological structures.
Keywords: aquifer identification, decision tree, groundwater, Fourier transform
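The daily-frequency signature that the frequency analysis looks for can be sketched in a few lines. The synthetic series below, its hourly sampling, and the signal amplitudes are illustrative assumptions, not the study's observation data.

```python
import math

# Hypothetical hourly groundwater-level series (63 days): artificial pumping
# superimposes a daily (24 h) oscillation on a slow seasonal trend.
n = 24 * 7 * 9
level = [10.0
         + 0.5 * math.sin(2 * math.pi * t / (24 * 365))  # slow seasonal trend
         + 0.3 * math.sin(2 * math.pi * t / 24)          # daily pumping signal
         for t in range(n)]

def amplitude_at_period(series, period):
    """Single-bin discrete Fourier transform: amplitude at the given period."""
    m = len(series)
    mean = sum(series) / m
    re = sum((x - mean) * math.cos(2 * math.pi * t / period)
             for t, x in enumerate(series))
    im = sum((x - mean) * math.sin(2 * math.pi * t / period)
             for t, x in enumerate(series))
    return 2.0 * math.hypot(re, im) / m

daily = amplitude_at_period(level, 24)       # recovers roughly the injected 0.3
weekly = amplitude_at_period(level, 24 * 7)  # near zero: no weekly signal present
print(round(daily, 2), round(weekly, 2))
```

A pronounced daily amplitude flags a groundwater level driven by artificial extraction, which is one of the inputs the decision tree combines with the rainfall cross-correlation and regression-curve information.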
Procedia PDF Downloads 157
12399 Efficient Energy Extraction Circuit for Impact Harvesting from High Impedance Sources
Authors: Sherif Keddis, Mohamed Azzam, Norbert Schwesinger
Abstract:
Harvesting mechanical energy from footsteps or other impacts is one way to enable wireless autonomous sensor nodes. These can be used for highly efficient control of connected devices such as lights, security systems, air-conditioning systems, or other smart-home applications, as well as for accurate location or occupancy monitoring. The mechanical energy can be converted into useful electrical energy using the piezoelectric effect, which offers simple harvesting setups and low deflections. The challenges facing piezoelectric transducers are the low achievable energy per impact, in the lower mJ range, and the management of such small energies. Simple setups for energy extraction, such as a full-wave bridge connected directly to a capacitor, are problematic due to the mismatch between high-impedance sources and low-impedance storage elements. Efficient energy circuits for piezoelectric harvesters are commonly designed for vibration harvesters and require periodic input energies with predictable frequencies. Due to the sporadic nature of impact harvesters, such circuits are not well suited. This paper presents a self-powered circuit that avoids the impedance mismatch during energy extraction by disconnecting the load until the source reaches its charge peak. The switch is implemented with passive components and works independently of the input frequency; the circuit is therefore suited to impact harvesting and sporadic inputs. For the same input energy, this circuit stores 150% of the energy stored by a capacitor connected directly to a bridge rectifier. The total efficiency, defined as the ratio of the energy stored on a capacitor to the available energy measured across a matched resistive load, is 63%.
Although the resulting energy is already sufficient to power certain autonomous applications, further optimization of the circuit is still under investigation to improve the overall efficiency.
Keywords: autonomous sensors, circuit design, energy harvesting, energy management, impact harvester, piezoelectricity
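The two figures of merit quoted above combine as in the following back-of-the-envelope sketch; the available energy per impact, the directly stored energy, and the 10 µF storage capacitor are assumed values chosen only to reproduce the reported ratios.

```python
import math

def cap_energy(c_farads, v_volts):
    # Energy stored on a capacitor: E = 1/2 * C * V^2
    return 0.5 * c_farads * v_volts ** 2

available = 1.0e-3          # assumed available energy per impact (1 mJ),
                            # measured across a matched resistive load
direct = 0.42e-3            # assumed energy stored by a capacitor connected
                            # directly to a bridge rectifier
switched = 1.5 * direct     # the presented circuit stores 150% of that

efficiency = switched / available            # total efficiency: stored / available
v_stored = math.sqrt(2 * switched / 10e-6)   # resulting voltage on a 10 uF cap
print(round(efficiency, 2), round(v_stored, 1))
```

With these assumed numbers the total efficiency comes out at the reported 63%; only the 150% ratio and the efficiency definition are taken from the abstract.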
Procedia PDF Downloads 154
12398 Production of Biodiesel from Avocado Waste in Hossana City, Ethiopia
Authors: Tarikayehu Amanuel, Abraham Mohammed
Abstract:
The production of biodiesel from waste materials is becoming an increasingly important research area in the field of renewable energy. One potential waste-material source is avocado, a fruit with a large seed and peel that are typically discarded after consumption. This research investigates the feasibility of using avocado waste as a feedstock for the production of biodiesel. The study focuses on extracting oil from the waste material by Soxhlet extraction, characterizing the properties of the oil to determine its suitability, and converting it to biodiesel by transesterification. The study was conducted experimentally, and a maximum oil yield of 11.583% (150 g of oil from 1.295 kg of avocado-waste powder) was obtained at an extraction time of 4 h. An 87% fatty acid methyl ester (biodiesel) conversion was also obtained using a methanol-to-oil ratio of 6:1, 1.3 g NaOH, a reaction time of 60 min, and a reaction temperature of 65°C. Furthermore, from 145 ml of avocado-waste oil, 126.15 ml of biodiesel was produced, corresponding to the high conversion of 87%. In conclusion, the produced biodiesel showed physical and chemical characteristics comparable to those of the standard biodiesel samples considered in the study. The results of this research could help to identify a new source of biofuel production while also addressing the issue of waste disposal in the food industry.
Keywords: biodiesel, avocado, transesterification, soxhlet extraction
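The reported yield and conversion figures are internally consistent, as a quick arithmetic check shows (all numbers are taken from the abstract above):

```python
# Mass-based oil yield: 150 g of oil from 1.295 kg of avocado-waste powder.
oil_g, feed_g = 150.0, 1295.0
oil_yield = 100.0 * oil_g / feed_g          # percent oil yield

# Volume-based biodiesel conversion: 126.15 ml from 145 ml of oil.
biodiesel_ml, oil_ml = 126.15, 145.0
conversion = 100.0 * biodiesel_ml / oil_ml  # percent conversion

print(round(oil_yield, 3), round(conversion, 0))  # the reported 11.583% and 87%
```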
Procedia PDF Downloads 70
12397 Tools for Transparency: The Role of Civic Technology in Increasing the Transparency of the State
Authors: Rebecca Rumbul
Abstract:
The operation of the state can often appear opaque to citizens wishing to access official information, who must negotiate a path through numerous levels of bureaucracy, rationalized through institutional policy, to acquire the information they want. Even where individual states have 'Right to Information' legislation guaranteeing citizen access to information, public-sector conformity to such laws varies between states and between state organizations. In response to such difficulties in bringing citizens and information together, many NGOs around the world have begun designing and hosting digital portals to facilitate the requesting and receiving of official information. How, then, are these 'civic technology' tools affecting the behavior of the state? Are they increasing its transparency? This study looked at five Right to Information civic technology sites in Chile, Uruguay, Ukraine, Hungary, and the UK, and found that such sites provide a useful platform for publishing official information, but that states are still reluctant to comply with all requests. It concludes that civic technology can be an important tool for increasing the transparency of the state, but that the state must have an institutional commitment to information rights for this to be fully effective.
Keywords: digital, ICT, transparency, civic technology
Procedia PDF Downloads 662
12396 Transfer Learning for Protein Structure Classification at Low Resolution
Authors: Alexander Hudson, Shaogang Gong
Abstract:
Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3 Å) resolution, using a deep convolutional neural network trained on high-resolution (≤3 Å) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high-resolution, low-resolution, and NMR-determined structures inhabit a common feature space, and thus provide a theoretical foundation for boosting with single-image super-resolution.
Keywords: transfer learning, protein distance maps, protein structure classification, neural networks
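The 2D-matrix representation mentioned above is, per the keywords, a protein distance map. A minimal sketch of building one from backbone coordinates follows; the four Cα positions are invented for illustration, where real inputs would come from PDB files.

```python
import math

def distance_map(coords):
    """Pairwise Euclidean distance matrix from a list of (x, y, z) coordinates."""
    n = len(coords)
    return [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]

# Four hypothetical C-alpha positions (in angstroms).
ca = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (3.8, 3.8, 0.0), (0.0, 3.8, 0.0)]
dmap = distance_map(ca)

# The matrix is symmetric with a zero diagonal; such maps (optionally
# thresholded into binary contact maps) serve as the network's 2D input
# "images", independent of side-chain detail.
print(round(dmap[0][2], 2))
```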
Procedia PDF Downloads 136
12395 The Capacity of Mel Frequency Cepstral Coefficients for Speech Recognition
Authors: Fawaz S. Al-Anzi, Dia AbuZeina
Abstract:
Speech recognition makes an important contribution to new technologies in human-computer interaction. Today, there is a growing need to employ speech technology in daily life and business activities. However, speech recognition is a challenging task that requires several stages before the desired output is obtained. Among the components of automatic speech recognition (ASR) is the feature extraction process, which parameterizes the speech signal to produce the corresponding feature vectors. The feature extraction process aims to approximate the linguistic content conveyed by the input speech signal. Several methods exist in the speech processing field to extract speech features; however, Mel Frequency Cepstral Coefficients (MFCC) is the most popular technique. It has long been observed that MFCC is dominantly used in well-known recognizers such as the Carnegie Mellon University (CMU) Sphinx and the Hidden Markov Model Toolkit (HTK). Hence, this paper focuses on the MFCC method as the standard choice for identifying the different speech segments in order to obtain the language phonemes for further training and decoding steps. Owing to MFCC's good performance, previous studies show that MFCC dominates Arabic ASR research. In this paper, we demonstrate MFCC as well as the intermediate steps performed to obtain these coefficients using the HTK toolkit.
Keywords: speech recognition, acoustic features, mel frequency, cepstral coefficients
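The mel scale underlying the MFCC filterbank can be sketched directly. The filter count and the 16 kHz sampling rate below are arbitrary illustrative choices; the 2595/700 constants are the commonly used mel-scale formula.

```python
import math

def hz_to_mel(f_hz):
    """Common mel-scale mapping used when spacing the MFCC filterbank."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(m):
    """Inverse mapping, back from mel to hertz."""
    return 700.0 * (10 ** (m / 2595.0) - 1.0)

# Centre frequencies of a small filterbank, equally spaced in mel between
# 0 Hz and the Nyquist frequency of 16 kHz audio.
low, high, n_filters = hz_to_mel(0.0), hz_to_mel(8000.0), 10
centres = [mel_to_hz(low + (high - low) * i / (n_filters + 1))
           for i in range(1, n_filters + 1)]

# Spacing is near-linear below ~1 kHz and logarithmic above, mirroring
# human pitch perception; 1000 Hz maps to about 1000 mel by construction.
print(round(hz_to_mel(1000.0)))
```

The full MFCC pipeline then applies this filterbank to the power spectrum of each frame, takes logs, and applies a discrete cosine transform to obtain the coefficients.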
Procedia PDF Downloads 259
12394 Arsenic Speciation in Cicer arietinum: A Terrestrial Legume That Contains Organoarsenic Species
Authors: Anjana Sagar
Abstract:
Arsenic-poisoned groundwater is a major concern in South Asia. Arsenic enters the food chain not only through drinking water but also through the use of arsenic-polluted water for irrigation. Arsenic is highly toxic in its inorganic forms, whereas its organic forms are comparatively less toxic. In terrestrial plants, arsenic is found predominantly in inorganic form; however, we found that a significant proportion of organic arsenic was present in the roots and shoots of a staple legume, chickpea (Cicer arietinum L.). Chickpea plants were raised in pot culture on soils spiked with arsenic ranging from 0 to 70 mg arsenate per kg of soil. Total arsenic concentrations in chickpea shoots and roots, determined by inductively coupled plasma mass spectrometry (ICP-MS), ranged from 0.76 to 20.26 and 2.09 to 16.43 µg g⁻¹ dry weight, respectively. Information on arsenic species was acquired by a methanol/water extraction method, with the species analyzed by high-performance liquid chromatography (HPLC) coupled with ICP-MS. Dimethylarsinic acid (DMA) was the only organic arsenic species found, accounting for 0.02 to 3.16% of the total shoot arsenic and 0 to 6.93% of the total root arsenic, respectively. To investigate the source of the organic arsenic in chickpea plants, arsenic species in the rhizosphere soils of the plants were also examined. The absence of organic arsenic in the soils suggests that the DMA is formed within the plants. The present investigation provides useful information for a better understanding of the distribution of arsenic species in terrestrial legume plants.
Keywords: arsenic, arsenic speciation, dimethylarsinic acid, organoarsenic
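As a worked example of the fractions above, pairing the maximum total shoot concentration with the maximum DMA percentage purely for illustration (the two maxima need not come from the same plant):

```python
shoot_total = 20.26   # ug/g dry weight: maximum total As in shoot (from the abstract)
dma_pct = 3.16        # maximum DMA share of total shoot As, percent (from the abstract)

dma = shoot_total * dma_pct / 100.0   # organic (DMA) fraction, ug/g
inorganic = shoot_total - dma         # remainder, the inorganic As fraction
print(round(dma, 2), round(inorganic, 2))
```

Even at its maximum, DMA remains a small fraction of the total arsenic; its consistent presence in the plant, while absent from the rhizosphere soil, is the notable finding.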
Procedia PDF Downloads 138
12393 An Investigation for Information Asymmetry Nexus IPO Under-Pricing: A Case of Pakistan
Authors: Saqib Mehmood, Naveed Iqbal Chaudhry, Asif Mehmood
Abstract:
This study investigates information asymmetry theories of IPO under-pricing in Pakistan. The purpose of the study is to validate the information asymmetry about firm value that leads to under-pricing. A total of 55 IPOs listed from 2000 to 2011 were included in this study. OLS multiple regression was applied to achieve the objectives of the study. The findings confirm the significance of information asymmetry for under-pricing in Pakistan and have implications for issuing firms and prospective investors.
Keywords: information asymmetry, initial public offerings, under-pricing, firm value
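A minimal sketch of the kind of OLS regression applied above, with a single information-asymmetry proxy; both series are invented for illustration and are not the study's 55 IPOs.

```python
# First-day under-pricing regressed on an information-asymmetry proxy
# (e.g. an ex-ante uncertainty measure). Illustrative data only.
proxy        = [0.10, 0.25, 0.40, 0.55, 0.70]
underpricing = [0.05, 0.12, 0.18, 0.26, 0.33]

n = len(proxy)
mx = sum(proxy) / n
my = sum(underpricing) / n
beta = sum((x - mx) * (y - my) for x, y in zip(proxy, underpricing)) \
       / sum((x - mx) ** 2 for x in proxy)   # OLS slope
alpha = my - beta * mx                        # OLS intercept
print(round(beta, 3))
```

A positive, significant slope on the proxy is what would support the information-asymmetry explanation of under-pricing; the actual study uses multiple regressors.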
Procedia PDF Downloads 481
12392 Impact of Artificial Intelligence Technologies on Information-Seeking Behaviors and the Need for a New Information Seeking Model
Authors: Mohammed Nasser Al-Suqri
Abstract:
Existing information-seeking models were proposed more than two decades ago, prior to the evolution of the digital information era and Artificial Intelligence (AI) technologies. The lack of current information-seeking models within Library and Information Studies (LIS) has limited advances in teaching students about information-seeking behaviors and in the design of library tools and services. To better address these concerns, this study proposes a state-of-the-art model focused on the information-seeking behavior of library users in the Sultanate of Oman. The study aims to develop, design, and contextualize a real-time, user-centric information-seeking model capable of enhancing information needs and information usage, while incorporating critical insights for digital library practice. A further aim is to establish a far-sighted, state-of-the-art frame of reference covering AI while synthesizing digital resources and information to optimize information-seeking behavior. The study is empirically designed around a mixed-method process flow: technical surveys, in-depth interviews, focus-group evaluations, and stakeholder investigations. The study's data pool consists of users and specialist LIS staff at 4 public libraries and 26 academic libraries in Oman. The designed research model is expected to support LIS by providing multi-dimensional insights, with AI integration, for redefining the information-seeking process and developing a technology-rich model.
Keywords: artificial intelligence, information seeking, information behavior, information seeking models, libraries, Sultanate of Oman
Procedia PDF Downloads 115
12391 Feature Extraction and Impact Analysis for Solid Mechanics Using Supervised Finite Element Analysis
Authors: Edward Schwalb, Matthias Dehmer, Michael Schlenkrich, Farzaneh Taslimi, Ketron Mitchell-Wynne, Horen Kuecuekyan
Abstract:
We present a generalized feature extraction approach for supporting Machine Learning (ML) algorithms that perform tasks similar to Finite Element Analysis (FEA). We report results for estimating the Head Injury Categorization (HIC) of vehicle engine compartments across various impact scenarios. Our experiments demonstrate that models learned using features derived with a simple discretization approach provide a reasonable approximation of a full simulation. We observe that decision trees can be as effective as neural networks for the HIC task. The simplicity and performance of the learned decision trees offer a multiple-order-of-magnitude improvement in speed and cost over full simulation, in exchange for a reasonable approximation. Used as a complement to full simulation, the approach enables rapid approximate feedback to engineering teams before submission for full analysis. The approach produces mesh-independent features and is further agnostic of the assembly structure.
Keywords: mechanical design validation, FEA, supervised decision tree, convolutional neural network
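The "simple discretization approach" to mesh-independent features can be sketched as binning nodal values onto a fixed coarse grid; the 1D field, bin count, and threshold below are illustrative stand-ins for the paper's pipeline, not its actual features.

```python
# Mesh-independent feature extraction by discretization: node values are
# binned onto a fixed coarse grid and averaged, so meshes with different
# node counts yield feature vectors of the same length.
def grid_features(nodes, n_bins, x_max):
    """nodes: list of (x_position, field_value); returns per-bin mean values."""
    sums, counts = [0.0] * n_bins, [0] * n_bins
    for x, v in nodes:
        b = min(int(x / x_max * n_bins), n_bins - 1)
        sums[b] += v
        counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# The same quadratic field sampled on a fine (100-node) and coarse (10-node) mesh.
fine   = [(i / 100, (i / 100) ** 2) for i in range(100)]
coarse = [(i / 10,  (i / 10) ** 2)  for i in range(10)]
fa, fb = grid_features(fine, 5, 1.0), grid_features(coarse, 5, 1.0)

def hic_high(feats):
    # Single-split "decision stump", the simplest supervised decision tree.
    return feats[-1] > 0.5

print(len(fa), len(fb), hic_high(fa))
```

Because both meshes map to the same five features, a supervised decision tree can be trained across simulations with differing meshes and assembly structures.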
Procedia PDF Downloads 139
12390 Rural Women’s Skill Acquisition in the Processing of Locust Bean in Ipokia Local Government Area of Ogun State, Nigeria
Authors: A. A. Adekunle, A. M. Omoare, W. O. Oyediran
Abstract:
This study was carried out to assess rural women’s skill acquisition in the processing of locust bean in Ipokia Local Government Area of Ogun State, Nigeria. A simple random sampling technique was used to select 90 women locust bean processors. Data were analyzed with descriptive statistics and the Pearson Product Moment Correlation. The results showed that the mean age of respondents was 40.72 years, most (70.00%) of the respondents were married, and the mean processing experience was 8.63 years. 93.30% of the respondents relied on information from fellow locust bean processors and friends, and all (100%) of the respondents reported that they had not acquired improved processing skills through trainings or workshops. It can be concluded that the rural women’s acquisition of modernized processing techniques was generally low. It is hereby recommended that the rural women processors be trained by extension service providers through a series of workshops and seminars on improved processing techniques.
Keywords: locust bean, processing, skill acquisition, rural women
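The Pearson Product Moment Correlation used in the analysis can be computed directly; the experience and skill figures below are invented for illustration, not the survey responses.

```python
import math

def pearson_r(xs, ys):
    """Pearson Product Moment Correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative pairing: years of processing experience against a
# hypothetical skill-acquisition score.
experience = [2, 5, 8, 11, 15]
skill      = [3, 4, 6, 6, 9]
r = pearson_r(experience, skill)
print(round(r, 2))
```

A coefficient near +1 would indicate that skill acquisition rises with processing experience; the study tests associations of this kind between respondent characteristics and skill acquisition.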
Procedia PDF Downloads 461
12389 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation
Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano
Abstract:
Nowadays, artificial intelligence is used successfully in academia and industry for its ability to learn from large amounts of data. In particular, in recent years the use of machine learning algorithms in the field of e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, thus helping users manage the information overload to which they are exposed on a daily basis. Recently, international research has experimented with machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow users to interact with the system, express their requests, and receive suggestions. Interested users can access the web platform on the internet using a computer, tablet, or mobile phone, register, provide the necessary information, and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that makes its various functions available in an intuitive and simple way. The artificial intelligence algorithms have been implemented and trained on historical data collected from user browsing. Finally, the testing phase validated the implemented model, which will be further tested by letting customers use it.
Keywords: machine learning, recommender system, software platform, support vector machine
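The matching step of such a recommender can be illustrated with a bag-of-words cosine similarity. The real platform combines support vector machines with natural language processing, so this sketch stands in for the retrieval idea only, and the product catalogue is invented.

```python
import math

def bow(text):
    """Bag-of-words term counts for a short text."""
    counts = {}
    for w in text.lower().split():
        counts[w] = counts.get(w, 0) + 1
    return counts

def cosine(a, b):
    """Cosine similarity between two term-count dictionaries."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

products = {
    "trail shoes": "lightweight waterproof running shoes for trail use",
    "road bike":   "carbon frame road bike with disc brakes",
    "rain jacket": "waterproof breathable jacket for running and hiking",
}

# A user request, as it might arrive via the chatbot, matched against
# product descriptions; the highest-similarity product is suggested.
query = bow("waterproof shoes for running")
best = max(products, key=lambda p: cosine(query, bow(products[p])))
print(best)
```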
Procedia PDF Downloads 134
12388 Runtime Monitoring Using Policy-Based Approach to Control Information Flow for Mobile Apps
Authors: Mohamed Sarrab, Hadj Bourdoucen
Abstract:
Mobile applications are verified to check their correctness or evaluated to check their performance with respect to specific security properties such as availability, integrity, and confidentiality. Before applications are made available to end users, this is achievable only to a limited degree using static software-engineering verification techniques. The more sensitive the information processed by a mobile application, such as credit card data, personal medical information, or personal emails, the more important it is to ensure its confidentiality. Monitoring a non-trusted mobile application during execution in an environment where sensitive information is present is difficult and unnerving. This paper addresses the issue of monitoring and controlling the flow of confidential information during the execution of non-trusted mobile applications. The approach concentrates on providing a dynamic and usable information-security solution by interacting with mobile users at run time in response to information-flow events.
Keywords: mobile application, run-time verification, usable security, direct information flow
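The run-time flow control described above can be sketched as taint tracking with a policy consulted at the sink; the class, policy flag, and sink function below are illustrative inventions, not the paper's implementation.

```python
# Toy dynamic information-flow check: values derived from confidential
# sources carry a taint flag, and a run-time policy decides whether
# tainted data may flow to an untrusted sink.
class Tainted:
    def __init__(self, value, confidential=False):
        self.value, self.confidential = value, confidential

    def concat(self, other):
        # Taint propagates through any operation involving tainted data.
        return Tainted(self.value + other.value,
                       self.confidential or other.confidential)

def send_to_network(data, policy_allows_confidential=False):
    """Untrusted sink: the policy is checked here at run time, which is
    where a real system would also prompt the mobile user."""
    if data.confidential and not policy_allows_confidential:
        return "blocked: confidential flow"
    return "sent"

card = Tainted("4111-xxxx", confidential=True)   # confidential source
greeting = Tainted("hello ")                     # public data
msg = greeting.concat(card)                      # taint propagates into msg
print(send_to_network(greeting), "|", send_to_network(msg))
```

In the paper's approach, the corresponding decision point is where the user is consulted interactively during execution instead of the flow being silently blocked.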
Procedia PDF Downloads 381