Search results for: multi-juice extractor
41 Design and Characterization of a CMOS Process Sensor Utilizing Vth Extractor Circuit
Authors: Rohana Musa, Yuzman Yusoff, Chia Chieu Yin, Hanif Che Lah
Abstract:
This paper presents the design and characterization of a low power Complementary Metal Oxide Semiconductor (CMOS) process sensor. The design is targeted for implementation using Silterra’s 180 nm CMOS process technology. The proposed process sensor employs a voltage threshold (Vth) extractor architecture for detection of variations in the fabrication process. The process sensor generates output voltages in the range of 401 mV (fast-fast corner) to 443 mV (slow-slow corner) at nominal conditions. The power dissipation of this process sensor is 6.3 µW at a supply voltage of 1.8 V, with a silicon area of 190 µm × 60 µm. Preliminary results from the fabricated process sensor indicate close agreement between measured and simulated results.
Keywords: CMOS process sensor, PVT sensor, threshold extractor circuit, Vth extractor circuit
Procedia PDF Downloads 175
40 Seashore Debris Detection System Using Deep Learning and Histogram of Gradients-Extractor Based Instance Segmentation Model
Authors: Anshika Kankane, Dongshik Kang
Abstract:
Marine debris has a significant influence on coastal environments, damaging biodiversity and causing loss and damage to the marine and ocean sector. A functional, cost-effective, and automatic approach has been used to address this problem. Computer vision combined with a deep learning-based model is proposed to identify and categorize marine debris of seven kinds at different beach locations in Japan. This research compares state-of-the-art deep learning models with a suggested model architecture that is utilized as a feature extractor for debris categorization. The model is proposed to detect seven categories of litter using a manually constructed debris dataset, with the help of Mask R-CNN for instance segmentation and a shape matching network called HOGShape, so that the litter can then be cleaned up on time by clean-up organizations using the warning notifications of the system. The manually constructed dataset for this system is created by annotating images taken by a fixed KaKaXi camera using the CVAT annotation tool with seven category labels. A HOG feature extractor pre-trained on LIBSVM is used, along with multiple template matching between HOG maps of images and HOG maps of templates, to improve the predicted masked images obtained via Mask R-CNN training. This system intends to alert the clean-up organizations in a timely manner with warning notifications using live recorded beach debris data. The suggested network improves misclassified debris masks for debris objects with different illuminations, shapes, and viewpoints, and for litter with occlusions and vague visibility.
Keywords: computer vision, debris, deep learning, fixed live camera images, histogram of gradients feature extractor, instance segmentation, manually annotated dataset, multiple template matching
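To make the HOG-plus-template-matching step concrete, the minimal sketch below computes HOG descriptors for a camera-frame region and a debris template and scores the pair by cosine similarity. It is an illustrative assumption of how such matching could be wired up with scikit-image, not the authors' actual HOGShape pipeline; the file names, cell sizes, and acceptance threshold are hypothetical.

```python
import numpy as np
from skimage import io, color, transform
from skimage.feature import hog

def hog_descriptor(img_gray, size=(128, 128)):
    """Resize to a fixed size and compute a HOG descriptor vector."""
    img = transform.resize(img_gray, size, anti_aliasing=True)
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Hypothetical file names: one Mask R-CNN region crop and one debris template.
region = color.rgb2gray(io.imread("candidate_region.png"))
template = color.rgb2gray(io.imread("plastic_bottle_template.png"))

score = cosine(hog_descriptor(region), hog_descriptor(template))
print(f"HOG similarity: {score:.3f}")
if score > 0.6:   # assumed acceptance threshold
    print("Region matches the template class; keep/refine the predicted mask.")
```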
Procedia PDF Downloads 106
39 Statistical Feature Extraction Method for Wood Species Recognition System
Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof
Abstract:
Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical pore properties of the wood images. This paper proposes a fuzzy-based feature extractor which mimics the experts’ knowledge of wood texture to extract the properties of the pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. A backpropagation neural network is then used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts’ interpretation of wood texture, which allows human involvement when analyzing the texture. Experimental results show the efficiency of the proposed method.
Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images
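As a rough illustration of the classification stage, the sketch below trains a backpropagation neural network on 38-dimensional statistical feature vectors for 52 species classes using scikit-learn. The array shapes mirror the numbers quoted in the abstract (5200 images, 38 features, 52 species), but the random data, network size, and train/test split are placeholder assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(5200, 38))          # stand-in for the 38 pore-statistics features per image
y = rng.integers(0, 52, size=5200)       # stand-in labels for the 52 tropical wood species

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)  # backpropagation MLP
clf.fit(scaler.transform(X_train), y_train)
print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```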
Procedia PDF Downloads 425
38 Evaluation of an Air Energy Recovery System in Greenhouse Fed by an Axial Air Extractor
Authors: Eugueni Romantchik, Gilbero Lopez, Diego Terrazas
Abstract:
The recovery of residual wind energy from axial air extractors in greenhouses represents a constant source of clean energy production, which reduces production costs by reducing energy consumption costs. The objective of this work is to design, build and evaluate a residual wind energy recovery system. This system consists of a wind turbine placed at an optimal distance, a cone in the air discharge, and a mechanism to vary the blade angle of the wind turbine. The system energy balance was analyzed by measuring the main energy parameters such as voltage, amperage, air velocities and angular speeds of the rotors. Tests were carried out in a greenhouse with a Multifan 130 extractor (1.2 kW, 550 rpm and 1.3 m diameter), without and with the cone, using a wind turbine with 3 blades and a 1.2 m diameter. The implementation of the system allowed recovering up to 55% of the motor's energy. With the cone installed, the electric energy recovered increased by 10%. Experimentally, it was shown that changing the original angle of the wind turbine blades by 3 degrees increases the angular velocity by 17.7%.
Keywords: air energy, exhaust fan, greenhouse, wind turbine
Procedia PDF Downloads 163
37 Three-Stage Mining Metals Supply Chain Coordination and Product Quality Improvement with Revenue Sharing Contract
Authors: Hamed Homaei, Iraj Mahdavi, Ali Tajdin
Abstract:
One of the main concerns of miners is to increase the quality level of their products because the price of mining metals depends on their quality level; however, increasing the quality level of these products has different costs at different levels of the supply chain. These costs usually increase after the extractor level. This paper studies the coordination issue of a decentralized three-level supply chain with one supplier (extractor), one mineral processor and one manufacturer, in which the cost of increasing the product quality level is higher at the processor level than at the supplier level, and higher at the manufacturer level than at the processor level. We identify the optimal product quality level for each supply chain member by designing a revenue sharing contract. Finally, numerical examples show that the designed contract not only increases the final product quality level but also provides a win-win condition for all supply chain members and increases the whole supply chain profit.
Keywords: three-stage supply chain, product quality improvement, channel coordination, revenue sharing
Procedia PDF Downloads 183
36 Equivalent Electrical Model of a Shielded Pulse Planar Transformer in Isolated Gate Drivers for SiC MOSFETs
Authors: Loreine Makki, Marc Anthony Mannah, Christophe Batard, Nicolas Ginot, Julien Weckbrodt
Abstract:
Planar transformers are extensively utilized in high-frequency, high power density power electronic converters. The breakthrough of wide-bandgap technology compelled power electronic system miniaturization while inducing pivotal effects on system modeling and manufacturing within the power electronics industry. The requirement to mitigate electromagnetic disturbances makes it important to simulate and model the unanticipated parasitic parameters. This paper presents an equivalent circuit model of a shielded pulse planar transformer, quantifying the leakage inductance and resistance in addition to the interwinding capacitance of the primary and secondary windings. ANSYS Q3D Extractor was utilized to model and simulate the transformer, with the intention of studying the immunity of the simulated equivalent model to high dv/dt occurrences. A close correlation between simulation and experimental results is presented.
Keywords: planar transformers, wide-bandgap, equivalent circuit model, shielded, ANSYS Q3D Extractor, dv/dt
Procedia PDF Downloads 206
35 Comparative Evaluation of a Dynamic Navigation System Versus a Three-Dimensional Microscope in Retrieving Separated Endodontic Files: An in Vitro Study
Authors: Mohammed H. Karim, Bestoon M. Faraj
Abstract:
Introduction: This study aimed to compare the effectiveness of a Dynamic Navigation System (DNS) and a three-dimensional microscope in retrieving broken rotary NiTi files when using trepan burs and the extractor system. Materials and Methods: Thirty maxillary first bicuspids with sixty separate roots were split into two comparable groups based on a comprehensive Cone-Beam Computed Tomography (CBCT) analysis of root length and curvature. After standardized access opening, glide paths, and patency attainment with the K file (sizes 10 and 15), the teeth were arranged on 3D models (three per quadrant, six per model). Subsequently, controlled-memory heat-treated NiTi rotary files (#25/0.04) were notched 4 mm from the tips and fractured at the apical third of the roots. The C-FR1 Endo file removal system was employed under both guidance systems to retrieve the fragments, and the success rate, canal aberration, treatment time and volumetric changes were measured. The statistical analysis was performed using IBM SPSS software at a significance level of 0.05. Results: The microscope-guided group had a higher success rate than the DNS guidance, but the difference was insignificant (p > 0.05). In addition, the microscope-guided drills resulted in a substantially lower proportion of canal aberration, required less time to retrieve the fragments and caused minimal change in the root canal volume (p < 0.05). Conclusion: Although dynamically guided trephining with the extractor can retrieve separated instruments, it is inferior to three-dimensional microscope guidance regarding treatment time, procedural errors, and volume change.
Keywords: separated instruments retrieval, dynamic navigation system, 3D video microscope, trephine burs, extractor
Procedia PDF Downloads 69
34 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions
Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez
Abstract:
In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Wiedmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under acoustic reverberant environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (0 -anechoic-, 1, 2, and 3 s, approximately), as well as four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will dramatically decrease under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, and it has also been assessed under reverberant conditions with variations of recording distance. LNCC showed a performance as high as the state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected, compared with classical triangular filters, thus compensating for the music signal degradation and improving the accuracy of the chord recognition system.
Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval
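For readers unfamiliar with quarter-tone filterbanks, the sketch below builds a bank of triangular filters whose centers are spaced a quarter tone apart (24 per octave), applies it to an STFT magnitude spectrogram, and adds a crude local normalization over time. It is only a generic illustration of the idea behind LNQT-style features under assumed parameters (sample rate, FFT size, frequency range), not the filter design used by the authors.

```python
import numpy as np

def quarter_tone_filterbank(sr=22050, n_fft=8192, f_min=65.4, n_filters=120):
    """Triangular filters with centers spaced a quarter tone (24 per octave) apart."""
    centers = f_min * 2.0 ** (np.arange(n_filters + 2) / 24.0)  # one extra edge on each side
    fft_freqs = np.linspace(0.0, sr / 2.0, n_fft // 2 + 1)
    fb = np.zeros((n_filters, len(fft_freqs)))
    for i in range(n_filters):
        lo, ctr, hi = centers[i], centers[i + 1], centers[i + 2]
        rising = (fft_freqs - lo) / (ctr - lo)
        falling = (hi - fft_freqs) / (hi - ctr)
        fb[i] = np.clip(np.minimum(rising, falling), 0.0, None)
    return fb

def lnqt(spec_mag, fb, context=11):
    """Quarter-tone spectrogram with a simple local normalization over a sliding time window."""
    qt = fb @ spec_mag                                   # (n_filters, n_frames)
    pad = context // 2
    padded = np.pad(qt, ((0, 0), (pad, pad)), mode="edge")
    local_mean = np.stack([padded[:, t:t + context].mean(axis=1)
                           for t in range(qt.shape[1])], axis=1)
    return qt / (local_mean + 1e-8)

# Usage: spec_mag = |STFT| of an audio excerpt, shape (n_fft//2 + 1, n_frames)
```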
Procedia PDF Downloads 232
33 Comparative Evaluation of a Dynamic Navigation System Versus a Three-Dimensional Microscope in Retrieving Separated Endodontic Files: An in Vitro Study
Authors: Mohammed H. Karim, Bestoon M. Faraj
Abstract:
Introduction: Instrument separation is a common challenge in the endodontic field. Various techniques and technologies have been developed to improve the retrieval success rate. This study aimed to compare the effectiveness of a Dynamic Navigation System (DNS) and a three-dimensional microscope in retrieving broken rotary NiTi files when using trepan burs and the extractor system. Materials and Methods: Thirty maxillary first bicuspids with sixty separate roots were split into two comparable groups based on a comprehensive Cone-Beam Computed Tomography (CBCT) analysis of root length and curvature. After standardised access opening, glide paths, and patency attainment with the K file (sizes 10 and 15), the teeth were arranged on 3D models (three per quadrant, six per model). Subsequently, controlled-memory heat-treated NiTi rotary files (#25/0.04) were notched 4 mm from the tips and fractured at the apical third of the roots. The C-FR1 Endo file removal system was employed under both guidance systems to retrieve the fragments, and the success rate, canal aberration, treatment time and volumetric changes were measured. The statistical analysis was performed using IBM SPSS software at a significance level of 0.05. Results: The microscope-guided group had a higher success rate than the DNS guidance, but the difference was insignificant (p > 0.05). In addition, the microscope-guided drills resulted in a substantially lower proportion of canal aberration, required less time to retrieve the fragments and caused a minor change in the root canal volume (p < 0.05). Conclusion: Although dynamically guided trephining with the extractor can retrieve separated instruments, it is inferior to three-dimensional microscope guidance regarding treatment time, procedural errors, and volume change.
Keywords: dynamic navigation system, separated instruments retrieval, trephine burs and extractor system, three-dimensional video microscope
Procedia PDF Downloads 98
32 Measuring the Effect of Ventilation on Cooking in Indoor Air Quality by Low-Cost Air Sensors
Authors: Andres Gonzalez, Adam Boies, Jacob Swanson, David Kittelson
Abstract:
Concern about indoor air quality (IAQ) has been increasing due to its risk to human health. Smoking, sweeping, and stove and stovetop use are the activities with the largest contribution to indoor air pollution. Outdoor air pollution also affects IAQ. The most important factors in IAQ from cooking activities are the materials, fuels, foods, and ventilation. Low-cost, mobile air quality monitoring (LCMAQM) sensors are an accessible technology for assessing IAQ because of their lower cost compared to conventional instruments. IAQ was assessed using LCMAQM sensors during cooking activities in University of Minnesota graduate housing, evaluating different ventilation systems. The gases measured are carbon monoxide (CO) and carbon dioxide (CO2). The particles measured are particulate matter (PM) 2.5 micrometers (µm) and lung deposited surface area (LDSA). The measurements were conducted during April 2019 in the Como Student Community Cooperative (CSCC), a graduate housing complex at the University of Minnesota. The measurements were conducted using an electric stove for cooking. The amount and type of food and oil used for cooking are the same for each measurement. There are six measurements: two experiments measure air quality without any ventilation, two using an extractor as mechanical ventilation, and two using the extractor and open windows as mechanical and natural ventilation. The results of the experiments show that natural ventilation is the most efficient system for controlling particles and CO2. The natural ventilation reduces the concentration by 79% for LDSA and 55% for PM2.5, compared to no ventilation. In the same way, the CO2 concentration is reduced by 35%. A well-mixed vessel model was implemented to assess particle formation and decay rates. Removal rates by the extractor were significantly higher for LDSA, which is dominated by smaller particles, than for PM2.5, but in both cases much lower compared to the natural ventilation. There was significant day-to-day variation in particle concentrations under nominally identical conditions. This may be related to the fat content of the food. Further research is needed to assess the impact of the fat in food on particle generation.
Keywords: cooking, indoor air quality, low-cost sensor, ventilation
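The well-mixed vessel model mentioned above treats the kitchen as a single stirred volume in which a cooking source injects particles and ventilation plus deposition remove them. The sketch below integrates that balance, dC/dt = S/V − (λ_vent + λ_dep)·C, for an assumed source pulse; the volume, emission rate, and removal rates are illustrative placeholders, not values measured in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

V = 30.0            # assumed room volume, m^3
S = 5.0e9           # assumed particle emission rate while cooking, particles/min
lam_vent = 0.5      # assumed ventilation removal rate, 1/min
lam_dep = 0.1       # assumed deposition rate, 1/min
t_cook = 15.0       # cooking lasts the first 15 minutes

def dCdt(t, C):
    source = S / V if t < t_cook else 0.0
    return source - (lam_vent + lam_dep) * C[0]

sol = solve_ivp(dCdt, (0.0, 60.0), [0.0], max_step=0.1)
print("peak concentration:", sol.y[0].max(), "particles/m^3")

# Fitting ln(C) versus t on the decay portion (after t_cook) recovers the
# total removal rate lam_vent + lam_dep, which is how decay rates are
# typically extracted from measured concentration time series.
```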
Procedia PDF Downloads 113
31 Electrospray Plume Characterisation of a Single Source Cone-Jet for Micro-Electronic Cooling
Authors: M. J. Gibbons, A. J. Robinson
Abstract:
Increasing expectations on small form factor electronics to be more compact while increasing performance have driven conventional cooling technologies to a thermal management threshold. An emerging solution to this problem is electrospray (ES) cooling. ES cooling enables two-phase cooling by utilising Coulomb forces for energy-efficient fluid atomization. Generated charged droplets are accelerated to the grounded target surface by the applied electric field and the surrounding gravitational force. While in transit, the like-charged droplets enable plume dispersion and inhibit droplet coalescence. If the electric field is increased in the cone-jet regime, a subsequent increase in the plume spray angle has been shown. Droplet segregation in the spray plume has been observed, with primary droplets in the plume core and satellite droplets positioned on the periphery of the plume. This segregation is facilitated by inertial and electrostatic effects, a result corroborated by numerous authors. These satellite droplets are usually more densely charged and move at a lower relative velocity than the spray core due to the radial decay of the electric field. Previous experimental research by Gomez and Tang has shown that the number of droplets deposited on the periphery can be up to twice that of the spray core. This result has been substantiated by numerical models derived by Wilhelm et al., Oh et al. and Yang et al. Yang et al. showed from their numerical model that varying the extractor potential varies the dispersion radius of the plume proportionally. This research aims to investigate this dispersion density and the role it plays in the local heat transfer coefficient profile (h) of ES cooling. This will be carried out for different extractor–target separation heights (H2), working fluid flow rates (Q), and extractor applied potentials (V2). The plume dispersion will be recorded by spraying a 25 µm thick, Joule-heated steel foil and by recording the thermal footprint of the ES plume using a Flir A-40 thermal imaging camera. The recorded results will then be analysed by in-house developed MATLAB code.
Keywords: electronic cooling, electrospray, electrospray plume dispersion, spray cooling
Procedia PDF Downloads 397
30 To Study the Effect of Drying Temperature Towards Extraction of Aquilaria subintegra Dry Leaves Using Vacuum Far Infrared
Authors: Tengku Muhammad Rafi Nazmi Bin Tengku Razali, Habsah Alwi
Abstract:
This article is based on the effect of temperature on the extraction of Aquilaria subintegra, whose main habitat is tropical Asia and which is particularly often found in its native Thailand. There is a claim that Aquilaria subintegra contains antipyretic properties that help fight fever. Recent research has also shown that consuming paracetamol can have adverse effects on consumers. The samples were first dried using vacuum far infrared, which provides better drying than a conventional oven. A Soxhlet extractor was used to extract oil from the samples. A gas chromatography mass spectrometer was used to analyze the samples and determine their compounds. The objectives of this research were to determine the active ingredients that exist in Aquilaria subintegra leaves and to determine whether the compound acetaminophen exists in the leaves. Moisture content at 40°C was 80%, at 50°C was 62%, and at 60°C was 36%; the greater the temperature, the lower the moisture content inside the sample leaves. Seven components were identified in the sample at T=40°C, while only five components were identified in the samples at T=50°C and T=60°C. Four components were commonly identified in all three samples: n-Hexadecanoic acid, 9,12,15-Octadecatrienoic acid methyl ester (z,z,z), Vitamin E and Squalene. Further studies are needed with a new series of temperatures to refine the best results.
Keywords: Aquilaria subintegra, vacuum far infrared, Soxhlet extractor, gas chromatography mass spectrometer, paracetamol
Procedia PDF Downloads 484
29 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction
Authors: Patricia Jiménez, Rafael Corchuelo
Abstract:
Nowadays the World Wide Web is the most popular source of information and relies on billions of on-line documents. Web mining is used to crawl through these documents, collect the information of interest and process it by applying data mining tools in order to use the gathered information in the best interest of a business, which enables companies to promote themselves. Unfortunately, it is not easy to automatically extract the information a web site provides when it lacks an API that transforms the user-friendly data provided in web documents into a structured, machine-readable format. Rule-based information extractors are the tools intended to extract the information of interest automatically and offer it in a structured format that mining tools can process. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices regarding how to learn a rule may easily result in loss of effectiveness and/or efficiency. Improving search heuristics regarding efficiency is of utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, the Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, that is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms the previous alternative.
Keywords: information extraction, search heuristics, semi-structured documents, web mining
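For context, the Information Gain heuristic of Quinlan and Cameron-Jones (as used in FOIL-style top-down rule learning) scores a candidate rule refinement by how much it increases the purity of the covered examples, weighted by the positives it retains. The sketch below is a textbook formulation of that score, not the scoring code of the extractor evaluated in the paper; the example counts are hypothetical.

```python
import math

def foil_gain(p0: int, n0: int, p1: int, n1: int) -> float:
    """FOIL information gain for refining a rule.

    p0, n0: positive / negative examples covered before the refinement.
    p1, n1: positive / negative examples covered after the refinement.
    """
    if p1 == 0:
        return 0.0
    info_before = -math.log2(p0 / (p0 + n0))
    info_after = -math.log2(p1 / (p1 + n1))
    t = p1  # positives covered by both the original and the refined rule
    return t * (info_before - info_after)

# Hypothetical candidate conditions evaluated during one refinement step
print(foil_gain(p0=40, n0=60, p1=30, n1=10))  # purer coverage -> positive gain
print(foil_gain(p0=40, n0=60, p1=5, n1=20))   # loses positives and purity -> negative gain
```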
Procedia PDF Downloads 335
28 Optimization of Multistage Extractor for the Butanol Separation from Aqueous Solution Using Ionic Liquids
Authors: Dharamashi Rabari, Anand Patel
Abstract:
n-Butanol can be regarded as a potential biofuel. Being resistant to corrosion and having a high calorific value, butanol is a very attractive energy source compared to ethanol. Bio-butanol can be produced by a fermentation process called ABE (acetone, butanol, ethanol), carried out mostly by the bacterium Clostridium acetobutylicum. The major drawback of the process is that a butanol concentration higher than 10 g/L delays the growth of the microbes, resulting in a low yield. This indicates the need for simultaneous separation of butanol from the fermentation broth. Two hydrophobic ionic liquids (ILs), 1-butyl-1-methylpiperidinium bis(trifluoromethylsulfonyl)imide [bmPIP][Tf₂N] and 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide [hmim][Tf₂N], were chosen. The binary interaction parameters for both ternary systems, i.e. [bmPIP][Tf₂N] + water + n-butanol and [hmim][Tf₂N] + water + n-butanol, were taken from the literature, where they were generated with the NRTL model. Particle swarm optimization (PSO) with the isothermal sum rate (ISR) method was used to optimize the cost of the liquid-liquid extractor. For the [hmim][Tf₂N] + water + n-butanol system, PSO shows an 84% success rate with the number of stages equal to eight and a solvent flow rate equal to 461 kmol/hr. The number of stages was three with a 269.95 kmol/hr solvent flow rate for the [bmPIP][Tf₂N] + water + n-butanol system. Moreover, both ILs were very efficient, as the loss of ILs in the raffinate phase was negligible.
Keywords: particle swarm optimization, isothermal sum rate method, success rate, extraction
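To illustrate the kind of search PSO performs over the two design variables named above (number of stages and solvent flow rate), here is a minimal particle swarm optimizer for a generic extractor-cost function. The cost function, bounds, and PSO coefficients are placeholder assumptions standing in for the authors' ISR-based column costing, which is not reproduced here.

```python
import numpy as np

def extractor_cost(x):
    """Placeholder cost: penalizes many stages and high solvent flow (not the ISR model)."""
    n_stages, solvent_flow = round(x[0]), x[1]
    return 1500.0 * n_stages + 2.0 * solvent_flow + 5.0e5 / (n_stages * solvent_flow)

bounds = np.array([[2, 20], [50.0, 600.0]])     # assumed ranges: stages, solvent flow (kmol/hr)
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5

rng = np.random.default_rng(1)
pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([extractor_cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    vals = np.array([extractor_cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best design: stages =", round(gbest[0]), ", solvent flow =", round(gbest[1], 1), "kmol/hr")
```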
Procedia PDF Downloads 122
27 Medium-Scale Multi-Juice Extractor for Food Processing
Authors: Flordeliza L. Mercado, Teresito G. Aguinaldo, Helen F. Gavino, Victorino T. Taylan
Abstract:
Most fruits and vegetables are available in large quantities during peak season and are oftentimes marketed at a low price, left to rot, or fed to farm animals. The lack of efficient storage facilities, and the additional cost and unavailability of small machinery for food processing, result in low prices and wastage. Incidentally, processed fresh fruits and vegetables are gaining importance nowadays, and health-conscious people are also into ‘juicing’. One way to reduce wastage and ensure an all-season availability of crop juices at reasonable costs is to develop equipment for effective extraction of juice. The study was conducted to design, fabricate and evaluate a multi-juice extractor using locally available materials, making it relatively cheaper and affordable for medium-scale enterprises. The study was also conducted to formulate juice blends using the extracted juices and calamansi juice at different blending percentages, and to evaluate their chemical properties and sensory attributes. Furthermore, the chemical properties of the extracted meals were evaluated for future applications. The multi-juice extractor has an overall dimension of 963 mm x 300 mm x 995 mm, a gross weight of 82 kg and 5 major components, namely: feeding hopper, extracting chamber, juice and meal outlets, transmission assembly, and frame. The machine performance was evaluated based on juice recovery, extraction efficiency, extraction rate, extraction recovery, and extraction loss, considering apple and carrot as crop types with three replications each, and analyzed using a t-test. The formulated juice blends were subjected to sensory evaluation, and the data gathered were analyzed using Analysis of Variance appropriate for a Completely Randomized Design. Results showed that the machine’s juice recovery (73.39%), extraction rate (16.40 L/hr), and extraction efficiency (88.11%) for apple were significantly higher than for carrot, while extraction recovery (99.88%) was higher for apple than for carrot. Extraction loss (0.12%) was lower for apple than for carrot, but was not significantly affected by crop. Based on adding a percentage mark-up to the extraction cost (Php 2.75/kg), the breakeven weight and payback period for a 35% mark-up are 4,710.69 kg and 1.22 years, respectively; for a 50% mark-up, the breakeven weight is 3,492.41 kg and the payback period is 0.86 year (10.32 months). Results of the sensory evaluation of juice blends showed that the type of juice significantly influenced all the sensory parameters, while the blending percentage, including their interaction, had no significant effect on any sensory parameter, making the apple-calamansi juice blend more preferred than the carrot-calamansi juice blend in terms of all the sensory parameters. The machine’s performance is higher for apple than for carrot, and the cost analysis on the use of the machine revealed that it is financially viable, with a payback period of 1.22 years (35% mark-up) or 0.86 year (50% mark-up) for the machine cost, generating an income of Php 23,961.60 and Php 34,444.80 per year using 35% and 50% mark-up, respectively. The juice blends were of good quality based on the values obtained in the chemical analysis, and the extracted meal could also be used to produce another product based on the values obtained from proximate analysis.
Keywords: food processing, fruits and vegetables, juice extraction, multi-juice extractor
Procedia PDF Downloads 324
26 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids
Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo
Abstract:
Ionic liquids offer excellent advantages over conventional solvents for industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation, to support development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable and less toxic than fluorinated hydrophobic ionic liquids. This process modelling work concerns the selection and/or development of suitable models for the physical properties, the distribution coefficients, the mass transfer phenomena, the extractor unit and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and by COSMO-RS calculations. Parameters of the distribution coefficient models are estimated by fitting the model to published experimental extraction equilibrium results. The mass transfer model applies Newman’s hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the surface mean diameter of liquid droplets of the dispersed phase, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases. Empirical models for predicting the density and viscosity of solutions under different metal loadings are also fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation is set up to replicate a published experiment and compare model predictions with the experimental results. This simulation model is implemented in gPROMS software for dynamic process simulation. The results of single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results. The estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published data for the diffusion coefficients of various metals in this ionic liquid. A sensitivity study with this simulation model demonstrates the usefulness of the models for process design. The simulation approach has potential to be extended to account for other metals, acids, and solvents for process development, design, and optimisation of extraction processes applying ionic liquids for metals separations, although a lack of experimental data is currently limiting the accuracy of models within the whole framework.
Future work will focus on process development more generally and on extractive separation of rare earths using ionic liquids.
Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium
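A single extractor stage of the kind described above (a continuous stirred tank with interphase mass transfer and a distribution coefficient) can be written as two coupled balances. The sketch below integrates such a stage with SciPy; the flow rates, hold-ups, distribution coefficient, and mass transfer coefficient are illustrative assumptions, and the model is a generic CSTR sketch, not the authors' gPROMS implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed stage parameters (illustrative only)
Q_aq, Q_il = 1.0, 1.0        # phase volumetric flows, L/min
V_aq, V_il = 5.0, 5.0        # phase hold-up volumes, L
kLa = 0.8                    # overall mass transfer coefficient x area per volume, 1/min
D = 10.0                     # distribution coefficient: C_il*/C_aq at equilibrium
C_aq_in, C_il_in = 2.0, 0.0  # inlet Co concentrations, g/L

def stage(t, y):
    C_aq, C_il = y
    # transfer flux driven by departure from equilibrium (two-film style driving force)
    N = kLa * (C_aq - C_il / D)              # g/(L*min), referenced to the aqueous volume
    dC_aq = Q_aq / V_aq * (C_aq_in - C_aq) - N
    dC_il = Q_il / V_il * (C_il_in - C_il) + N * V_aq / V_il
    return [dC_aq, dC_il]

sol = solve_ivp(stage, (0.0, 60.0), [0.0, 0.0], max_step=0.1)
C_aq_ss, C_il_ss = sol.y[:, -1]
print(f"steady state: aqueous {C_aq_ss:.3f} g/L, ionic liquid {C_il_ss:.3f} g/L")
print(f"stage extraction efficiency ~ {100 * (C_aq_in - C_aq_ss) / C_aq_in:.1f}%")
```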
Procedia PDF Downloads 190
25 SIFT and Perceptual Zoning Applied to CBIR Systems
Authors: Simone B. K. Aires, Cinthia O. de A. Freitas, Luiz E. S. Oliveira
Abstract:
This paper contributes to CBIR systems applied to trademark retrieval. The proposed model includes aspects of the visual perception of shapes, by means of a feature extractor associated with a non-symmetrical perceptual zoning mechanism based on the principles of Gestalt. The feature set was computed using the Scale Invariant Feature Transform (SIFT). We carried out experiments using four different zoning strategies (Z = 4, 5H, 5V, 7) for matching and retrieval tasks. Our proposed method achieved a normalized recall (Rn) equal to 0.84. Experiments show that non-symmetrical zoning can be considered a tool for building more reliable trademark retrieval systems.
Keywords: CBIR, Gestalt, matching, non-symmetrical zoning, SIFT
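As a rough sketch of zoned SIFT extraction, the code below divides a trademark image into horizontal strips (one plausible reading of a "5H" zoning) and collects SIFT descriptors per zone with OpenCV. The zone layout and descriptor aggregation are assumptions for illustration; the paper's non-symmetrical, Gestalt-based zoning is not reproduced here.

```python
import cv2
import numpy as np

def zoned_sift(gray_img, n_zones=5):
    """Extract SIFT descriptors separately for each horizontal zone of the image."""
    sift = cv2.SIFT_create()
    h = gray_img.shape[0]
    edges = np.linspace(0, h, n_zones + 1, dtype=int)
    zone_descriptors = []
    for top, bottom in zip(edges[:-1], edges[1:]):
        zone = gray_img[top:bottom, :]
        _, desc = sift.detectAndCompute(zone, None)
        zone_descriptors.append(desc if desc is not None else np.empty((0, 128)))
    return zone_descriptors

# Hypothetical usage on a trademark image file
img = cv2.imread("trademark.png", cv2.IMREAD_GRAYSCALE)
for i, desc in enumerate(zoned_sift(img)):
    print(f"zone {i}: {desc.shape[0]} SIFT descriptors")
```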
Procedia PDF Downloads 313
24 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables – historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours’ worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting, which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, where words often incorrectly translated during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing where significant words in a document are selected, was run on each paragraph for all interviews. Every proper noun was put into a data structure corresponding to that respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours’ worth of data to 20. The data was further processed through light, manual observation – any summaries which proved to fit the criteria of the proposed deliverable were selected, as well as their locations within the document. This narrowed the data down to about 5 hours’ worth of processing. The qualitative researchers were then able to find 8 more connections in addition to our previous 4, exceeding our minimum quota of 3 to satisfy the grant. Major findings of the study and subsequent curation of this methodology raised a conceptual finding crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this trade-off. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
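The paragraph-filtering, proper-noun extraction, and BART summarization steps described above can be sketched roughly as follows with spaCy and Hugging Face Transformers. The 300-character cutoff and speaker-turn splitting come from the abstract, but the specific models, splitting convention, and data structures are assumptions; this is an illustrative reconstruction, not the authors' code.

```python
import spacy
from transformers import pipeline

nlp = spacy.load("en_core_web_sm")                       # assumed spaCy model for proper-noun tagging
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")  # a standard BART summarizer

def process_transcript(text):
    # Split on speaker changes (assumed to appear as blank-line separated turns)
    paragraphs = [p.strip() for p in text.split("\n\n")]
    # Drop banter/side comments: paragraphs shorter than 300 characters
    paragraphs = [p for p in paragraphs if len(p) >= 300]

    # Collect every proper noun mentioned in this interview
    proper_nouns = {tok.text for p in paragraphs for tok in nlp(p) if tok.pos_ == "PROPN"}

    # Summarize only the paragraphs that mention at least one proper noun,
    # keeping each summary's location for later manual review
    summaries = []
    for i, p in enumerate(paragraphs):
        if any(name in p for name in proper_nouns):
            out = summarizer(p, max_length=60, min_length=15, truncation=True)
            summaries.append((i, out[0]["summary_text"]))
    return proper_nouns, summaries
```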
Procedia PDF Downloads 28
23 Development of Locally Fabricated Honey Extracting Machine
Authors: Akinfiresoye W. A., Olarewaju O. O., Okunola, Okunola I. O.
Abstract:
An indigenous honey-extracting machine was designed, fabricated and evaluated at the workshop of the Department of Agricultural Technology, Federal Polytechnic, Ile-Oluji, Nigeria, using locally available materials. It has an extraction unit, a presser, a honey collector and a frame. The harvested honeycomb is placed inside the cylindrical extraction unit, which has perforated holes. The press plate is then placed on the comb, while a 3-ton hydraulic press is placed on it, supported by the frame. The hydraulic press, which is manually operated, forces the honey out of the extraction chamber through the perforated holes into the honey collector positioned at the lowest part of the extraction chamber. The honey-extracting machine has an average throughput of 2.59 kg/min and an efficiency of about 91%. The cost of producing the honey-extracting machine is NGN 31,700.00 (thirty-one thousand, seven hundred naira), or about $70 at NGN 452.8 to the dollar. This cost is affordable for beekeepers and would-be honey entrepreneurs. The honey-extracting machine is easy to operate and maintain without any complex technical know-how.
Keywords: honey, extractor, cost, efficiency
Procedia PDF Downloads 77
22 Alternator Fault Detection Using Wigner-Ville Distribution
Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi
Abstract:
This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure considers three states of machine condition, namely a shortened brush, a high impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate feature classifier. In this work, an ANN (Artificial Neural Network) and an SVM (support vector machine) were compared to determine the more suitable classifier, with performance evaluated by the mean squared error criterion. The modules work together to detect possible faulty conditions of operating machines. To test the method's performance, a signal database was prepared by creating different conditions on a laboratory setup. By implementing this method, satisfactory results were achieved.
Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution
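For reference, the discrete Wigner-Ville distribution used as the feature extractor can be computed as the Fourier transform over lag of the instantaneous autocorrelation of the analytic signal. The sketch below is a straightforward NumPy/SciPy implementation under assumed signal parameters; it is a generic illustration rather than the feature extraction code used in the paper.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a real 1-D signal (rows: frequency, cols: time)."""
    z = hilbert(x)                      # analytic signal suppresses negative-frequency cross-terms
    n = len(z)
    wvd = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)
        taus = np.arange(-tau_max, tau_max + 1)
        kernel = z[t + taus] * np.conj(z[t - taus])   # instantaneous autocorrelation at time t
        r = np.zeros(n, dtype=complex)
        r[taus % n] = kernel
        wvd[:, t] = np.fft.fft(r).real
    return wvd

# Example: a 1 kHz test tone sampled at 8 kHz; features (e.g., energy per band)
# would be computed from this time-frequency map and fed to the ANN/SVM classifier.
fs, dur = 8000, 0.05
t = np.arange(int(fs * dur)) / fs
signal = np.sin(2 * np.pi * 1000 * t)
tf_map = wigner_ville(signal)
print(tf_map.shape)
```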
Procedia PDF Downloads 374
21 Tyrosine Rich Fraction as an Immunomodulatory Agent from Ficus Religiosa Bark
Authors: S. A. Nirmal, G. S. Asane, S. C. Pal, S. C. Mandal
Abstract:
Objective: Ficus religiosa Linn (Moraceae) is used in traditional medicine to improve immunity; hence the present work was undertaken to validate this use scientifically. Material and Methods: Dried, powdered bark of F. religiosa was extracted successively using petroleum ether and 70% ethanol in a Soxhlet extractor. The extracts obtained were screened for immunomodulatory activity by the delayed type hypersensitivity (DTH) test, the neutrophil adhesion test and cyclophosphamide-induced neutropenia in Swiss albino mice at doses of 50 and 100 mg/kg, i.p. The 70% ethanol extract showed significant immunostimulant activity and was hence subjected to column chromatography to produce a tyrosine rich fraction (TRF). The TRF obtained was screened for immunomodulatory activity by the above methods at a dose of 10 mg/kg, i.p. Results: TRF showed potentiation of the DTH response in terms of a significant increase in the mean difference in foot-pad thickness, and it significantly increased neutrophil adhesion to nylon fibers by 48.20%. The percentage reductions in total leukocyte count and neutrophils by TRF were found to be 43.85% and 18.72%, respectively. Conclusion: The immunostimulant activity of TRF was more pronounced, and thus it has great potential as a source for natural health products.
Keywords: Ficus religiosa, immunomodulatory, cyclophosphamide, neutropenia
Procedia PDF Downloads 446
20 Effects of Pressure and Temperature on the Extraction of Benzyl Isothiocyanate by Supercritical Fluids from Tropaeolum majus L. Leaves
Authors: Espinoza S. Clara, Gamarra Q. Flor, Marianela F. Ramos Quispe S. Miguel, Flores R. Omar
Abstract:
Tropaeolum majus L. is a plant native to South and Central America, used since ancient times by our ancestors to combat different diseases. Glucotropaeolin is one of its main components, which, when hydrolyzed, forms benzyl isothiocyanate (BIT), which promotes cellular apoptosis (programmed cell death in cancer cells). Therefore, the present research aims to evaluate the effect of the pressure and temperature of BIT extraction by supercritical CO2 from Tropaeolum majus L. The extraction was carried out in a Speed SFE BASIC supercritical fluid extractor (Poly Science); the leaves of Tropaeolum majus L. were ground for one hour and lyophilized until a moisture content of 6% was obtained. The extraction with supercritical CO2 was carried out at pressures of 200 bar and 300 bar and temperatures of 50°C, 60°C and 70°C, the six treatments being obtained by the combination of these conditions. BIT was identified by thin layer chromatography using 98% BIT as the standard and hexane:dichloromethane (4:2) as the mobile phase. Subsequently, BIT quantification was performed by high performance liquid chromatography (HPLC). The highest yield of oleoresin by supercritical CO2 extraction was obtained at a pressure of 300 bar and a temperature of 60°C, while the highest BIT content, 113.615 ± 0.03 mg BIT/100 g dry matter, was obtained at a pressure of 200 bar and 70°C for 30 minutes.
Keywords: solvent extraction, Tropaeolum majus L., supercritical fluids, benzyl isothiocyanate
Procedia PDF Downloads 438
19 Audio-Visual Recognition Based on Effective Model and Distillation
Authors: Heng Yang, Tao Luo, Yakun Zhang, Kai Wang, Wei Qin, Liang Xie, Ye Yan, Erwei Yin
Abstract:
Recent years have seen that audio-visual recognition shows great potential in strong-noise environments. Existing audio-visual recognition methods have explored ResNet and feature fusion. However, on the one hand, ResNet always occupies a large amount of memory resources, restricting its application in engineering. On the other hand, feature merging also introduces some interference in a high-noise environment. In order to solve these problems, we propose an effective framework with bidirectional distillation. First, in consideration of its good performance in feature extraction, we chose the lightweight model EfficientNet as our extractor of spatial features. Secondly, self-distillation was applied to learn more information from the raw data. Finally, we propose bidirectional distillation in decision-level fusion. In more detail, our experimental results are based on a multi-modal dataset from 24 volunteers. Eventually, the lipreading accuracy of our framework was increased by 2.3% compared with existing systems, and our framework made progress in audio-visual fusion in a high-noise environment compared with an audio-only recognition system.
Keywords: lipreading, audio-visual, EfficientNet, distillation
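The distillation idea referenced here is commonly implemented as a temperature-softened KL divergence between a student's and a teacher's logits; in a bidirectional setting, each modality branch plays both roles. The PyTorch sketch below shows that generic loss under assumed temperature and weighting; it is a standard formulation for illustration, not the exact loss the authors used.

```python
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, labels, T=3.0, alpha=0.5):
    """Cross-entropy on hard labels plus temperature-softened KL to the teacher."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="batchmean") * (T * T)
    return alpha * ce + (1.0 - alpha) * kd

# Bidirectional use at decision level: each branch distills from the other.
audio_logits = torch.randn(8, 40)            # stand-in batch of 8, 40 classes
visual_logits = torch.randn(8, 40)
labels = torch.randint(0, 40, (8,))

loss_audio = distill_loss(audio_logits, visual_logits.detach(), labels)
loss_visual = distill_loss(visual_logits, audio_logits.detach(), labels)
total_loss = loss_audio + loss_visual
print(total_loss.item())
```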
Procedia PDF Downloads 134
18 Conversion of Atmospheric Carbon Dioxide into Minerals at Room Conditions by Using Sea Water Plus Various Additives
Authors: Muthana A. M. Jamel Al-Gburi
Abstract:
Elimination of carbon dioxide (CO2) gas from the atmosphere is very important but complicated, since the amounts of carbon dioxide and other greenhouse gases in the atmosphere are increasing, caused mainly by certain human activities and the burning of fossil fuels, and this leads to global warming. Global warming raises the earth's temperature to a higher level and, at the same time, creates tornadoes and storms. In this project, we apply a new technique for extracting carbon dioxide directly from the air and converting it into useful minerals and nanoscale carbon fibers by using several chemical processes and reactions. This could lead to an economical and healthy way to make valuable building materials, and it may even work as a weapon against environmental change. In our device (Carbon Dioxide Domestic Extractor), we use ocean seawater to dissolve the CO₂ gas and then convert it into carbonate minerals by using a number of additives such as shampoo, clay, and MgO. Note that atmospheric air, which includes CO₂ gas, is circulated through the seawater by an air pump. In addition, we use a number of chemical agents to convert the acid formed in the water into useful minerals. After constructing the system, we carried out intense experiments and investigations to find the optimum chemical agent, which must work at environmental conditions. Further to that, we measure the solubility of CO₂ and other salts in the seawater.
Keywords: global warming, CO₂ gas, ocean-sea water, additives, solubility level
Procedia PDF Downloads 111
17 Optimization of the Production Processes of Biodiesel from a Locally Sourced Gossypium herbaceum and Moringa oleifera
Authors: Ikechukwu Ejim
Abstract:
This research project addresses the optimization of biodiesel production from Gossypium herbaceum (cottonseed) and Moringa oleifera seeds. The Soxhlet extraction method, using n-hexane for Gossypium herbaceum (cottonseed) and ethanol for Moringa oleifera, was used for solvent extraction. 1250 ml of oil was obtained from both Gossypium herbaceum (cottonseed) and Moringa oleifera seeds before characterization. In the transesterification process, a 4-factor, 3-level experiment was conducted using an optimal design of Response Surface Methodology. The effects of methanol/oil molar ratio, catalyst concentration (%), temperature (°C) and time (mins) on the yield of methyl ester for both cottonseed and Moringa oleifera oils were determined. The design consisted of 25 experimental runs (5 lack-of-fit points, 5 replicate points, 0 additional center points and I-optimality) and provided sufficient information to fit a second-degree polynomial model. The experimental results suggested that the optimum conditions were as follows: for cottonseed, yield (96.231%), catalyst concentration (0.972%), temperature (55°C), time (60 mins) and methanol/oil molar ratio (8/1), respectively, while the Moringa oleifera optimum values were yield (80.811%), catalyst concentration (1.0%), temperature (54.7°C), time (30 mins) and methanol/oil molar ratio (8/1), respectively. These optimized conditions were validated against the actual biodiesel yield in experimental trials and the literature.
Keywords: optimization, Gossypium herbaceum, Moringa oleifera, biodiesel
Procedia PDF Downloads 146
16 Extracting the Atmospheric Carbon Dioxide and Converting It into Useful Minerals at Room Conditions
Authors: Muthana A. M. Jamel Al-Gburi
Abstract:
Elimination of carbon dioxide (CO2) gas from our atmosphere is very important but complicated, since there is always an increase in the amounts of this gas and the other greenhouse gases in our atmosphere, caused by both some human activities and the burning of fossil fuels, which leads to the global warming phenomenon, i.e., increasing the earth's temperature to a higher level and creating desertification, tornadoes and storms. In our present research project, we constructed our own system to extract carbon dioxide directly from the atmospheric air at room conditions and investigated how to convert the gas into a useful mineral or nanoscale carbon fibers by using several chemical processes and chemical reactions, leading to a valuable building material and also mitigating negative environmental change. In the present water pool system (Carbon Dioxide Domestic Extractor), ocean-sea water was used to dissolve the CO2 gas from the room and convert it into carbonate minerals by using a number of additives such as shampoo, clay and MgO. Note that the atmospheric air, which includes CO2 gas, is circulated through the sea water by an air pump connected to perforated tubes fixed deep on the pool base. These chemical agents were mixed with the ocean-sea water to convert the acid formed by the water-CO2 reaction into a useful mineral. After we successfully constructed the system, we carried out intense experiments and investigations on the CO2 gas reduction level and found which active chemical agent works best at atmospheric conditions.
Keywords: global warming, CO₂ gas, ocean-sea water, additives, solubility level
Procedia PDF Downloads 80
15 The Influence of Temperature on Apigenin Extraction from Chamomile (Matricaria recutita) by Superheated Water
Authors: J. Švarc-Gajić, A. Cvetanović
Abstract:
Apigenin is a flavone synthesized by many plants and quite abundant in chamomile (Matricaria recutita) in its free form and in the form of its glucoside and different acylated forms. Many beneficial health effects have been attributed to apigenin, such as chemo-preventive, anxiolytic, anti-inflammatory, antioxidant and antispasmodic effects. It is reported that free apigenin is much more bioactive in comparison to its bound forms. Subcritical water offers numerous advantages in comparison to conventional extraction techniques, such as good selectivity, low price and safety. Superheated water exhibits a high hydrolytical potential, which must be carefully balanced when using this solvent for the extraction of bioactive molecules. A moderate hydrolytical potential can be exploited to liberate apigenin from its bound forms, thus increasing the biological potential of the obtained extracts. The polarity of pressurized water and its hydrolytical potential are highly dependent on the temperature. In this research, chamomile ligulate flowers were extracted by pressurized hot water in a home-made subcritical water extractor under conditions of convective mass transfer. The influence of the extraction temperature was investigated at 30 bar. Extraction yields of total phenols, total flavonoids and apigenin depending on the operational temperature were calculated based on spectrometric assays. The optimal extraction temperature for maximum yields of total phenols and flavonoids proved to be 160°C, whereas the apigenin yield was the highest at 120°C.
Keywords: superheated water, temperature, chamomile, apigenin
Procedia PDF Downloads 481
14 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V
Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev
Abstract:
Extraction of vanadium and nickel compounds is complex due to the high stability of their porphyrins; nickel is a catalytic poison which deactivates catalysts during the catalytic cracking of the oil, while vanadyl is abrasive and a valuable metal. Thus, the high concentration of Ni and V in crude oil makes their removal relevant. Two methods of crude oil demetallization were tested; therefore, the present research is conducted as a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and the electrochemical method. The percentage of Ni extraction reached a maximum of approximately 55% using the electrochemical method in an electrolysis cell, which was developed for this research and consists of three sections: an oil and protonating agent (EtOH) solution between two conducting membranes, which divide it from two capsules of 10% sulfuric acid, and two graphite electrodes which cover all three parts in an electrical circuit. Ions of metals pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%, a current in the range from 0.3 A to 0.4 A, and a voltage that changed from 12.8 V to 17.3 V. The maximum efficiency of deasphalting, with cyclohexane as the solvent in a Soxhlet extractor, was 66.4% for Ni and 51.2% for V. Thus, applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry) and AAS (atomic absorption spectroscopy), these metal extraction methods were compared in this paper.
Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering
Procedia PDF Downloads 234
13 Using Autoencoder as Feature Extractor for Malware Detection
Authors: Umm-E-Hani, Faiza Babar, Hanif Durad
Abstract:
Malware-detection approaches suffer many limitations, due to which all anti-malware solutions have failed to be reliable enough for detecting zero-day malware. Signature-based solutions depend upon signatures that can be generated only when malware surfaces at least once in the cyber world. Another approach, which works by detecting the anomalies caused in the environment, can easily be defeated by diligently and intelligently written malware. Solutions that have been trained to observe behavior for detecting malicious files have failed to cater to malware capable of detecting a sandboxed or protected environment. Machine learning and deep learning-based approaches greatly suffer when their models are trained with either an imbalanced dataset or an inadequate number of samples. AI-based anti-malware solutions that have been trained with enough samples have targeted a selected feature vector, thus ignoring the contribution of the leftover features to the maliciousness of malware, just to cope with the lack of underlying hardware processing power. Our research focuses on producing an anti-malware solution for detecting malicious PE files that circumvents the aforementioned shortcomings. Our proposed framework, which is based on automated feature engineering through autoencoders, trains the model over a fairly large dataset. It focuses on the visual patterns of malware samples to automatically extract the meaningful part of the visual pattern. Our experiment has successfully produced a state-of-the-art accuracy of 99.54% on test data.
Keywords: malware, autoencoders, automated feature engineering, classification
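The core idea, turning PE bytes into grayscale images and letting an autoencoder's bottleneck serve as the learned feature vector, can be sketched as below in PyTorch. The image size, latent dimension, and downstream classifier are assumptions for illustration; the authors' actual architecture and training setup are not reproduced here.

```python
import torch
import torch.nn as nn

IMG = 64 * 64          # assumed size of the grayscale "malware image" (flattened)
LATENT = 128           # assumed bottleneck size used as the feature vector

class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(),
                                     nn.Linear(1024, LATENT), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(),
                                     nn.Linear(1024, IMG), nn.Sigmoid())

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in batch of flattened, normalized malware images
batch = torch.rand(32, IMG)

# 1) Unsupervised reconstruction training (automated feature engineering)
recon, _ = model(batch)
loss = loss_fn(recon, batch)
opt.zero_grad(); loss.backward(); opt.step()

# 2) After training, the encoder output is the feature vector fed to a
#    separate malicious/benign classifier (e.g., a small MLP or SVM).
with torch.no_grad():
    features = model.encoder(batch)     # shape: (32, LATENT)
print(features.shape)
```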
Procedia PDF Downloads 72
12 Isolation, Characterization and Quantitation of Anticancer Constituent from Chloroform Extract of N. arbortristis L. Leaves
Authors: Parul Grover, K. A. Suri, Raj Kumar, Gulshan Bansal
Abstract:
Background: Nyctanthes arbortristis Linn is traditionally used as an anticancer herb in the Indian system of medicine, but its introduction into the modern system of medicine is still awaited due to a lack of systematic scientific studies. Objective: The objective of the present study was to isolate and characterize anticancer phytoconstituents from N. arbortristis L. leaves based on bioactivity-guided fractionation. Method: Different extracts of the leaves of the plant were prepared in a Soxhlet extractor. Each extract was evaluated for anticancer activity against HL-60 cell lines. The chloroform and HA extracts showed potent anticancer activity and hence were selected for fractionation. Fraction C1 from the chloroform extract was found to be the most potent of all when tested against three cell lines (HL-60, A-549, and HCT-116) and was thus selected for further fractionation, from which a pure compound, CP-01, was isolated. An RP-HPLC method was developed for quantification of the isolated compound using a Kinetex C-18 column with gradient elution at 0.7 mL/min and a mobile phase containing potassium dihydrogen phosphate (0.01 M, pH 3.0) with acetonitrile. The wavelength of maximum absorption (λₘₐₓ) selected was 210 nm. Results: The structure of the potent anticancer compound CP-01 was determined on the basis of spectroscopic methods such as IR, ¹H-NMR, ¹³C-NMR and mass spectrometry, and it was characterized as 1,1,2-tris(2’,4’-di-tert-butylbenzene)-4,4-dimethyl-pent-1-ene. The content of CP-01 was found to be 0.88% w/w of the chloroform extract and 0.08% w/w of N. arbortristis leaves. Conclusion: The study supports the traditional use of N. arbortristis as an anticancer herb, and the identified compound CP-01 can serve as an excellent lead to develop potent and safe anticancer drugs.
Keywords: anticancer, HL-60 cell lines, Nyctanthes arbor-tristis, RP-HPLC
Procedia PDF Downloads 147