Search results for: topic extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3266

2666 Olive Leaf Extract as Natural Corrosion Inhibitor for Pure Copper in 0.5 M NaCl Solution: A Study by Voltammetry around OCP

Authors: Chahla Rahal, Philippe Refait

Abstract:

An oleuropein-rich extract from olive leaf and its acid hydrolysates, rich in hydroxytyrosol and elenolic acid, were prepared under different experimental conditions. These phenolic compounds may be used as corrosion inhibitors. The inhibitive action of these extracts and their major constituents on the corrosion of copper in 0.5 M NaCl solution was evaluated by potentiodynamic polarization, electrochemical impedance spectroscopy (EIS), and weight loss measurements. The extraction product was analyzed by high performance liquid chromatography (HPLC), which showed that olive leaf extract is rich in phenolic compounds, mainly oleuropein (OLE), hydroxytyrosol (HT), and elenolic acid (EA). After acid hydrolysis and extraction at high temperature, an increase in hydroxytyrosol concentration was detected, coupled with a relatively low oleuropein content and a high concentration of elenolic acid. The potentiodynamic measurements have shown that this extract acts as a mixed-type corrosion inhibitor, and good inhibition efficiency is observed as the HT and EA concentrations increase. These results suggest that the inhibitive effect of olive leaf extract might be due to the adsorption of the various phenolic compounds onto the copper surface.

Keywords: olive leaf extract, oleuropein, hydroxytyrosol, elenolic acid, copper, corrosion, HPLC/DAD, polarization, EIS

Procedia PDF Downloads 241
2665 Hydrogeological Study of Shallow and Deep Aquifers in Balaju-Boratar Area, Kathmandu, Central Nepal

Authors: Hitendra Raj Joshi, Bipin Lamichhane

Abstract:

Groundwater is the main source of water for the industries of the Balaju Industrial District (BID) and the denizens of the Balaju-Boratar area. Groundwater availability in the area is in a far more critical condition than in earlier days. Water levels in shallow wells have dropped markedly, and deep wells no longer provide an adequate amount of water because the extraction rate exceeds the recharge rate. The main recharge zone of the shallow aquifer lies at the foot of Nagarjuna mountain, where recent colluvial debris has accumulated. Urbanization in the area is the main reason for the declining water table. The recharge source for the deep aquifer in the region is aquiclude leakage. The sand layer above the Kalimati clay is the shallow aquifer zone, which is limited only to Balaju and the eastern part of Boratar, while the layer below the Kalimati clay, spreading around the Gongabu, Machhapohari, and Balaju areas, is considered a potential deep aquifer zone. Overextraction of groundwater without considering the water balance in the aquifers may dry out the source and initiate land subsidence. Hence, those responsible for the industries in the BID area and the denizens of the Balaju-Boratar area should be encouraged to practice artificial groundwater recharge.

Keywords: aquiclude leakage, Kalimati clay, groundwater recharge

Procedia PDF Downloads 476
2664 Comparative Analysis of Costs and Well Drilling Techniques for Water, Geothermal Energy, Oil and Gas Production

Authors: Thales Maluf, Nazem Nascimento

Abstract:

The development of society relies heavily on the total amount of energy obtained and its consumption. Over the years, there has been an advancement in energy attainment, which is directly related to certain natural resources and developing systems. Some of these resources should be highlighted for their remarkable presence in the world's energy grid, such as water, petroleum, and gas, while others deserve attention for representing an alternative to diversify the energy grid, like geothermal sources. Because all these resources are extracted from underground, drilling wells is a mandatory exploration activity, and it requires a prior geological study and adequate preparation. It also involves a cleaning process and an extraction process that can be executed by different procedures. For that reason, this research aims at enhancing exploration processes through a comparative analysis of drilling costs and of the techniques used to drill the wells. The analysis is based on a bibliographical review of books, scientific papers, and academic works, and mainly explores drilling methods and technologies, equipment used, well measurements, extraction methods, and production costs. Besides the techniques and costs of the drilling processes, some properties and general characteristics of these sources are also compared. Preliminary studies show that there are major differences in the exploration processes, mostly because these resources are naturally distinct. Water wells, for instance, are typically a few hundred meters long because water is stored close to the surface, while oil, gas, and geothermal production wells can reach thousands of meters, which makes them more expensive to drill. The drilling methods present some general similarities, especially regarding the main mechanism of perforation, but since water is stored closer to the surface than the other resources, there is a wider variety of methods. Water wells can be drilled by rotary mechanisms, percussion mechanisms, rotary-percussion mechanisms, and some other simpler methods. Oil and gas production wells, on the other hand, require rotary or rotary-percussion drilling with a dedicated structure called a drill rig and resistant materials for the drill bits and the other components, mostly because these resources are stored in sedimentary basins that can be located thousands of meters underground. Geothermal production wells also require rotary or rotary-percussion drilling and require both an injection well and an extraction well. The exploration efficiency also depends on the permeability of the soil, which is why Enhanced Geothermal Systems (EGS) have been developed. Throughout this review study, it is verified that the analysis of the extraction processes of energy resources is essential, since these resources are responsible for society's development. Furthermore, the comparative analysis of costs and well drilling techniques for water, geothermal energy, oil, and gas production, which is the main goal of this research, can enable the growth of the energy generation field through the emergence of ideas that improve the efficiency of energy generation processes.

Keywords: drilling, water, oil, gas, geothermal energy

Procedia PDF Downloads 128
2663 Extraction and Antibacterial Studies of Oil from Three Mango Kernel Obtained from Makurdi, Nigeria

Authors: K. Asemave, D. O. Abakpa, T. T. Ligom

Abstract:

The ability of bacteria to develop resistance to many antibiotics cannot be underestimated, given the multifaceted health challenges of the present times. For this reason, a lot of attention is on botanicals and their products in the search for new antibacterial agents. On the other hand, mango kernel oils (MKO) can be heavily valorized by taking advantage of the myriad bioactive phytochemicals they contain. Herein, we validated the use of MKO as a bioactive agent against bacteria. The MKOs for the study were extracted by Soxhlet extraction with ethanol and hexane for 4 h from three different mango kernels, namely 'local' (sample A), 'julie' (sample B), and 'john' (sample C). Prior to the extraction, the seed kernels were dried in an oven at 100 °C for 8 h and ground into fine particles. Hexane gave a higher yield of the oils than ethanol. It was also qualitatively confirmed that the mango kernel oils contain phytochemicals such as phenols, quinones, saponins, and terpenoids. The results of the antibacterial activities of the MKOs against both Gram-positive (Staphylococcus aureus) and Gram-negative (Pseudomonas aeruginosa) bacteria at different concentrations showed that the oils extracted with ethanol gave better antibacterial properties than those extracted with hexane. Moreover, the bioactivities were best with the local mango kernel oil. Indeed, this work validates the previous claim that MKOs are effective antibacterial agents. Thus, these oils (especially the ethanol-derived ones) can be used as bacteriostatic and antibacterial agents in, for example, the food, cosmetics, and allied industries.

Keywords: bacteria, mango, kernel, oil, phytochemicals

Procedia PDF Downloads 133
2662 Extracting Therapeutic Grade Essential Oils from the Lamiaceae Plant Family in the United Arab Emirates (UAE): Highlights on Great Possibilities and Severe Difficulties

Authors: Suzan M. Shahin, Mohammed A. Salem

Abstract:

Essential oils are expensive phytochemicals produced and extracted from specific species belonging to particular families in the plant kingdom. In the United Arab Emirates (UAE), which is located in the arid region of the world, nine species from the Lamiaceae family have the capability to produce therapeutic grade essential oils. These species include Mentha spicata, Ocimum forskolei, Salvia macrosiphon, Salvia aegyptiaca, Salvia macilenta, Salvia spinosa, Teucrium polium, Teucrium stocksianum, and Zataria multiflora. Although such potential species are indigenous to the UAE, there are almost no studies available that investigate the chemical composition and the quality of the extracted essential oils under UAE climatological conditions. Therefore, great attention has to be given to such valuable natural resources through conducting well-supported research projects tailored to UAE conditions and investigating different extraction techniques, including the application of the latest available technologies, such as supercritical fluid CO2. This is crucially needed in order to realize the greatest possibilities in the medicinal field, specifically the discovery of new therapeutic chemotypes, as well as to achieve the sustainability of this natural resource in the country.

Keywords: essential oils, extraction techniques, Lamiaceae, traditional medicine, United Arab Emirates (UAE)

Procedia PDF Downloads 447
2661 Is there Anything Useful in That? High Value Product Extraction from Artemisia annua L. in the Spent Leaf and Waste Streams

Authors: Anike Akinrinlade

Abstract:

The world population is estimated to grow from 7.1 billion to 9.22 billion by 2075, an increase of 23% over the current global population. Much of the demographic change up to 2075 will take place in the less developed regions. There are currently 54 countries which fall under the bracket of 'low-middle income' economies and which need new ways to generate valuable products from the resources currently available. Artemisia annua L. is widely used for the extraction of the phytochemical artemisinin, which accounts for around 0.01 to 1.4% of the dry weight of the plant. Artemisinin is used in the treatment of malaria, a disease rampant in sub-Saharan Africa and in many other countries. Once artemisinin has been extracted, the spent leaf and waste streams are disposed of as waste. A feasibility study was carried out looking at increasing the biomass value of A. annua by designing a biorefinery where spent leaf and waste streams are utilized for high value product generation. Quercetin, ferulic acid, dihydroartemisinic acid, artemisinic acid, and artemisinin were screened for in the waste stream samples and the spent leaf. The analytical results showed that artemisinin, artemisinic acid, and dihydroartemisinic acid were present in the waste extracts, as well as camphor and arteannuin B. Ongoing efforts are looking at using more industrially relevant solvents to extract the phytochemicals from the waste fractions and investigating how microwave pyrolysis of spent leaf can be utilized to generate bio-products.

Keywords: high value product generation, bioinformatics, biomedicine, waste streams, spent leaf

Procedia PDF Downloads 327
2660 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold-standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for subsequent analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping from the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the edge of the brain-skull border, with dedicated expansion, curvature, and advection terms). In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the result to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We concluded that the automatic method applied in this work can be used for the brain extraction process and brain volume quantification in MRI. The development and use of computer programs can contribute to assisting health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future works, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
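
As a minimal illustration of the second step described above (counting mask voxels and converting to cubic centimeters), the following Python sketch assumes a binary brain mask stored as a NIfTI file and uses the nibabel library; the file name is hypothetical, and this is only the volume-quantification idea, not the authors' pipeline.

```python
# Minimal sketch of brain volume quantification from a binary segmentation mask.
# Assumes a NIfTI brain mask produced by a prior skull-stripping step (file name is hypothetical).
import numpy as np
import nibabel as nib

def brain_volume_cc(mask_path):
    img = nib.load(mask_path)                  # load the segmentation mask
    data = img.get_fdata()                     # voxel array
    voxel_dims = img.header.get_zooms()[:3]    # voxel size in mm along x, y, z
    voxel_volume_mm3 = float(np.prod(voxel_dims))
    n_brain_voxels = int(np.count_nonzero(data > 0.5))
    return n_brain_voxels * voxel_volume_mm3 / 1000.0   # 1 cc = 1000 mm^3

print(brain_volume_cc("patient01_brain_mask.nii.gz"))
```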

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 128
2659 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model

Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou

Abstract:

The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data, ferocious competition requiring more accurate pricing, etc. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to apply pricing to several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the largest split of the data to determine the model parameters. The remaining part of the data was used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model. For long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We will discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we will discuss the regulations in place in the insurance industry. Finally, we will discuss the maintenance of the model, the fact that new data does not arrive continuously, and the fact that some metrics can take a long time to become meaningful.
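
As an illustration of the train/test split and the Gini evaluation mentioned above, the sketch below fits a GAM to synthetic severity-style data and scores it with a normalized Gini coefficient; the pygam package, the features, and the data are assumptions for illustration, not the insurer's actual implementation.

```python
# Sketch: GAM fit on a synthetic insurance-style dataset, evaluated with a normalized Gini.
# pygam is an assumed stand-in GAM library; features, data and split are illustrative only.
import numpy as np
from pygam import GammaGAM, s, f

rng = np.random.default_rng(0)
n = 2000
driver_age = rng.uniform(18, 75, n)
vehicle_age = rng.uniform(0, 20, n)
territory = rng.integers(0, 5, n).astype(float)
mean_loss = 400 + 8 * np.clip(60 - driver_age, 0, None) + 15 * vehicle_age + 50 * territory
y = rng.gamma(shape=2.0, scale=mean_loss / 2.0)            # positive, skewed losses
X = np.column_stack([driver_age, vehicle_age, territory])

split = int(0.8 * n)                                       # largest split used for training
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

gam = GammaGAM(s(0) + s(1) + f(2)).fit(X_train, y_train)   # smooth terms + factor term
pred = gam.predict(X_test)

def gini(actual, order_by):
    # Lorenz-curve style Gini: order actual losses by the ranking variable.
    order = np.argsort(-np.asarray(order_by))
    a = np.asarray(actual, dtype=float)[order]
    cum = np.cumsum(a) / a.sum()
    m = len(a)
    return cum.sum() / m - (m + 1) / (2.0 * m)

print("normalized Gini:", gini(y_test, pred) / gini(y_test, y_test))
```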

Keywords: insurance, data science, modeling, monitoring, regulation, processes

Procedia PDF Downloads 65
2658 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date; that is, they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that some subsets of the latent topics occur together with higher probability. We propose a probabilistic graphical model called focus-LDA to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
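
The focus-LDA model itself is not described in implementation detail here, so as a point of reference the sketch below fits the standard LDA baseline that focus-LDA extends, using scikit-learn on a toy review corpus; the corpus, the number of topics, and the perplexity check are illustrative assumptions only.

```python
# Baseline sketch: standard LDA applied to customer reviews, the model family focus-LDA extends.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "the battery life is great but the screen is dim",
    "excellent screen quality and a bright display",
    "battery drains fast and charging takes forever",
    "friendly staff and quick service at the store",
    "service was slow and the staff seemed rude",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(reviews)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topic = lda.fit_transform(X)               # per-review topic (candidate aspect) mixture

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):    # top words hint at candidate aspects
    top = topic.argsort()[::-1][:4]
    print(f"topic {k}:", [terms[i] for i in top])

# Held-out fit is commonly compared via perplexity, analogous to the paper's evaluation
# of predictive distributions over held-out documents.
print("perplexity:", lda.perplexity(X))
```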

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 81
2657 Recovery of Copper and Gold by Delamination of Printed Circuit Boards Followed by Leaching and Solvent Extraction Process

Authors: Kamalesh Kumar Singh

Abstract:

Due to the increasing volume of electronic waste, especially ICT-related gadgets, green recycling remains a great challenge. This article presents a two-stage, eco-friendly hydrometallurgical route for the recovery of gold from the delaminated metallic layers of waste mobile phone Printed Circuit Boards (PCBs). Initially, mobile phone PCBs are downsized (1x1 cm²) and treated with the organic solvent dimethylacetamide (DMA) to separate the metallic fraction from the non-metallic glass fiber. In the first stage, the liberated metallic sheets are used for the selective dissolution of copper in an aqueous leaching reagent. The influence of various parameters, such as the type of leaching reagent, solution concentration, temperature, time, and pulp density, was optimized for the effective leaching (almost 100%) of copper. Results have shown that 3 M nitric acid is a suitable reagent for copper leaching at room temperature; owing to its chemical characteristics, gold remained in the solid residue. In the second stage, the separated residue is used for the recovery of gold by using sulphuric acid in combination with a halide salt. In this halide leaching, Cl₂ or Br₂ is generated as an in-situ oxidant to improve the leaching of gold. Results have shown that almost 92% of the gold is recovered at the optimized parameters.

Keywords: printed circuit boards, delamination, leaching, solvent extraction, recovery

Procedia PDF Downloads 36
2656 Feasibility of Chicken Feather Waste as a Renewable Resource for Textile Dyeing Processes

Authors: Belayihun Missaw

Abstract:

Cotton cationization is an emerging area that addresses the environmental problems associated with the reactive dyeing of cotton. In this study, a keratin hydrolysate cationizing agent was extracted from chicken feathers and optimized to eliminate the use of salt during dyeing. Cationization of cotton using the extracted keratin hydrolysate and dyeing of the cationized cotton without salt were carried out. The effects of extraction conditions such as caustic soda concentration, temperature, and time on the protein yield from chicken feathers and on the colour strength (K/S) values were studied, and these process conditions were optimized. The optimum extraction conditions were 25 g/l caustic soda at 50 °C for 105 minutes, giving an average yield of 91.2% and a colour strength value of 4.32. The effects of salt addition, pH, and the concentration of the cationizing agent on colour strength were also studied and optimized. It was observed that slightly acidic conditions with a 4% (owf) concentration of cationizing agent give better dyeability compared with normal reactive dyeing of cotton. The physical properties of the cationized and dyed fabric were assessed, and the results reveal that cationization has an effect similar to that of normal dyeing of cotton. The cationization of cotton with keratin extract was found to be successful and economically viable.

Keywords: cotton materials, cationization, reactive dye, keratin hydrolysate

Procedia PDF Downloads 37
2655 Enabling Community Participation for Social Innovation in the Energy Sector

Authors: Budiman Ibnu

Abstract:

This study investigates the enabling conditions needed to facilitate social innovation in the energy sector. This is important to support the energy transition in Indonesia. The research provides appropriate project direction, including research (and action) gaps, for the energy actors in Indonesia, who can build on the results of this study to stimulate the energy transition. The report uses the systemic change framework, which recognizes four drivers of systemic change in a region: (1) transforming political ecologies; (2) configuring green economies; (3) building adaptive communities; and (4) social innovation. These drivers are interconnected, and this report particularly focuses on how social innovation can be supported by the other drivers. The study used interviews and a literature review as the main data collection methods. Interviews were conducted with eight experts on the topic from different countries that have experienced social innovation in the energy sector. Afterwards, related journal papers from the last five years were reviewed to check the latest developments on the topic and to support the interview results. The results show that the enabling conditions can focus on one of the drivers of systemic change, building communities by increasing their participation, through several integrated actions. This can be implemented in two types of citizen energy initiatives: energy cooperatives and sustainable consumption initiatives. Implementation requires study of the related policy and governance support in order to create complete enabling conditions that facilitate social innovation in the energy transition.

Keywords: enabling condition, social innovation, citizen initiatives, community participation

Procedia PDF Downloads 137
2654 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm

Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta

Abstract:

Robots play an important role in operations like pick and place, assembly, spot welding, and many more in manufacturing industries. Among these, assembly is a very important process in manufacturing, with 20% of manufacturing cost attributable to the assembly process. To carry out the assembly task effectively, Assembly Sequence Planning (ASP) is required. ASP is a multi-objective non-deterministic optimization problem; achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to solve the ASP problem, but these have several limitations, such as convergence to local optima, huge search spaces, long execution times, and complexity in applying the algorithm. Keeping the above limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the Artificial Bee Colony (ABC) algorithm. In this approach, assembly predicates are extracted automatically through a Computer-Aided Design (CAD) interface instead of manually; this reduces the time needed to extract the assembly predicates and obtain a feasible assembly sequence. The fitness evaluation of the obtained feasible sequences is carried out using the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.
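
To make the optimization step concrete, the following is a minimal sketch of an ABC search over assembly sequences represented as permutations of part indices; the cost function is a hypothetical stand-in for a fitness derived from CAD-extracted assembly predicates, and the colony parameters are illustrative.

```python
# Minimal Artificial Bee Colony (ABC) sketch over permutations (candidate assembly sequences).
# The toy cost function below is a placeholder for a predicate-based fitness, not the paper's.
import random

def neighbor(seq):
    s = seq[:]                                   # local move: swap two positions
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def abc_search(cost, n_parts, n_sources=10, limit=20, iters=200):
    sources = [random.sample(range(n_parts), n_parts) for _ in range(n_sources)]
    trials = [0] * n_sources
    best = min(sources, key=cost)
    for _ in range(iters):
        # Employed bees: try one neighbor per food source, keep it if better.
        for k in range(n_sources):
            cand = neighbor(sources[k])
            if cost(cand) < cost(sources[k]):
                sources[k], trials[k] = cand, 0
            else:
                trials[k] += 1
        # Onlooker bees: revisit sources with probability proportional to fitness.
        fits = [1.0 / (1.0 + cost(s)) for s in sources]
        total = sum(fits)
        for _ in range(n_sources):
            r, acc, k = random.uniform(0, total), 0.0, 0
            for k, fit in enumerate(fits):
                acc += fit
                if acc >= r:
                    break
            cand = neighbor(sources[k])
            if cost(cand) < cost(sources[k]):
                sources[k], trials[k] = cand, 0
            else:
                trials[k] += 1
        # Scout bees: abandon exhausted sources and explore randomly.
        for k in range(n_sources):
            if trials[k] > limit:
                sources[k], trials[k] = random.sample(range(n_parts), n_parts), 0
        best = min(sources + [best], key=cost)
    return best

# Toy cost: count precedence violations where part i+1 is assembled before part i.
toy_cost = lambda seq: sum(1 for a in range(len(seq)) for b in range(a) if seq[b] == seq[a] + 1)
print(abc_search(toy_cost, n_parts=6))
```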

Keywords: assembly sequence planning, CAD, artificial bee colony algorithm, assembly predicates

Procedia PDF Downloads 223
2653 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many respects. Automatic lane-line extraction and modeling are the most essential steps for the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, the multi-region Otsu thresholding method is applied, which calculates the intensity value of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results on these datasets show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high position accuracy with an error of less than 10 cm.
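
As a small illustration of the final modeling step (approximating the points of one lane-marking cluster by a cubic polynomial for compact storage), the sketch below uses an ordinary least-squares polynomial fit on synthetic points; the Bayesian estimation used in the paper is not reproduced here.

```python
# Sketch of the lane-line modeling step: fit a cubic polynomial to one cluster of lane points.
# Points are synthetic; plain least squares stands in for the paper's Bayesian estimation.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 50, 200)                             # distance along the road (m)
y_true = 0.0004 * x**3 - 0.02 * x**2 + 0.3 * x + 1.5    # hypothetical lane-line shape
y = y_true + rng.normal(0, 0.05, x.size)                # clustered laser points with noise

coeffs = np.polyfit(x, y, deg=3)                        # cubic model stored in the map
lane = np.poly1d(coeffs)

print("coefficients:", np.round(coeffs, 5))
print("max deviation from ground truth (m):", np.abs(lane(x) - y_true).max())
```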

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 116
2652 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and Gaussian Mixture Models (GMM), together with the Expectation-Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of the feature extraction yields a better and more robust automatic speaker identification system. Investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM when estimating the underlying parameters in the EM step also improved the convergence rate and system performance. A relative index is used as a confidence measure in case of contradiction between the GMM and VQ identification results. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
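
A minimal text-independent identification sketch in the same spirit (MFCC features plus one GMM per enrolled speaker, fitted by EM) is given below using librosa and scikit-learn as assumed stand-ins for the MATLAB implementation; the VAD, the VQ/LBG initialization, and the confidence index from the paper are omitted, and the file names are hypothetical.

```python
# Minimal MFCC + GMM speaker identification sketch (librosa and scikit-learn assumed).
# One GMM per enrolled speaker; a test utterance goes to the model with the highest score.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=16000)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # frames x coefficients

def train_speaker_models(enrollment):                 # e.g. {"spk1": ["spk1_train.wav"], ...}
    models = {}
    for speaker, files in enrollment.items():
        feats = np.vstack([mfcc_features(f) for f in files])
        gmm = GaussianMixture(n_components=16, covariance_type="diag", max_iter=200)
        models[speaker] = gmm.fit(feats)              # EM estimation of the speaker GMM
    return models

def identify(models, test_file):
    feats = mfcc_features(test_file)
    scores = {spk: gmm.score(feats) for spk, gmm in models.items()}   # avg log-likelihood
    return max(scores, key=scores.get)

models = train_speaker_models({"spk1": ["spk1_train.wav"], "spk2": ["spk2_train.wav"]})
print(identify(models, "unknown.wav"))
```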

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 287
2651 Knowledge, Attitude and Practices Regarding Advance Directives among Resident Physicians in Vicente Sotto Memorial Medical Center

Authors: Marica Pidor-Quingco, Francis Cabatingan

Abstract:

Background: One of the essential roles of a physician is to assess a patient's values and support them in making decisions about their future preferences for medical care. Advance Directives are a patient-centered approach which is linked to better-quality treatment at the end of life. General Objective: To assess and describe the knowledge, attitudes, and practices regarding advance directives among resident physicians in Vicente Sotto Memorial Medical Center. Methods: An analytical cross-sectional study was conducted at Vicente Sotto Memorial Medical Center. A total of 129 respondents gave their consent and were given a survey questionnaire containing the demographic profile and items on knowledge, attitude, and practices. Categorical variables were presented as frequency and percentage. The Chi-square test was used to determine the association of the demographic profile with knowledge and attitude. The Mann-Whitney U test was utilized for the association of age with knowledge and attitude. Results: Out of 129 respondents, 36.59% were in favor of self-determination and autonomy. The majority revealed adequate knowledge and a positive attitude regarding advance directives. Based on the results, there were no significant associations between the residents' sociodemographic characteristics and their knowledge and attitude. Over 66.7% of the respondents had used Advance Directives with their patients, but 25% were not comfortable doing so. Though most of the respondents were able to discuss AD with their patients, 7.0% of them are not willing to raise the topic with the family. Conclusion: VSMMC is a tertiary hospital which also provides hospice, palliative, and supportive care to patients. One of the services offered is initiating Advance Directives, which may be a factor in the positive knowledge, attitude, and practices towards this topic.

Keywords: advance directives, Philippines, physicians, palliative

Procedia PDF Downloads 118
2650 Estimation of Forces Applied to Forearm Using EMG Signal Features to Control of Powered Human Arm Prostheses

Authors: Faruk Ortes, Derya Karabulut, Yunus Ziya Arslan

Abstract:

According to recent experimental research, myoelectric features gathered from the musculature are preferentially considered for perceiving muscle activation and controlling human arm prostheses. EMG (electromyography) signal based human arm prostheses have shown promising performance in recent years in terms of providing the basic functional requirements of motion for amputees. However, these assistive devices for neurorehabilitation still have important limitations in enabling amputees to perform rather sophisticated or functional movements. The surface electromyogram (EMG) is used as the control signal to command such devices. This kind of control consists of activating a motion in the prosthetic arm using the muscle activation for the same particular motion. Extraction of clear and certain neural information from EMG signals plays a major role, especially in the fine control of hand prosthesis movements. Many signal processing methods have been utilized for feature extraction from EMG signals. The specific objective of this study was to compare widely used time-domain features of the EMG signal, including integrated EMG (IEMG), root mean square (RMS), and waveform length (WL), for the prediction of forces externally applied to the human hand. The obtained features were classified using artificial neural networks (ANN) to predict the forces. The EMG signals supplied to the process were recorded during only one type of muscle contraction, an isometric and isotonic one. Experiments were performed by three healthy right-handed subjects aged 25-35 years. EMG signals were collected from muscles of the proximal part of the upper body: biceps brachii, triceps brachii, pectoralis major, and trapezius. The force prediction results obtained from the ANN were statistically analyzed, and the merits and pitfalls of the extracted features are discussed in detail. The obtained results are anticipated to contribute to the classification of EMG signals and to the motion control of powered human arm prostheses.
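
The three time-domain features compared in the study have simple closed forms; a short sketch of their computation over sliding windows of an EMG channel follows, with window length and step chosen only for illustration (they are not the experiment's parameters).

```python
# Time-domain EMG features from the study: integrated EMG (IEMG), root mean square (RMS),
# and waveform length (WL), computed per analysis window as inputs for the ANN.
import numpy as np

def iemg(x):
    return np.sum(np.abs(x))                      # integrated EMG

def rms(x):
    return np.sqrt(np.mean(np.square(x)))         # root mean square

def waveform_length(x):
    return np.sum(np.abs(np.diff(x)))             # cumulative length of the waveform

def windowed_features(signal, win=256, step=128):
    rows = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        rows.append([iemg(w), rms(w), waveform_length(w)])
    return np.array(rows)                         # one feature vector per window

emg = np.random.default_rng(2).normal(0, 1, 4096)     # placeholder for a recorded EMG channel
print(windowed_features(emg).shape)
```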

Keywords: assistive devices for neurorehabilitation, electromyography, feature extraction, force estimation, human arm prosthesis

Procedia PDF Downloads 349
2649 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field

Authors: Zerroug Abdelhamid, Danielle Chassoux

Abstract:

Various models are given to simulate homogeneous or heterogeneous cancerous tumors and to extract the boundary in each case. The fractal dimension is then estimated by the least squares method and compared with some previous methods.
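
The abstract does not state which least-squares formulation is used, so purely as an illustration the sketch below estimates a fractal dimension by box counting, fitting log(box count) against log(1/box size) by least squares on a synthetic boundary image; both the method choice and the test shape are assumptions.

```python
# Illustrative box-counting estimate of a fractal dimension: count occupied boxes at several
# scales and fit log(count) vs. log(1/size) by least squares. The circular test boundary is synthetic.
import numpy as np

def box_counting_dimension(boundary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    n = boundary.shape[0]
    for s in sizes:
        # Partition the (trimmed) image into s x s boxes and count boxes holding boundary pixels.
        blocks = boundary[:n - n % s, :n - n % s].reshape(n // s, s, -1, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

yy, xx = np.mgrid[:256, :256]
r = np.sqrt((xx - 128) ** 2 + (yy - 128) ** 2)
boundary = np.abs(r - 60) < 1.0                   # synthetic "tumor boundary": a thin circle
print("estimated dimension:", round(box_counting_dimension(boundary), 2))   # close to 1 for a smooth curve
```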

Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering

Procedia PDF Downloads 350
2648 To Study the Effect of Drying Temperature Towards Extraction of Aquilaria subintegra Dry Leaves Using Vacuum Far Infrared

Authors: Tengku Muhammad Rafi Nazmi Bin Tengku Razali, Habsah Alwi

Abstract:

This article addresses the effect of drying temperature on the extraction of Aquilaria subintegra. The main habitat of Aquilaria subintegra is tropical Asia, and it is particularly often found in its native Thailand. It is claimed that Aquilaria subintegra has antipyretic properties that help fight fever. Recent research has also shown that paracetamol consumption can have adverse effects on consumers. The samples were first dried using Vacuum Far Infrared, which provides better drying than a conventional oven. A Soxhlet extractor was used to extract oil from the samples, and a Gas Chromatography Mass Spectrometer was used to analyze the samples and determine their compounds. The objective of this research was to determine the active ingredients present in Aquilaria subintegra leaves and to determine whether the compound acetaminophen exists in the leaves. The moisture content at 40 °C was 80%, at 50 °C was 620%, and at 60 °C was 36%. The higher the drying temperature, the lower the moisture content of the sample leaves. Seven components were identified in the sample at T = 40 °C, while only five components were identified in the samples at T = 50 °C and T = 60 °C. Four components were commonly identified in all three samples: n-hexadecanoic acid; 9,12,15-octadecatrienoic acid, methyl ester (Z,Z,Z); vitamin E; and squalene. Further studies with a new series of temperatures are needed to refine the results.

Keywords: Aquilaria subintegra, vacuum far infrared, Soxhlet extractor, gas chromatography mass spectrometer, paracetamol

Procedia PDF Downloads 466
2647 The Markers -mm and dämmo in Amharic: Developmental Approach

Authors: Hayat Omar

Abstract:

Languages provide speakers with a wide range of linguistic units to organize and deliver information. There are several ways to verbally express mental representations of events. According to the linguistic tools they have acquired, speakers select the one that brings the most communicative effect to convey their message. Our study focuses on two markers, -mm and dämmo, in Amharic (an Ethiopian Semitic language). Our aim is to examine, from a developmental perspective, how they are used by speakers. We seek to distinguish the communicative and pragmatic functions indicated by means of these markers. To do so, we created a corpus of sixty narrative productions of children from 5-6, 7-8, and 10-12 years old and adult Amharic speakers. The experimental material we used to collect our data is a series of pictures without text, 'Frog, Where are you?'. Although -mm and dämmo are each used in specific contexts, they are sometimes analyzed as being interchangeable. The suffix -mm is complex and multifunctional. It marks the end of the negative verbal structure, it is found in the relative structure of the imperfect, it creates new words such as adverbials or pronouns, and it also serves to coordinate words and sentences and to mark the link between macro-propositions within a larger textual unit. -mm has been analyzed as a marker of insistence, a topic shift marker, an element of concatenation, a contrastive focus marker, and a 'bisyndetic' coordinator. On the other hand, dämmo has a more limited function and has not attracted the attention of many authors. The only approach we could find analyzes it as a 'monosyndetic' coordinator. The paralleling of these two elements made it possible to understand their distinctive functions and refine their description. When it comes to marking a referent, the choice of -mm or dämmo is not neutral, depending on whether the tagged argument is newly introduced, maintained, promoted, or reintroduced. The presence of these morphemes signals the inter-phrastic link. The information is seized by anaphora or presupposition: -mm points upstream while dämmo points downstream; the latter requires new information. The speaker uses -mm or dämmo according to what he assumes to be known to his interlocutors. The results show that -mm and dämmo, although all speakers use them both, do not always have the same scope across speakers and vary with age. dämmo is mainly used to mark a contrastive topic to signal the concomitance of events. It is more commonly used in young children's narratives (F(3,56) = 3.82, p < .01). Some values of -mm (additive) are acquired very early, while others come rather late and increase with age (F(3,56) = 3.2, p < .03). The difficulty is due not only to its synthetic structure but primarily to the fact that it is multi-purpose and requires memory work. It highlights the constituent on which it operates to clarify how the message should be interpreted.

Keywords: acquisition, cohesion, connection, contrastive topic, contrastive focus, discourse marker, pragmatics

Procedia PDF Downloads 122
2646 Implications of Industry 4.0 to Supply Chain Management and Human Resources Management: The State of the Art

Authors: Ayse Begum Kilic, Sevgi Ozkan

Abstract:

Industry 4.0 (I4.0) is a significant and promising research topic that is expected to gain more importance due to its effects on important concepts like cost, resource management, and accessibility. Instead of focusing on those effects in only one area, combining different departments and seeing the big picture helps to make more realistic predictions about the future. The aim of this paper is to identify the implications of Industry 4.0 for both supply chain management and human resources management by finding the topics that lie at the intersection of them. Another objective is to help readers recognize the expected changes in these two areas due to I4.0, in order to take the necessary steps in advance, and to make recommendations for catching up with the latest trends. The expected changes are drawn from industry reports and related journal papers in the literature. As far as we found in the literature, this study is the first to combine Industry 4.0, supply chain management, and human resources management, and it aims to guide future work by identifying the intersections of those three areas. The benefits of I4.0 and the number, research areas, and publication years of papers on I4.0 in academic journals are presented in this paper. One of the main findings of this research is that a change in labor force qualifications is expected with the advancements in technology: a higher level of skills will be required of workers. This will directly affect human resources management in terms of recruiting and managing those people. Another main finding, explained with an example in the article, is that advancements in technology will change the place of production. For instance, 'dark factories', a popular topic of I4.0, will enable manufacturers to produce in places close to their marketplaces. Supply chains are expected to be influenced by that change.

Keywords: human resources management, industry 4.0, logistics, supply chain management

Procedia PDF Downloads 146
2645 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

The assessment of building sustainability to achieve a specific green benchmark and the preparation of the documents required to receive a green building certification are both considered major challenges for the green building design team. However, this labor-intensive and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated in order to track potentially achievable credit points and provide rating feedback for several design options, by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as a unique input case to evaluate building sustainability in the design stage of the building project life cycle. The tool covers two key models: firstly, a model for data extraction, calculation, and classification of achievable credit points in a green template; secondly, a model for the generation of the documents required for green building certification. The tool was validated on a BIM model of a residential building, and it serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 311
2644 Large-Capacity Image Information Reduction Based on Single-Cue Saliency Map for Retinal Prosthesis System

Authors: Yili Chen, Xiaokun Liang, Zhicheng Zhang, Yaoqin Xie

Abstract:

In an effort to restore visual perception in retinal diseases, an electronic retinal prosthesis with thousands of electrodes has been developed. The image processing strategies of the retinal prosthesis system convert the original images from the camera into a stimulus pattern that can be interpreted by the brain. In practice, the original images have a much higher resolution (256x256) than the stimulus pattern (such as 25x25), which poses the technical image processing challenge of large-capacity image information reduction. In this paper, we focus on developing an efficient stimulus pattern extraction algorithm that uses a single-cue saliency map to extract salient objects in the image with an optimal trimming threshold. Experimental results showed that the proposed stimulus pattern extraction algorithm performs quite well for different scenes in terms of the resulting stimulus pattern. In the algorithm performance experiment, our proposed SCSPE algorithm achieved almost five times the score of Boyle's algorithm. Through experiments, we suggest that when there are salient objects in the scene (such as when a blind user meets or talks with people), the trimming threshold should be set around 0.4·max; in other situations, the trimming threshold can be set between 0.2·max and 0.4·max to give a satisfactory stimulus pattern.
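
As a rough illustration of the pipeline described above (a single-cue saliency map, a trimming threshold expressed as a fraction of the map's maximum, then downsampling to the electrode resolution), the sketch below uses OpenCV's spectral-residual saliency from opencv-contrib as an assumed stand-in for the paper's single cue; the input file name and the threshold value are illustrative.

```python
# Sketch of stimulus-pattern extraction: single-cue saliency map, trimming threshold at a
# fraction of the map maximum, then downsampling to the electrode-array resolution.
# cv2.saliency requires opencv-contrib; it stands in here for the paper's saliency cue.
import cv2
import numpy as np

def stimulus_pattern(image_path, threshold_fraction=0.4, grid=(25, 25)):
    image = cv2.imread(image_path)                          # original camera frame (e.g. 256x256)
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, sal_map = saliency.computeSaliency(image)           # float saliency map
    if not ok:
        raise RuntimeError("saliency computation failed")
    mask = sal_map >= threshold_fraction * sal_map.max()    # trimming threshold (0.2-0.4 of max)
    trimmed = np.where(mask, sal_map, 0.0).astype(np.float32)
    return cv2.resize(trimmed, grid, interpolation=cv2.INTER_AREA)   # 25x25 stimulus pattern

pattern = stimulus_pattern("frame.png")                     # hypothetical input frame
print(pattern.shape)
```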

Keywords: retinal prosthesis, image processing, region of interest, saliency map, trimming threshold selection

Procedia PDF Downloads 228
2643 Fishing Waste: A Source of Valuable Products through Anaerobic Treatments

Authors: Luisa Maria Arrechea Fajardo, Luz Stella Cadavid Rodriguez

Abstract:

Fish is one of the most commercialized foods worldwide. However, this industry only takes advantage of about 55% of the product's weight; the rest is converted into waste, mainly composed of viscera, gills, scales, and spines. Consequently, if these wastes are not used or disposed of properly, they cause serious environmental impacts. This is the case in Tumaco (Colombia), the second largest producer of marine fisheries on the Colombian Pacific coast, where artisanal fishermen process more than 50% of the commercialized volume. There, fishing waste is disposed of primarily in the ocean, causing negative impacts on the environment and society. Therefore, in the present research, a proposal was made to take advantage of fishing waste through anaerobic treatments, through which it is possible to obtain products with high added value from organic waste. The research was carried out in four stages. First, the production of volatile fatty acids (VFA) in semi-continuous 4 L reactors was studied, evaluating three hydraulic retention times (HRT) (10, 7, and 5 days) with four organic loading rates (OLR) (16, 14, 12, and 10 gVS/L/day); the experiment was carried out for 150 days. Subsequently, biogas production from the solid digestate generated in the VFA production reactors was evaluated, initially by measuring the biochemical methane potential (BMP) of four total solids concentrations (1, 2, 4, and 6% TS) for 40 days, and then, with the optimum concentration (2 gVS/L/day), by evaluating two HRTs (15 and 20 days) in semi-continuous reactors for 100 days. Finally, the integration of the processes was carried out with the best conditions found: a first phase of VFA production from fishing waste and a second phase of biogas production from unrecovered VFAs and unprocessed material. Additionally, a VFA membrane extraction system was included. In the first phase, a liquid digestate with a VFA concentration and production yield of 59.04 gVFA/L and 0.527 gVFA/gVS, respectively, was obtained with the best condition found (HRT: 7 days and OLR: 16 gVS/L/day), where acetic acid and isobutyric acid were the predominant acids. In the second phase, a BMP of 0.349 Nm³ CH₄/kg VS was reached, and the best HRT was found to be 20 days. In the integration, isovaleric, butyric, and isobutyric acids were the VFAs with the highest percentage of extraction; additionally, a 106.67% increase in biogas production was achieved. This research shows that anaerobic treatments are a promising technology for the environmentally safe management of fishing waste and lays the basis for a possible biorefinery.

Keywords: biogas production, fishing waste, VFA membrane extraction, VFA production

Procedia PDF Downloads 100
2642 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is done manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application called Training Builder, which has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data from a single patient retrieved from a publicly available EEG dataset.
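
A minimal sketch of the classification stage (sliding-window feature extraction followed by a multilayer perceptron) is given below with scikit-learn; the Training Builder features and the real EEG dataset are not reproduced, so the signals, the four simple features, and the labels here are synthetic placeholders.

```python
# Sketch of the classification stage: sliding-window features from EEG segments fed to an MLP.
# Signals, features and labels are synthetic placeholders, not the paper's Training Builder output.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)

def window_features(sig, win=256, step=128):
    feats = []
    for start in range(0, len(sig) - win + 1, step):
        w = sig[start:start + win]
        feats.append([w.mean(), w.std(), np.abs(np.diff(w)).mean(), np.ptp(w)])
    return np.array(feats)

normal = rng.normal(0, 1, (50, 2048))            # synthetic interictal segments
seizure = rng.normal(0, 3, (50, 2048))           # synthetic seizure-like segments (higher variance)
X = np.vstack([window_features(s) for s in np.vstack([normal, seizure])])
windows_per_segment = window_features(normal[0]).shape[0]
y = np.repeat([0] * 50 + [1] * 50, windows_per_segment)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```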

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 169
2641 Numerical Investigation of Nanofluid Based Thermosyphon System

Authors: Kiran Kumar K., Ramesh Babu Bejjam, Atul Najan

Abstract:

A thermosyphon system is a heat transfer loop which operates on the basis of gravity and buoyancy forces. It guarantees good reliability and low maintenance costs, as it does not involve any mechanical pump. Therefore, it can be used in many industrial applications such as refrigeration and air conditioning, electronic cooling, nuclear reactors, geothermal heat extraction, etc. However, flow instabilities and the loop configuration are the major problems in this system. Several previous researchers have found that instabilities can be suppressed by using nanofluids as the loop fluid. In the present study, a rectangular thermosyphon loop with end heat exchangers is considered. This configuration is more appropriate for many practical applications such as solar water heaters, geothermal heat extraction, etc. In the present work, a steady-state analysis is carried out on the thermosyphon loop with parallel-flow coaxial heat exchangers at the heat source and heat sink. In this loop, a nanofluid is considered as the loop fluid, and water is considered as the external fluid in both the hot and cold heat exchangers. For this analysis, a one-dimensional homogeneous model is developed. In this model, the conservation equations for mass, momentum, and energy are discretized using the finite difference method. A computer code is written in MATLAB to simulate the flow in the thermosyphon loop. A comparison in terms of heat transfer is made between water and the nanofluid as working fluids in the loop.
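
A highly simplified illustration of marching a one-dimensional steady-state energy balance around such a loop is sketched below (advection with distributed heat input in the source leg and heat rejection in the sink leg, dT/dx = q'(x)/(m_dot·cp)); it is not the authors' full homogeneous model, and every property and geometry value is a placeholder.

```python
# Highly simplified 1D steady-state energy march around a thermosyphon loop:
# dT/dx = q'(x) / (m_dot * cp). All values are illustrative placeholders, not the paper's model.
import numpy as np

L = 4.0                  # loop length (m)
N = 400                  # finite-difference nodes
dx = L / N
m_dot = 0.02             # mass flow rate (kg/s)
cp = 4180.0              # specific heat of the loop fluid (J/kg K); a nanofluid would differ
Q = 500.0                # heat added in the source leg and removed in the sink leg (W)

x = np.linspace(0.0, L, N, endpoint=False)
q = np.zeros(N)                                # heat input per unit length (W/m)
q[(x >= 0.0) & (x < 1.0)] = +Q / 1.0           # heated leg (source heat exchanger)
q[(x >= 2.0) & (x < 3.0)] = -Q / 1.0           # cooled leg (sink heat exchanger)

T = np.empty(N)
T[0] = 300.0                                   # starting temperature (K)
for i in range(N - 1):                         # forward-difference march along the loop
    T[i + 1] = T[i] + q[i] * dx / (m_dot * cp)

print("temperature rise across the source (K):", round(T[100] - T[0], 2))
print("loop closure error (K):", round(T[-1] + q[-1] * dx / (m_dot * cp) - T[0], 6))
```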

Keywords: heat exchanger, heat transfer, nanofluid, thermosyphon loop

Procedia PDF Downloads 461
2640 High Performance Liquid Cooling Garment (LCG) Using ThermoCore

Authors: Venkat Kamavaram, Ravi Pare

Abstract:

Modern warfighters experience extreme environmental conditions in many of their operational and training activities. In temperatures exceeding 95°F, the body can no longer cool itself through convection and radiation. In this case, the only cooling mechanism is evaporation. However, evaporative cooling is often compromised by excessive humidity. Natural cooling mechanisms can be further compromised by clothing and protective gear, which trap hot air and moisture close to the body. Creating an efficient heat-extraction apparel system that is also lightweight and does not hinder the dexterity or mobility of personnel working in extreme temperatures is a difficult technical challenge, and one that needs to be addressed to increase the probability of future success of the US military. To address this challenge, Oceanit Laboratories, Inc. has developed and patented a Liquid Cooled Garment (LCG) more effective than any on the market today. Oceanit's LCG is a form-fitting garment with a network of thermally conductive tubes that extracts body heat and can be worn under all authorized and chemical/biological protective clothing. Oceanit specifically designed and developed ThermoCore®, a thermally conductive polymer, for use in this apparel, optimizing the product for thermal conductivity, mechanical properties, manufacturability, and performance temperatures. Thermal manikin tests were conducted in accordance with ASTM F2371, Standard Test Method for Measuring the Heat Removal Rate of Personal Cooling Systems Using a Sweating Heated Manikin, in an environmental chamber using a 20-zone sweating thermal manikin. The manikin test results have shown that Oceanit's LCG provides significantly higher heat extraction under the same environmental conditions than the currently fielded Environmental Control Vest (ECV), while at the same time reducing the weight. Oceanit's LCG vests performed nearly 30% better in extracting body heat while weighing 15% less than the ECV. No cooling garments on the market provide the same thermal extraction performance, form factor, and reduced weight as Oceanit's LCG. The two cooling garments that are commercially available and most commonly used are the Environmental Control Vest (ECV) and the Microclimate Cooling Garment (MCG).

Keywords: thermally conductive composite, tubing, garment design, form fitting vest, thermocore

Procedia PDF Downloads 100
2639 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has chiefly had two major focuses: (1) accurate forecasting of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and a back-projection technique that restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 107
2638 Polarization of Lithuanian Society on Issues Related to Language Politics

Authors: Eglė Žurauskaitė, Eglė Gudavičienė

Abstract:

The goal of this paper is to reveal how polarization is constructed through the use of impoliteness strategies. In general, the media help to spread various ideas very fast, which means that processes of polarization are best revealed in computer-mediated communication (CMC) contexts. For this reason, data for the research was collected from online texts about a current, highly divisive topic in Lithuania, Lithuanian language policy and regulations, because this topic is causing a lot of tension in Lithuanian society. Computer-mediated communication allows users to edit their message before they send it, which means that senders carefully select verbal expressions to convey their message. In other words, each impoliteness strategy and its verbal expression were created intentionally. Impoliteness strategies in this research are understood as various ways to reach a communicative goal: to belittle the other. To reach this goal, the public statements of various Lithuanian public figures (e.g., cultural figures, politicians, officials) were collected from news portals in 2019-2023 and analyzed using both quantitative and qualitative approaches. First, the problematic aspects of the language policy about which public figures complain were identified. Then instances in which public figures take a defensive position were analyzed: how they express this position and what it reveals about Lithuanian culture. The findings of this research demonstrate how concepts of impoliteness theory can be applied in analyzing the process of polarization in Lithuanian society on issues related to the State language policy. Also, to reveal how polarization is constructed, the following tasks were set: a) determine which impoliteness strategies are used throughout the process of creating polarization, and b) analyze how they were expressed verbally (e.g., as advice, an offer, etc.).

Keywords: impoliteness, Lithuanian language policy, polarization, impoliteness strategies

Procedia PDF Downloads 33
2637 Green Synthesis of Magnetic, Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of extracts from green tea leaves (i.e., a total extract and a tannin extract) as reducing agents for a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNP)/iron oxide (Fe3O4) nanocomposite based on the deposition of Fe3O4 onto MSNPs. The MSNP/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was found to be around 30 nm, with a high surface area (818 m²/g). The MSNP/Fe3O4 nanocomposite was used for removing the pesticide lindane (an environmentally hazardous material) from aqueous solutions. Fourier transform infrared, UV-vis, high-performance liquid chromatography, and gas chromatography techniques were used to confirm the high ability of the MSNP/Fe3O4 nanocomposite to sense and capture lindane molecules, with a high sorption capacity (more than 89%), which could lead to a new eco-friendly strategy for detecting and removing pesticides and to a promising material for water treatment applications.

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, lindane adsorption

Procedia PDF Downloads 418