Search results for: extraction techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8150

7190 The Art and Science of Trauma-Informed Psychotherapy: Guidelines for Inter-Disciplinary Clinicians

Authors: Daphne Alroy-Thiberge

Abstract:

Trauma-impacted individuals present unique treatment challenges that include high reactivity, hyper- and hypo-arousal, poor adherence to therapy, as well as powerful transference and counter-transference experiences in therapy. This work provides an overview of the clinical tenets most often encountered in trauma-impacted individuals. Further, it provides readily applicable clinical techniques to optimize therapeutic rapport and facilitate accelerated positive mental health outcomes. Finally, integrated neuroscience and clinical evidence-based data are discussed to shed new light on crisis states in trauma-impacted individuals. This knowledge is utilized to provide effective and concrete interventions towards rapid and successful de-escalation of the impacted individual. A highly interactive, adult-learning-principles-based modality is utilized to provide an organic learning experience for participants. The information and techniques learned aim to increase clinical effectiveness, reduce staff injuries and burnout, and significantly enhance positive mental health outcomes and self-determination for the trauma-impacted individuals treated.

Keywords: clinical competencies, crisis interventions, psychotherapy techniques, trauma informed care

Procedia PDF Downloads 79
7189 Effect of Solvents in the Extraction and Stability of Anthocyanin from the Petals of Caesalpinia pulcherrima for Natural Dye-Sensitized Solar Cell

Authors: N. Prabavathy, R. Balasundaraprabhu, S. Shalini, Dhayalan Velauthapillai, S. Prasanna, N. Muthukumarasamy

Abstract:

Dye-sensitized solar cells (DSSCs) have become a significant research area due to their fundamental and scientific importance in the area of energy conversion. Synthetic dyes as sensitizers in DSSCs are efficient and durable, but they are costly, toxic, and have a tendency to degrade. Natural sensitizers contain plant pigments such as anthocyanins, carotenoids, flavonoids, and chlorophyll, which promote light absorption as well as the injection of charges into the conduction band of TiO₂ through the sensitizer. However, the efficiency of natural dyes is not up to the mark, mainly due to the instability of pigments such as anthocyanin. The stability issues in vitro are mainly due to the effect of the solvents used for the extraction of anthocyanins and their respective pH. Taking this factor into consideration, in the present work, anthocyanins were extracted from the flower Caesalpinia pulcherrima (C. pulcherrima) with various solvents, and their respective stability and pH values are discussed. The use of citric acid as the extraction solvent has shown better stability than the other solvents. It also helps in enhancing the sensitization properties of anthocyanins with titanium dioxide (TiO₂) nanorods. The IPCE spectra show higher photovoltaic performance for dye-sensitized TiO₂ nanorods when citric acid is used as the solvent. The natural DSSC using citric acid as the solvent shows a higher efficiency compared to the other solvents. Hence, citric acid proves to be a safe solvent for natural DSSCs, boosting the photovoltaic performance and maintaining the stability of anthocyanins.

Keywords: Caesalpinia pulcherrima, citric acid, dye sensitized solar cells, TiO₂ nanorods

Procedia PDF Downloads 271
7188 A Survey of Semantic Integration Approaches in Bioinformatics

Authors: Chaimaa Messaoudi, Rachida Fissoune, Hassan Badir

Abstract:

Technological advances in computer science and data analysis are continuously providing huge volumes of biological data, which are available on the web. Such advances involve and require powerful techniques for data integration to extract pertinent knowledge and information for a specific question. Biomedical exploration of these big data often requires the use of complex queries across multiple autonomous, heterogeneous, and distributed data sources. Semantic integration is an active area of research in several disciplines, such as databases, information integration, and ontology. We provide a survey of approaches and techniques for integrating biological data, focusing on those developed in the ontology community.

Keywords: biological ontology, linked data, semantic data integration, semantic web

Procedia PDF Downloads 428
7187 Virtual 3D Environments for Image-Based Navigation Algorithms

Authors: V. B. Bastos, M. P. Lima, P. R. G. Kurka

Abstract:

This paper addresses the creation of virtual 3D environments for the study and development of mobile robot image-based navigation algorithms and techniques, which need to operate robustly and efficiently. These algorithms can be tested physically, by conducting experiments on a prototype, or by numerical simulations. Current simulation platforms for robotic applications do not have flexible and up-to-date models for image rendering and are unable to reproduce complex light effects and materials. Thus, it is necessary to create a test platform that integrates sophisticated simulated applications of real environments for navigation with data and image processing. This work proposes the development of a high-level platform for building 3D model environments and testing image-based navigation algorithms for mobile robots. Texture and lighting effects were applied in order to accurately reproduce the rendered images with respect to their real-world counterparts. The application integrates image processing scripts, trajectory control, dynamic modeling, and simulation techniques for physics representation and picture rendering with the open-source 3D creation suite Blender.

Keywords: simulation, visual navigation, mobile robot, data visualization

Procedia PDF Downloads 238
7186 Synthesis and Characterization of Poly(N-(Pyridin-2-Ylmethylidene)Pyridin-2-Amine): Thermal and Conductivity Properties

Authors: Nuray Yılmaz Baran

Abstract:

Conjugated Schiff base polymers, which are also called polyazomethines, are promising materials for various applications due to their good thermal resistance, semiconductivity, liquid crystallinity, fiber-forming ability, nonlinear optical behavior, outstanding photo- and electroluminescence, and antimicrobial properties. In recent years, polyazomethines have attracted intense attention from researchers, especially due to the optoelectronic properties which have made their use possible in organic light emitting diodes (OLEDs), solar cells (SCs), organic field effect transistors (OFETs), and photorefractive holographic materials (PRHMs). In this study, the N-(pyridin-2-ylmethylidene)pyridin-2-amine Schiff base was synthesized from the condensation reaction of 2-aminopyridine with 2-pyridinecarbaldehyde. Polymerization of the Schiff base was achieved by a polycondensation reaction using NaOCl as the oxidant in methanol medium at various times and temperatures. The synthesized Schiff base monomer and polymer (poly(N-(pyridin-2-ylmethylidene)pyridin-2-amine)) were characterized by UV-Vis, FT-IR, ¹H-NMR, and XRD techniques. The molecular weight distribution and surface morphology of the polymer were determined by GPC and SEM-EDAX techniques. The thermal behaviour of the monomer and polymer was investigated by TG/DTG, DTA, and DSC techniques.

Keywords: polyazomethines, polycondensation reaction, Schiff base polymers, thermal stability

Procedia PDF Downloads 212
7185 Evaluation of an Integrated Supersonic System for Inertial Extraction of CO₂ in Post-Combustion Streams of Fossil Fuel Operating Power Plants

Authors: Zarina Chokparova, Ighor Uzhinsky

Abstract:

Carbon dioxide emissions resulting from the burning of fossil fuels on large scales, such as in the oil industry or power plants, lead to a number of severe implications, including a global temperature rise, air pollution, and other adverse impacts on the environment. Besides some precarious and costly ways of alleviating the detriment of CO₂ emissions at industrial scales (such as liquefaction of CO₂ and its deep-water treatment, or the application of adsorbents and membranes, which require careful consideration of drawback effects and their mitigation), one physically and commercially viable technology for its capture and disposal is a supersonic system for inertial extraction of CO₂ from post-combustion streams. Because the flue gas emitted from the combustion system has a carbon dioxide concentration of 10-15 volume percent, the waste stream represents a rather diluted condition at low pressure. The supersonic system expands the flue gas mixture stream through a converging-diverging nozzle; the flow velocity increases to the supersonic range, resulting in a rapid drop of temperature and pressure. The conversion of potential energy into kinetic energy thus causes desublimation of CO₂. The solidified carbon dioxide can be sent to a separate vessel for further disposal. The major advantages of the current solution are its economic efficiency, physical stability, and the compactness of the system, as well as the fact that no additional chemical media are required. However, several challenges have yet to be addressed to optimize the system: increasing the size of the separated CO₂ particles (their effective diameters are on the micrometer scale), reducing the amount of concomitant gas separated together with the carbon dioxide, and ensuring the purity of the CO₂ downstream flow. Moreover, determining the thermodynamic conditions of the vapor-solid mixture, including specification of a valid and accurate equation of state, remains an essential goal. Due to the high speeds and temperatures reached during the process, the influence of the released heat should be considered, and an applicable solution model for the compressible flow needs to be determined. In this report, a brief overview of the current technology status is presented, and a program for further evaluation of this approach is proposed.
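
The temperature and pressure drop that drives CO₂ desublimation follows from the standard isentropic relations for flow through a converging-diverging nozzle. The relations below are textbook expressions for an ideal gas with specific heat ratio γ, shown here only to illustrate the effect; they are not taken from the authors' model.

```latex
\frac{T_0}{T} = 1 + \frac{\gamma - 1}{2} M^2, \qquad
\frac{p_0}{p} = \left(1 + \frac{\gamma - 1}{2} M^2\right)^{\frac{\gamma}{\gamma - 1}}
```

For example, assuming a flue gas with γ ≈ 1.3 accelerated to Mach 2, T/T₀ ≈ 0.63, so a stream entering at about 320 K cools to roughly 200 K, approaching the CO₂ desublimation temperature (about 195 K at atmospheric pressure); the actual frost point depends on the local CO₂ partial pressure.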

Keywords: CO₂ sequestration, converging diverging nozzle, fossil fuel power plant emissions, inertial CO₂ extraction, supersonic post-combustion carbon dioxide capture

Procedia PDF Downloads 129
7184 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin

Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna

Abstract:

Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer, which contains a large number of functional groups in its structure, including aliphatic and aromatic hydroxyl groups, carboxylic groups, and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure which can provide a large surface area and pore volume. It can also possess better dispersion, diffusion, and mass transfer behavior in the removal of, e.g., heavy metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St), and lignin. Microspheres without the addition of lignin were also prepared for comparison. Before the copolymerization, lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were studied, e.g., pore structures (adsorption-desorption measurements), thermal properties (DSC), tendencies to swell, and the actual shapes. Due to their well-developed porous structure and the presence of functional groups, our materials may have great potential in sorption processes. To estimate the sorption capabilities of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method is going to be applied. This method has various advantages, including low cost and ease of use, and it enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.

Keywords: microspheres, lignin, sorption, solid-phase extraction

Procedia PDF Downloads 172
7183 Comparison of EMG Normalization Techniques Recommended for Back Muscles Used in Ergonomics Research

Authors: Saif Al-Qaisi, Alif Saba

Abstract:

Normalization of electromyography (EMG) data in ergonomics research is a prerequisite for interpreting the data. Normalizing accounts for variability in the data due to differences in participants’ physical characteristics, electrode placement protocols, time of day, and other nuisance factors. Typically, normalized data are reported as a percentage of the muscle’s isometric maximum voluntary contraction (%MVC). Various MVC techniques have been recommended in the literature for normalizing EMG activity of back muscles. This research tests and compares the MVC techniques recommended in the literature for three back muscles commonly used in ergonomics research: the lumbar erector spinae (LES), latissimus dorsi (LD), and thoracic erector spinae (TES). Six healthy males from a university population participated in this research. Five different MVC exercises were compared for each muscle using the Trigno wireless EMG system (Delsys Inc.). Since the LES and TES share similar functions in controlling trunk movements, their MVC exercises were the same, which included trunk extension at -60°, trunk extension at 0°, trunk extension while standing, hip extension, and the arch test. The MVC exercises identified in the literature for the LD were chest-supported shoulder extension, prone shoulder extension, lat pull-down, internal shoulder rotation, and abducted shoulder flexion. The maximum EMG signal was recorded during each MVC trial, and then the averages were computed across participants. A one-way analysis of variance (ANOVA) was utilized to determine the effect of MVC technique on muscle activity. Post-hoc analyses were performed using the Tukey test. The MVC technique effect was statistically significant for each of the muscles (p < 0.05); however, a larger sample of participants would be needed to detect significant differences in the Tukey tests. The arch test was associated with the highest EMG average at the LES, and it also resulted in the maximum EMG activity more often than the other techniques (three out of six participants). For the TES, trunk extension at 0° was associated with the largest EMG average, and it resulted in the maximum EMG activity most often (three out of six participants). For the LD, participants obtained their maximum EMG either from chest-supported shoulder extension (three out of six participants) or prone shoulder extension (three out of six participants). Chest-supported shoulder extension, however, had a larger average than prone shoulder extension (0.263 and 0.240, respectively). Although all the aforementioned techniques had superior averages, they did not always result in the maximum EMG activity. If an accurate estimate of the true MVC is desired, more than one technique may have to be performed. This research provides additional MVC techniques for each muscle that may elicit the maximum EMG activity.
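
The normalization step itself is straightforward once the MVC reference has been recorded. The sketch below is a minimal illustration of expressing task EMG as %MVC, assuming the raw signals have already been band-pass filtered; the array names, sampling rate, and smoothing window are illustrative and not taken from the study.

```python
import numpy as np

def normalize_emg(task_emg, mvc_trials, fs=2000, window_ms=250):
    """Express task EMG as %MVC: divide the smoothed task envelope by the peak
    smoothed amplitude observed across all MVC trials for that muscle."""
    win = int(fs * window_ms / 1000)
    kernel = np.ones(win) / win

    def envelope(x):
        # full-wave rectification followed by a moving-average smoothing window
        return np.convolve(np.abs(x), kernel, mode="same")

    # MVC reference: peak envelope over all MVC exercises performed for this muscle
    mvc_peak = max(envelope(trial).max() for trial in mvc_trials)

    # Normalized task activity as a percentage of MVC
    return 100.0 * envelope(task_emg) / mvc_peak
```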

Keywords: electromyography, maximum voluntary contraction, normalization, physical ergonomics

Procedia PDF Downloads 180
7182 Text Analysis to Support Structuring and Modelling a Public Policy Problem: Outline of an Algorithm to Extract Inferences from Textual Data

Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis

Abstract:

Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics, therefore, is key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support the modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support the modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which is so far done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
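
As a rough illustration of the kind of inference extraction such an algorithm automates, the sketch below matches simple causal connectives in sentences and maps the resulting cause-effect pairs into a directed graph. The patterns and example sentences are hypothetical placeholders, not the authors' rule set; a real system would rely on NLP parsing rather than regular expressions.

```python
import re
import networkx as nx

# Illustrative causal connectives; a full system would use dependency parsing instead.
CAUSAL_PATTERNS = [
    r"(?P<cause>.+?)\s+leads to\s+(?P<effect>.+)",
    r"(?P<cause>.+?)\s+causes\s+(?P<effect>.+)",
    r"(?P<cause>.+?)\s+results in\s+(?P<effect>.+)",
]

def extract_causal_pairs(sentences):
    """Return (cause, effect) pairs found by matching causal connectives."""
    pairs = []
    for s in sentences:
        for pat in CAUSAL_PATTERNS:
            m = re.match(pat, s.strip().rstrip("."), flags=re.IGNORECASE)
            if m:
                pairs.append((m.group("cause").strip(), m.group("effect").strip()))
                break
    return pairs

def build_causal_diagram(sentences):
    """Map the extracted pairs into a directed graph (a simple causal diagram)."""
    g = nx.DiGraph()
    g.add_edges_from(extract_causal_pairs(sentences))
    return g

# Hypothetical usage on two sentences describing a policy problem
docs = ["Unemployment leads to lower tax revenue.",
        "Lower tax revenue results in reduced public services."]
print(build_causal_diagram(docs).edges())
```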

Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction

Procedia PDF Downloads 571
7181 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case

Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov

Abstract:

Minimization of tritium diffusion leakage when developing devices handling tritium-containing media is a key problem whose solution will allow a substantial enhancement of radiation safety and the minimization of diffusion losses of expensive tritium. One of the ways to solve this problem is to use Al₂O₃ high-strength non-porous ceramic as the structural material of the bed body. This alumina ceramic offers high strength characteristics, but its main advantages are low hydrogen permeability (in contrast to the structural materials usually employed) and high dielectric properties. The latter enables direct induction heating of a hydride-forming metal without substantial heating of the pressure and containment vessel. The use of alumina ceramics and induction heating allows a substantial reduction of the tritium extraction time, a reduction of tritium diffusion leakage by several orders of magnitude, and a more complete extraction of tritium from metal hydrides, since the metal can be heated up to melting in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium. Titanium was used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, as well as strength and cyclic service life tests, are reported. Recommendations are also provided for the practical use of the given bed type.

Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride

Procedia PDF Downloads 386
7180 A Review on Parametric Optimization of Casting Processes Using Optimization Techniques

Authors: Bhrugesh Radadiya, Jaydeep Shah

Abstract:

In the Indian foundry industry, there is a need for defect-free castings with minimum production cost and short lead times. Casting defects are a major issue in the foundry shop, increasing the rejection rate of castings and the wastage of materials. Various parameters influence the casting process, such as mold-machine-related parameters, green-sand-related parameters, cast-metal-related parameters, mold-related parameters, and shake-out-related parameters. The mold-related parameters have the greatest influence on casting defects in the sand casting process. This paper reviews castings produced by foundries with shrinkage and blowholes as the major defects; the analyses identified that mold-related parameters such as mold temperature, pouring temperature, and runner size were not properly set in the sand casting process. These parameters were optimized using different optimization techniques such as the Taguchi method, response surface methodology, the genetic algorithm, and the teaching-learning-based optimization algorithm. Finally, it is concluded that the teaching-learning-based optimization algorithm gives better results than the other optimization techniques.
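
For readers unfamiliar with teaching-learning-based optimization (TLBO), the sketch below gives a minimal implementation of the algorithm for a generic continuous objective. The objective function and bounds are placeholders standing in for a casting-parameter model; they are not taken from any of the reviewed studies.

```python
import numpy as np

def tlbo(objective, bounds, pop_size=20, iters=100, seed=0):
    """Minimal teaching-learning-based optimization (minimization).
    bounds: list of (low, high) per decision variable."""
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    dim = len(bounds)
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(iters):
        # Teacher phase: move learners toward the best solution, away from the class mean
        teacher = pop[fit.argmin()]
        tf = rng.integers(1, 3)                                  # teaching factor, 1 or 2
        new = pop + rng.random((pop_size, dim)) * (teacher - tf * pop.mean(axis=0))
        new = np.clip(new, low, high)
        new_fit = np.apply_along_axis(objective, 1, new)
        better = new_fit < fit
        pop[better], fit[better] = new[better], new_fit[better]

        # Learner phase: each learner moves toward a randomly chosen better peer
        for i in range(pop_size):
            j = rng.integers(pop_size)
            step = rng.random(dim) * ((pop[j] - pop[i]) if fit[j] < fit[i] else (pop[i] - pop[j]))
            cand = np.clip(pop[i] + step, low, high)
            cand_fit = objective(cand)
            if cand_fit < fit[i]:
                pop[i], fit[i] = cand, cand_fit

    return pop[fit.argmin()], fit.min()

# Hypothetical usage: three "casting parameters" minimizing a dummy defect score
best_x, best_f = tlbo(lambda x: np.sum((x - 1.5) ** 2), bounds=[(0, 3)] * 3)
```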

Keywords: casting defects, genetic algorithm, parametric optimization, Taguchi method, TLBO algorithm

Procedia PDF Downloads 712
7179 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation

Authors: A. Bensaid, T. Mostephaoui, R. Nedjai

Abstract:

A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of the increasingly irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, as well as sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36, and 198/37, for the year 2020. As a second step, we explore the use of geospatial techniques to monitor the progression of sand dunes on developed (urban) lands as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barchans, etc.). For this purpose, the study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). This study was able to demonstrate that, under current conditions, urban lands are located in sand transit zones mobilized by winds from the northwest and southwest directions.

Keywords: land development, GIS, segmentation, remote sensing

Procedia PDF Downloads 133
7178 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer

Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu

Abstract:

Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical application scenarios, the size of the image to be located is not fixed; it is impractical to train different networks for all possible sizes. When its size does not match the input size of the descriptor extraction model, existing image geolocalization methods usually directly scale or crop the image in some common way. This results in the loss of information important to the geolocalization task, thus affecting the performance of the image geolocalization method. For example, excessive down-sampling can lead to blurred building contours, and inappropriate cropping can lead to the loss of key semantic elements, resulting in incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The designed learnable image resizer employs the self-attention mechanism to enhance the geometric features of the resized image. Firstly, it applies bilinear interpolation to the input image and its feature maps to obtain the initial resized image and the resized feature maps. Then, SKNet (selective kernel net) is used to approximate the best receptive field, thus keeping the geometric shapes consistent with the original image, and SENet (squeeze-and-excitation net) is used to automatically select the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed image geolocalization method embeds the above image resizer as a front layer of the descriptor extraction network. It not only enables the network to be compatible with arbitrary-sized input images but also enhances the geometric features that are crucial to the image geolocalization task. Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to optimize the utilization of geometric elements extracted by the first convolutional layer. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for image geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo24/7, and Places365. The results show that the proposed method has excellent size compatibility and compares favorably to recent mainstream geolocalization methods.
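
To make the channel-attention step concrete, the sketch below shows a standard squeeze-and-excitation (SE) block applied to the feature maps of a bilinearly resized image. It is a generic PyTorch illustration of the mechanism referenced in the abstract, not the authors' AS-Geo resizer; the image size, channel count, and reduction ratio are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight channels using globally pooled descriptors."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                                  # x: (B, C, H, W)
        w = F.adaptive_avg_pool2d(x, 1).flatten(1)         # squeeze: (B, C)
        w = self.fc(w).view(x.size(0), -1, 1, 1)           # excitation: channel weights
        return x * w                                       # reweighted feature maps

# Hypothetical usage: resize an arbitrary-sized image and emphasize contour-rich channels
image = torch.rand(1, 3, 481, 617)
resized = F.interpolate(image, size=(224, 224), mode="bilinear", align_corners=False)
features = nn.Conv2d(3, 64, kernel_size=3, padding=1)(resized)
enhanced = SEBlock(64)(features)
```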

Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature

Procedia PDF Downloads 195
7177 Impacts of Climate Change and Natural Gas Operations on the Hydrology of Northeastern BC, Canada: Quantifying the Water Budget for Coles Lake

Authors: Sina Abadzadesahraei, Stephen Déry, John Rex

Abstract:

Climate research has repeatedly identified strong associations between anthropogenic emissions of greenhouse gases and observed increases in global mean surface air temperature over the past century. Studies have also demonstrated that the degree of warming varies regionally. Canada is not exempt from this situation, and evidence is mounting that climate change is beginning to cause diverse impacts in both environmental and socio-economic spheres of interest. For example, northeastern British Columbia (BC), whose climate is controlled by a combination of maritime, continental, and arctic influences, is warming at a greater rate than the remainder of the province. There are indications that these changing conditions are already leading to shifting patterns in the region’s hydrological cycle, and thus its available water resources. Coincident with these changes, northeastern BC is undergoing rapid development for oil and gas extraction. This depends largely on subsurface hydraulic fracturing (‘fracking’), which uses enormous amounts of freshwater. While this industrial activity has made substantial contributions to regional and provincial economies, it is important to ensure that sufficient and sustainable water supplies are available for all those dependent on the resource, including ecological systems. This in turn demands a comprehensive understanding of how water in all its forms interacts with landscapes and the atmosphere, and of the potential impacts of changing climatic conditions on these processes. The aim of this study is therefore to characterize and quantify all components of the water budget in the small watershed of Coles Lake (141.8 km², 100 km north of Fort Nelson, BC), through a combination of field observations and numerical modelling. Baseline information will aid the assessment of the sustainability of current and future plans for freshwater extraction by the oil and gas industry and will help to maintain the precarious balance between economic and environmental well-being. This project is a perfect example of interdisciplinary research, in that it not only examines the hydrology of the region but also investigates how natural gas operations and growth can affect water resources. Therefore, a fruitful collaboration between academia, government, and industry has been established to fulfill the objectives of this research in a meaningful manner. This project aims to provide numerous benefits to BC communities. Further, the outcome and detailed information of this research can be a huge asset to researchers examining the effect of climate change on water resources worldwide.

Keywords: northeastern British Columbia, water resources, climate change, oil and gas extraction

Procedia PDF Downloads 243
7176 Characterisation of Fractions Extracted from Sorghum Byproducts

Authors: Prima Luna, Afroditi Chatzifragkou, Dimitris Charalampopoulos

Abstract:

Sorghum byproducts, namely bran, stalk, and panicle, are examples of lignocellulosic biomass. These raw materials contain large amounts of polysaccharides, in particular hemicelluloses, celluloses, and lignins, which, if efficiently extracted, can be utilised for the development of a range of added-value products with potential applications in the agriculture and food packaging sectors. The aim of this study was to characterise fractions extracted from sorghum bran and stalk with regard to their physicochemical properties, which could determine their applicability as food-packaging materials. A sequential alkaline extraction was applied for the isolation of cellulosic, hemicellulosic, and lignin fractions from sorghum stalk and bran. Lignin content, phenolic content, and antioxidant capacity were also investigated in the case of the lignin fraction. Thermal analysis using differential scanning calorimetry (DSC) and X-ray diffraction (XRD) revealed that the glass transition temperature (Tg) of the cellulose fraction of the stalk was ~78.33 °C, with an amorphous content of ~65% and a water content of ~5%. For hemicellulose, the Tg value of the stalk was slightly lower than that of the bran, with an amorphous content of ~54% and a lower water content (~2%). It is evident that hemicelluloses generally showed a lower thermal stability compared to cellulose, probably due to their lack of crystallinity. Additionally, the bran had a higher arabinose-to-xylose ratio (0.82) than the stalk, a fact that indicated its low crystallinity. Furthermore, the lignin fraction had a Tg value of ~93 °C with an amorphous content of ~11%. The stalk-derived lignin fraction contained more phenolic compounds (mainly consisting of p-coumaric and ferulic acid) and had a higher lignin content and antioxidant capacity compared to the bran-derived lignin fraction.

Keywords: alkaline extraction, bran, cellulose, hemicellulose, lignin, stalk

Procedia PDF Downloads 280
7175 Modeling of Daily Global Solar Radiation Using ANN Techniques: A Case Study

Authors: Said Benkaciali, Mourad Haddadi, Abdallah Khellaf, Kacem Gairaa, Mawloud Guermoui

Abstract:

In this study, many experiments were carried out to assess the influence of the input parameters on the performance of the multilayer perceptron, which is one configuration of artificial neural networks. To estimate the daily global solar radiation on a horizontal surface, we developed several models using seven combinations of twelve meteorological and geographical input parameters collected from a radiometric station installed at Ghardaïa city (southern Algeria). To select the best combination providing good accuracy, six statistical indicators were evaluated, such as the root mean square error, mean absolute error, correlation coefficient, and determination coefficient. We noted that the multilayer perceptron techniques give the best performance, except when the sunshine duration parameter is not included in the input variables. The maximum determination coefficient and correlation coefficient are equal to 98.20% and 99.11%, respectively. On the other hand, some empirical models were developed to compare their performance with that of the multilayer perceptron neural networks. The results obtained show that the neural network techniques give the best performance compared to the empirical models.
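
The statistical indicators used for model selection can be computed directly from the observed and predicted radiation series. The sketch below is a generic implementation of four of them (RMSE, MAE, correlation coefficient, determination coefficient); the example values are illustrative and do not correspond to the station data or MLP models of the study.

```python
import numpy as np

def evaluate(observed, predicted):
    """Return common goodness-of-fit indicators for a solar radiation model."""
    obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
    err = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))                              # root mean square error
    mae = np.mean(np.abs(err))                                     # mean absolute error
    r = np.corrcoef(obs, pred)[0, 1]                               # correlation coefficient
    r2 = 1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)    # determination coefficient
    return {"RMSE": rmse, "MAE": mae, "R": r, "R2": r2}

# Hypothetical usage with daily global radiation values (MJ/m^2)
print(evaluate([18.2, 20.1, 15.4, 22.3], [17.9, 19.5, 16.0, 21.8]))
```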

Keywords: empirical models, multilayer perceptron neural network, solar radiation, statistical formulas

Procedia PDF Downloads 325
7174 Quartz Crystal Microbalance Based Hydrophobic Nanosensor for Lysozyme Detection

Authors: F. Yılmaz, Y. Saylan, A. Derazshamshir, S. Atay, A. Denizli

Abstract:

The quartz crystal microbalance (QCM), a high-resolution mass-sensing technique, measures changes in mass on an oscillating quartz crystal surface by measuring changes in the oscillation frequency of the crystal in real time. Protein adsorption via hydrophobic interaction between the protein and a solid support, called hydrophobic interaction chromatography (HIC), can be favorable in many cases. Some nanoparticles can be effectively applied for HIC. HIC takes advantage of the hydrophobicity of proteins by promoting their separation on the basis of hydrophobic interactions between immobilized hydrophobic ligands and nonpolar regions on the surface of the proteins. Lysozyme is found in a variety of vertebrate cells and secretions, such as spleen, milk, tears, and egg white. Its common applications are as a cell-disrupting agent for the extraction of bacterial intracellular products, as an antibacterial agent in ophthalmologic preparations, as a food additive in milk products, and as a drug for the treatment of ulcers and infections. Lysozyme has also been used in cancer chemotherapy. The aim of this study is the synthesis of hydrophobic nanoparticles for lysozyme detection. For this purpose, methacryloyl-L-phenylalanine was chosen as the hydrophobic matrix. The hydrophobic nanoparticles were synthesized by the micro-emulsion polymerization method. The hydrophobic QCM nanosensor was then characterized by attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy, atomic force microscopy (AFM), and zeta size analysis. The hydrophobic QCM nanosensor was tested for real-time detection of lysozyme from aqueous solution. Kinetic and affinity studies were carried out using lysozyme solutions with different concentrations. The responses related to mass (Δm) and frequency (Δf) shifts were used to evaluate the adsorption properties.

Keywords: nanosensor, HIC, lysozyme, QCM

Procedia PDF Downloads 334
7173 A Review of Deep Learning Methods in Computer-Aided Detection and Diagnosis Systems based on Whole Mammogram and Ultrasound Scan Classification

Authors: Ian Omung'a

Abstract:

Breast cancer remains one of the deadliest cancers for women worldwide, with the risk of developing tumors being as high as 50 percent in Sub-Saharan African countries like Kenya. With as many as 42 percent of these cases diagnosed late, when the cancer has metastasized and/or the prognosis has become terminal, Full Field Digital [FFD] Mammography remains an effective screening technique that leads to early detection, where in most cases successful interventions can be made to control or eliminate the tumors altogether. FFD mammograms have proven to be far more effective when used together with Computer-Aided Detection and Diagnosis [CADe] systems, which rely on algorithmic implementations of deep learning techniques in computer vision to carry out pattern recognition comparable to the level of a human radiologist and to decipher whether specific areas of interest in the mammogram scan image portray abnormalities, if any, and whether these abnormalities are indicative of a benign or malignant tumor. Within this paper, we review emergent deep learning techniques that will prove relevant to the development of state-of-the-art FFD mammogram CADe systems. These techniques span self-supervised learning for context-encoded occlusion, self-supervised learning for pre-processing and labeling automation, as well as the creation of a standardized large-scale mammography dataset as a benchmark for the evaluation of CADe systems. Finally, comparisons are drawn between existing practices that pre-date these techniques and how the development of CADe systems that incorporate them will differ.

Keywords: breast cancer diagnosis, computer aided detection and diagnosis, deep learning, whole mammogram classification, ultrasound classification, computer vision

Procedia PDF Downloads 78
7172 Global Optimization Techniques for Optimal Placement of HF Antennas on a Shipboard

Authors: Mustafa Ural, Can Bayseferogulari

Abstract:

In this work, radio frequency (RF) coupling between two HF antennas on a shipboard platform is minimized by determining an optimal antenna placement. Unlike other works, the coupling is minimized not only at a single frequency but over the whole frequency band of operation. Two global optimization techniques, genetic algorithm optimization (GAO) and particle swarm optimization (PSO), are used to determine the optimal antenna placement. Throughout this work, the outputs of the two optimization techniques are compared with each other in terms of antenna placements and coupling results. At the end of the work, the far-field radiation pattern performance of the antennas at their optimal placements is analyzed in terms of directivity and coverage.
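
As a minimal illustration of the particle swarm side of the comparison, the sketch below minimizes a generic coupling objective over candidate antenna coordinates. The coupling surrogate, deck bounds, and PSO coefficients are placeholders for illustration and are not the shipboard electromagnetic model used in the work.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic particle swarm optimization minimizing `objective` within `bounds`."""
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    dim = len(bounds)
    pos = rng.uniform(low, high, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.apply_along_axis(objective, 1, pos)
    gbest = pbest[pbest_f.argmin()]

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, low, high)
        f = np.apply_along_axis(objective, 1, pos)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()]

    return gbest, pbest_f.min()

# Hypothetical coupling surrogate: coupling falls off with antenna separation on deck
def coupling_db(xy):
    separation = np.linalg.norm(xy[:2] - xy[2:])
    return -20 * np.log10(separation + 1e-6)      # smaller (more negative) is better

# Two antennas, each placed on a 40 m x 12 m deck (x1, y1, x2, y2)
best_placement, best_coupling = pso(coupling_db, bounds=[(0, 40), (0, 12)] * 2)
```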

Keywords: electromagnetic compatibility, antenna placement, optimization, genetic algorithm optimization, particle swarm optimization

Procedia PDF Downloads 212
7171 Methodology to Affirm Driver Engagement in Dynamic Driving Task (DDT) for a Level 2 ADAS Feature

Authors: Praneeth Puvvula

Abstract:

Autonomy has become increasingly common in modern automotive vehicles. There are five levels of autonomy as defined by SAE. This paper focuses on an SAE level 2 feature which, by definition, is able to control the vehicle longitudinally and laterally at the same time. The system keeps the vehicle centred within the lane by detecting the lane boundaries while maintaining the vehicle speed. As with the features from SAE level 1 to level 3, the primary responsibility for the dynamic driving task lies with the driver. This requires monitoring techniques to ensure the driver is always engaged even while the feature is active. This paper focuses on these techniques, which would help the safe usage of the feature and provide appropriate warnings to the driver.

Keywords: autonomous driving, safety, ADAS, automotive technology

Procedia PDF Downloads 70
7170 Plasma Collagen XVIII in Response to Intensive Aerobic Running and Aqueous Extraction of Black Crataegus Elbursensis in Male Rats

Authors: A. Abdi, A. Abbasi Daloee, A. Barari

Abstract:

Aim: The adaptations that occur in the human body after exercise training help healthy people stay away from certain diseases. One of the main adaptations is a change in blood circulation, especially in the vessels. The increase of capillary density depends on the balance between angiogenic and angiostatic factors. Most studies show that the changes in angiogenic factors resulting from physical exercise indicate a low level of stimulators compared with inhibitors. It is believed that the plasma level of VEGF-A, an important angiogenic factor, is reduced after physical exercise. Findings indicate that the extract of the crataegus plant reduces platelet-derived growth factor receptor (PDGFR) autophosphorylation in human fibroblasts. More importantly, crataegus (1 to 100 mg per liter) clearly leads to the inhibition of PDGFR autophosphorylation in vascular smooth muscle cells (VSMCs). Angiogenesis is a process that can be classified into physiological and pathophysiological forms. Collagen XVIII is a part of the extracellular proteins and heparan sulfate proteoglycans in vascular epithelial and endothelial basement membranes, and endostatin is released from the noncollagenous domain of collagen XVIII. Endostatin inhibits the growth of endothelial cells, inhibits angiogenesis, weakens different types of cancer, and suppresses the growth of tumors. The purpose of the current study was to investigate the effect of intensive aerobic running, with or without an aqueous extract of black Crataegus elbursensis, on collagen XVIII in male rats. Design: Thirty-two Wistar male rats (4-6 weeks old, 125-135 g in weight) were acquired from the Pasteur Institute (Amol, Mazandaran) and randomly assigned into control (n = 16) and training (n = 16) groups. Rats were further divided into saline-control (SC) (n = 8), saline-training (ST) (n = 8), crataegus pentaegyna extract-control (CPEC) (n = 8), and crataegus pentaegyna extract-training (CPET) (n = 8) groups. The control (SC and CPEC) groups remained sedentary, whereas the training groups underwent an intensive running exercise program. Plasma was collected and immediately frozen in liquid nitrogen. Statistical analysis was performed using a one-way analysis of variance and the Tukey test. Significance was accepted at P = 0.05. Results: The results show that the aerobic exercise group had the highest concentration of collagen XVIII compared to the other groups, followed by the black crataegus, training-crataegus, and control groups, respectively. Conclusion: In general, the researchers in this study concluded that the increase in collagen XVIII (albeit insignificant) as a result of physical activity and consumption of black crataegus extract could possibly serve as a regional inhibitor of angiogenesis and as further evidence for the anti-cancer effects of physical activity. Since plasma endostatin was not measured in this study, it is suggested that both indices be measured along with important angiogenic factors so that a more accurate interpretation of the changes in angiogenic and angiostatic factors resulting from physical exercise can be obtained.

Keywords: aerobic running, Crataegus elbursensis, Collagen XVIII

Procedia PDF Downloads 311
7169 A Comparison between Underwater Image Enhancement Techniques

Authors: Ouafa Benaida, Abdelhamid Loukil, Adda Ali Pacha

Abstract:

In recent years, the growing interest of scientists in the field of processing and analysis of underwater images and videos has been strengthened by the emergence of new underwater exploration techniques, such as autonomous underwater vehicles and underwater image sensors, facilitating the exploration of underwater mineral resources as well as the search for new species of aquatic life by biologists. Indeed, underwater images and videos have several defects and must be preprocessed before their analysis. Underwater landscapes are usually darkened due to the interaction of light with the marine environment: light is absorbed as it travels through deep waters depending on its wavelength. Additionally, light does not follow a linear direction but is scattered due to its interaction with microparticles in the water, resulting in low contrast, low brightness, color distortion, and restricted visibility. The improvement of underwater images is, therefore, more than necessary in order to facilitate their analysis. The research presented in this paper aims to implement and evaluate a set of classical techniques used for improving the quality of underwater images in several color representation spaces. These methods have the particularity of being simple to implement and do not require prior knowledge of the physical model at the origin of the degradation.
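
Two of the classical techniques named in the keywords, contrast limited adaptive histogram equalization (CLAHE) and single-scale retinex, can be sketched as follows with OpenCV. The parameter values and color-space choice are illustrative assumptions; the paper's exact settings may differ.

```python
import cv2
import numpy as np

def clahe_enhance(bgr, clip=2.0, tiles=(8, 8)):
    """Apply CLAHE to the luminance channel in the Lab color space."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    l = cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles).apply(l)
    return cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)

def single_scale_retinex(bgr, sigma=80):
    """Single-scale retinex: log(image) minus log of its Gaussian-blurred illumination."""
    img = bgr.astype(np.float64) + 1.0
    blur = cv2.GaussianBlur(img, (0, 0), sigma)
    ssr = np.log(img) - np.log(blur)
    ssr = cv2.normalize(ssr, None, 0, 255, cv2.NORM_MINMAX)   # rescale for display
    return ssr.astype(np.uint8)

# Hypothetical usage on a dark synthetic frame standing in for an underwater image
frame = np.random.default_rng(0).integers(0, 80, (240, 320, 3), dtype=np.uint8)
enhanced = single_scale_retinex(clahe_enhance(frame))
```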

Keywords: underwater image enhancement, histogram normalization, histogram equalization, contrast limited adaptive histogram equalization, single-scale retinex

Procedia PDF Downloads 69
7168 Determination of a Novel Artificial Sweetener Advantame in Food by Liquid Chromatography Tandem Mass Spectrometry

Authors: Fangyan Li, Lin Min Lee, Hui Zhu Peh, Shoet Harn Chan

Abstract:

Advantame, a derivative of aspartame, is the latest addition to a family of low-caloric and highly potent dipeptide sweeteners which includes aspartame, neotame, and alitame. The use of advantame as a high-intensity sweetener in food was first accepted by Food Standards Australia New Zealand in 2011 and subsequently by US and EU food authorities in 2014, with the results from toxicity and exposure studies showing that advantame poses no safety concern to the public at regulated levels. To our knowledge, there is currently barely any detailed information on analytical methods for advantame in food matrices, except for one report published in Japanese describing a high performance liquid chromatography (HPLC) and liquid chromatography/mass spectrometry (LC-MS) method with a detection limit at the ppm level. However, the use of acid in sample preparation and instrumental analysis in that report raised doubt over the reliability of the method, as there is an indication that the stability of advantame is compromised under acidic conditions. Besides, the method may not be suitable for analyzing food matrices containing advantame at low ppm or sub-ppm levels. In this presentation, a simple, specific, and sensitive method for the determination of advantame in food is described. The method involved extraction with water and clean-up via solid phase extraction (SPE), followed by detection using liquid chromatography tandem mass spectrometry (LC-MS/MS) in negative electrospray ionization mode. No acid was used in the entire procedure. Single-laboratory validation of the method was performed in terms of linearity, precision, and accuracy. A low detection limit at the ppb level was achieved. Satisfactory recoveries were obtained using spiked samples at three different concentration levels. This validated method could be used in the routine inspection of advantame levels in food.

Keywords: advantame, food, LC-MS/MS, sweetener

Procedia PDF Downloads 450
7167 Feasibility of Washing/Extraction Treatment for the Remediation of Deep-Sea Mining Tailings

Authors: Kyoungrean Kim

Abstract:

The importance of deep-sea mineral resources is increasing dramatically due to the depletion of land mineral resources accompanying increasing human economic activity. Korea has acquired exclusive exploration licenses in four areas: the Clarion-Clipperton Fracture Zone in the Pacific Ocean (2002), Tonga (2008), Fiji (2011), and the Indian Ocean (2014). The start of commercial mining by Nautilus Minerals (Canada) and Lockheed Martin (USA) is expected by 2020. The London Protocol 1996 (LP) under the International Maritime Organization (IMO) and the International Seabed Authority (ISA) will set environmental guidelines for deep-sea mining by 2020 to protect the marine environment. In this research, the applicability of washing/extraction treatment for the remediation of deep-sea mining tailings was evaluated in order to present preliminary data for developing practical remediation technology in the near future. Polymetallic nodule samples were collected at the Clarion-Clipperton Fracture Zone in the Pacific Ocean and then stored at room temperature. Samples were pulverized using a jaw crusher and a ball mill and then classified into three particle sizes (> 63 µm, 63-20 µm, < 20 µm) using vibratory sieve shakers (Analysette 3 Pro, Fritsch, Germany) with 63 µm and 20 µm sieves. Only the 63-20 µm particle size fraction was used for the investigation, considering the lower limit of the ore dressing process, which is tens to 100 µm. Rhamnolipid and sodium alginate, as biosurfactants, and aluminum sulfate, which is mainly used as a flocculant, were used as environmentally friendly additives. Samples were adjusted to a 2% suspension with deionized water and then mixed with various concentrations of additives. The mixture was stirred with a magnetic bar for specific reaction times, and then the liquid phase was separated by a centrifugal separator (Thermo Fisher Scientific, USA) at 4,000 rpm for 1 h. The separated liquid was filtered with a syringe and an acrylic-based filter (0.45 µm). The extracted heavy metals in the filtered liquid were then determined using a UV-Vis spectrometer (DR-5000, Hach, USA) and a heat block (DBR 200, Hach, USA) following US EPA methods (8506, 8009, 10217, and 10220). The polymetallic nodules were mainly composed of manganese (27%), iron (8%), nickel (1.4%), copper (1.3%), cobalt (1.3%), and molybdenum (0.04%). Based on the remediation standards of various countries, nickel (Ni), copper (Cu), cadmium (Cd), and zinc (Zn) were selected as the primary target materials. Throughout this research, the use of rhamnolipid was shown to be an effective approach for removing heavy metals from samples originating from manganese nodules. Sodium alginate might also be one of the effective additives for the remediation of deep-sea mining tailings such as polymetallic nodules. Compared to rhamnolipid and sodium alginate, aluminum sulfate was a more effective additive at short reaction times (within 4 h). Based on these results, sequential particle separation, selective extraction/washing, advanced filtration of the liquid phase, water treatment without dewatering, and solidification/stabilization may be considered candidate technologies for the remediation of deep-sea mining tailings.

Keywords: deep-sea mining tailings, heavy metals, remediation, extraction, additives

Procedia PDF Downloads 141
7166 Optimize Data Evaluation Metrics for Fraud Detection Using Machine Learning

Authors: Jennifer Leach, Umashanger Thayasivam

Abstract:

The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, though, as society’s knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate people. This has led to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help counter this advancement. This research explores how the use of various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent data; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which split and technique would lead to the best results.
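
A minimal sketch of the evaluation loop described above is shown below using scikit-learn. The classifier, the synthetic features, and the five split ratios are assumptions for illustration rather than the study's actual data or models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def evaluate_splits(X, y, test_sizes=(0.1, 0.2, 0.3, 0.4, 0.5), seed=42):
    """Train a model on several train/test splits and record the four metrics."""
    results = {}
    for ts in test_sizes:
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=ts, stratify=y, random_state=seed)
        model = RandomForestClassifier(random_state=seed).fit(X_tr, y_tr)
        pred = model.predict(X_te)
        results[ts] = {
            "accuracy": accuracy_score(y_te, pred),
            "precision": precision_score(y_te, pred),
            "recall": recall_score(y_te, pred),
            "f1": f1_score(y_te, pred),
        }
    return results

# Hypothetical synthetic data: about 1% of transactions flagged as fraudulent
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 10))
y = (rng.random(5000) < 0.01).astype(int)
X[y == 1] += 2.0                       # make fraud cases separable for the demo
print(evaluate_splits(X, y))
```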

Keywords: data science, fraud detection, machine learning, supervised learning

Procedia PDF Downloads 169
7165 Exploratory Study of the Influencing Factors for Hotels' Competitors

Authors: Asma Ameur, Dhafer Malouche

Abstract:

Hotel competitiveness research is an essential phase of the marketing strategy for any hotel. Certainly, knowing a hotel's competitors helps the hotelier grasp its position in the market and helps the customer make the right choice when picking a hotel. Thus, competitiveness is an important indicator that can be influenced by various factors. In fact, the issue of competitiveness, this ability to cope with competition, remains a difficult and complex concept to define and to exploit. Therefore, the purpose of this article is to conduct an exploratory study to calculate a competitiveness indicator for hotels. Further, this paper makes it possible to determine the criteria with a direct or indirect effect on the image and perception of a hotel. The present research looks into the right model for hotel competitiveness. For this reason, we exploit different theoretical contributions in the field of machine learning. Thus, we use statistical techniques such as Principal Component Analysis (PCA) to reduce the dimensions, as well as other statistical modeling techniques. This paper presents a survey covering the techniques and methods in hotel competitiveness research. Furthermore, this study allows us to deduce the significant variables that influence the determination of a hotel's competitors. Lastly, the experiences discussed in this article found that a hotel's competitors are influenced by several factors to different degrees.
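
The dimension-reduction step mentioned above can be illustrated with scikit-learn as follows. The hotel attributes and the construction of a composite indicator from the first principal component are illustrative assumptions, not the authors' final model.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical hotel attributes (e.g., scraped review scores and facility counts)
hotels = pd.DataFrame({
    "review_score": [8.1, 7.4, 9.0, 6.8, 8.6],
    "price":        [120, 85, 210, 70, 150],
    "n_reviews":    [950, 400, 2100, 150, 1200],
    "star_rating":  [4, 3, 5, 3, 4],
}, index=["H1", "H2", "H3", "H4", "H5"])

# Standardize the variables, then project onto the principal components
scaled = StandardScaler().fit_transform(hotels)
pca = PCA(n_components=2)
components = pca.fit_transform(scaled)

# A simple competitiveness indicator: the score on the first principal component
hotels["competitiveness"] = components[:, 0]
print(pca.explained_variance_ratio_)           # share of variance captured per component
print(hotels.sort_values("competitiveness", ascending=False))
```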

Keywords: competitiveness, e-reputation, hotels' competitors, online hotel’ review, principal component analysis, statistical modeling

Procedia PDF Downloads 101
7164 A Study on the Effectiveness of Translanguaging in EFL Classrooms: The Case of First-year Japanese University Students

Authors: Malainine Ebnou

Abstract:

This study investigates the effectiveness of using translanguaging techniques in EFL classrooms. The interest in this topic stems from the lack of research on the effectiveness of translanguaging techniques in foreign language learning, both domestically in Japan and globally, as research has focused on translanguaging from a teaching perspective but not much from a learning perspective. The main question from which the study departs is whether students’ use of translanguaging techniques can produce better learning outcomes when used at the university level. The sample population of the study is first-year Japanese university students. The study takes an experimental approach where translanguaging is introduced to one group, the experimental group, and withheld from another group, the control group. Both groups will then be assessed and compared to see if the use of translanguaging has had a positive impact on learning. The research could have an impact in three ways: challenging the prevailing argument that using learners' mother tongue in the classroom is detrimental to the learning process, challenging native-speaker-centered approaches in the EFL field, and arguing that translanguaging in EFL classrooms can produce more meaningful learning outcomes. If the effectiveness of translanguaging is confirmed, it will be possible to promote the use of translanguaging in English learning at Japanese universities and contribute to the improvement of students' English, and even lay the foundations for extending the use of translanguaging to people of other ages/nationalities and other languages in the future.

Keywords: translanguaging, EFL, language learning and teaching, applied linguistics

Procedia PDF Downloads 40
7163 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM

Authors: Rajpal Kaur, Pooja Choudhary

Abstract:

Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based recognition system for offline signatures that is trained with low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being which can be used to authenticate human identity. The signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition. Multiple techniques have been defined for signature recognition, with much scope for research. In this paper, off-line (static) signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The off-line signature verification and recognition is implemented using the MATLAB platform. This work has been analyzed and tested and found suitable for its purpose. The proposed method performs better than other recently proposed methods.

Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM

Procedia PDF Downloads 365
7162 The Use of Methods and Techniques of Drama Education with Kindergarten Teachers

Authors: Vladimira Hornackova, Jana Kottasova, Zuzana Vanova, Anna Jungrova

Abstract:

The present study deals with drama education in preschool education. The research in this field brings a qualitative comparative survey with the aim of finding out how methods and techniques of drama education are used in preschool education by university-educated and secondary-school-educated preschool teachers. The research uses a content analysis and a non-standardized questionnaire for preschool teachers, and the data obtained are processed with the help of descriptive methods and correlations. The results allow a comparison of the aspects applied through drama in preschool education. The research provides impulses for improving education in kindergartens and inspiration for university study programs of drama education in the professional training of preschool teachers.

Keywords: drama education, preschool education, preschool teacher, research

Procedia PDF Downloads 349
7161 Efficacy of a Wiener Filter Based Technique for Speech Enhancement in Hearing Aids

Authors: Ajish K. Abraham

Abstract:

The hearing aid is the most fundamental technology employed for the rehabilitation of persons with sensorineural hearing impairment. Hearing in noise is still a matter of major concern for many hearing aid users and thus continues to be a challenging issue for hearing aid designers. Several techniques are currently used to enhance the speech at the hearing aid output. Most of these techniques, when implemented, result in a reduction of the intelligibility of the speech signal. Thus, dissatisfaction among hearing aid users in comprehending the desired speech amidst noise prevails. The multichannel Wiener filter is widely implemented in binaural hearing aid technology for noise reduction. In this study, a Wiener-filter-based noise reduction approach is tested for a single-microphone hearing aid setup. The method checks the status of the input speech signal in each frequency band and then selects the relevant noise reduction procedure. Results showed that the Wiener-filter-based algorithm is capable of enhancing speech even when the input acoustic signal has a very low signal-to-noise ratio (SNR). The performance of the algorithm was compared with other similar algorithms on the basis of the improvement in intelligibility and SNR of the output at different SNR levels of the input speech. The Wiener-filter-based algorithm provided significant improvement in SNR and intelligibility compared to the other techniques.
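
As a rough sketch of the frequency-band processing described above, the code below applies a single-channel spectral Wiener gain per STFT bin, estimating the noise spectrum from an assumed speech-free initial segment. This is a generic illustration of Wiener-filter noise reduction, not the hearing aid algorithm evaluated in the study; the sampling rate, frame length, and gain floor are assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

def wiener_enhance(noisy, fs=16000, noise_seconds=0.25, nperseg=512):
    """Single-channel spectral Wiener filtering of a noisy speech signal."""
    f, t, spec = stft(noisy, fs=fs, nperseg=nperseg)
    power = np.abs(spec) ** 2

    # Assume the first `noise_seconds` contain noise only; average its power per band
    n_noise_frames = max(1, int(noise_seconds * fs / (nperseg // 2)))
    noise_psd = power[:, :n_noise_frames].mean(axis=1, keepdims=True)

    # SNR estimate per bin and Wiener gain G = SNR / (1 + SNR), floored to limit distortion
    snr = np.maximum(power / noise_psd - 1.0, 0.0)
    gain = np.maximum(snr / (1.0 + snr), 0.1)

    _, enhanced = istft(gain * spec, fs=fs, nperseg=nperseg)
    return enhanced[:len(noisy)]

# Hypothetical usage: a tone (standing in for speech) buried in white noise
fs = 16000
t = np.arange(fs) / fs
clean = 0.5 * np.sin(2 * np.pi * 440 * t) * (t > 0.3)   # signal starts after a noise-only lead-in
noisy = clean + 0.3 * np.random.default_rng(0).standard_normal(fs)
enhanced = wiener_enhance(noisy, fs)
```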

Keywords: hearing aid output speech, noise reduction, SNR improvement, Wiener filter, speech enhancement

Procedia PDF Downloads 233