Search results for: dynamic selection
4343 Characterisation of Chitooligomers Prepared with the Aid of Cellulase, Xylanase and Chitosanase
Authors: Anna Zimoch-Korzycka, Dominika Kulig, Andrzej Jarmoluk
Abstract:
The aim of this study was to obtain chitooligosaccharides with better functional properties from chitosan using three different enzyme preparations and to compare the products of enzymatic hydrolysis. Commercially available cellulase (CL), xylanase (X) and chitosanase (CS) preparations were used to investigate hydrolytic activity on chitosan (CH) with low molecular weight and a DD of 75-85%. It has been reported that CL and X have side activities of other enzymes, such as β-glucanase or β-glucosidase, and that the CS preparation has a side activity of chitinase. Each preparation was used at 1000 U of activity under the same reaction conditions. The degree of deacetylation and molecular weight of chitosan were determined using titration and viscometric methods, respectively. The hydrolytic activity of the enzyme preparations on chitosan was monitored by dynamic viscosity measurement. After a 4 h reaction with stirring, the solutions were filtered and chitosan oligomers were separated by methanol solution into two fractions: precipitate (A) and supernatant (B). Fourier-transform infrared spectroscopy was used to characterize the structural changes of the chitosan oligomer fractions and the initial chitosan. Furthermore, the solubility of the lyophilized hydrolytic mixture (C) and the two chitooligomer fractions (A, B) from each enzyme hydrolysis was assayed. The antioxidant activity of the chitosan oligomers was evaluated as DPPH free radical scavenging activity. The dynamic viscosity measured after addition of the enzyme preparations to the chitosan solution decreased dramatically over time in the sample with X in comparison to the solution without enzyme. For mixtures with CL and CS, lower viscosities were also recorded, but not as low as those with X. The A and B fractions obtained by xylanase hydrolysis had the most similar viscosities, 15 mPa·s and 9 mPa·s, respectively.
Structural changes of the chitosan oligomers A, B and C, and their differences related to the various enzyme preparations used, were confirmed. The A fractions could not be filtered, so their water solubility was not recorded. The solubility of the supernatants was approximately 95%, higher than that of the hydrolytic mixture. The DPPH radical scavenging effect of the A, B and C samples was highest for the X products, at approximately 13, 17 and 19%, respectively. In summary, a mixture of chitooligomers may be useful for the design of edible protective coatings due to their improved biophysical properties.
Keywords: cellulase, xylanase, chitosanase, chitosan, chitooligosaccharides
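The DPPH scavenging percentages above follow the standard radical-scavenging assay calculation. As an illustration (the absorbance values below are hypothetical, not taken from the study), a minimal sketch:

```python
def dpph_scavenging(abs_control: float, abs_sample: float) -> float:
    """Percent DPPH radical scavenging, the standard assay formula:
    (A_control - A_sample) / A_control * 100."""
    return (abs_control - abs_sample) / abs_control * 100.0

# Hypothetical absorbance readings at 517 nm
print(round(dpph_scavenging(1.00, 0.81), 1))  # 19.0, comparable to the ~19% reported for fraction C
```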
Procedia PDF Downloads 326
4342 Exploring the Differences between Self-Harming and Suicidal Behaviour in Women with Complex Mental Health Needs
Authors: Sophie Oakes-Rogers, Di Bailey, Karen Slade
Abstract:
Female offenders are a uniquely vulnerable group at high risk of suicide. Whilst the prevention of self-harm and suicide remains a key global priority, we need to better understand the relationship between these challenging behaviours, which constitute a pressing problem, particularly in environments designed to prioritise safety and security. Method choice is unlikely to be random; it is instead influenced by a range of cultural, social, psychological and environmental factors, which change over time and between countries. A key aspect of self-harm and suicide in women receiving forensic care is the lack of free access to methods. At a time when self-harm and suicide rates continue to rise internationally, understanding the role of these influencing factors and the impact of current suicide prevention strategies on the use of near-lethal methods is crucial. This poster will present findings from 25 interviews and 3 focus groups, which used a Participatory Action Research approach to explore the differences between self-harming and suicidal behaviour. A key element of this research was drawing on the lived experiences of women receiving forensic care in one forensic pathway in the UK, and of the staff who care for them, to discuss the role of near-lethal self-harm (NLSH). The findings and suggestions from the lived accounts of the women and staff will inform a draft assessment tool that better assesses the risk of suicide based on the lethality of methods. This tool will be the first of its kind to specifically capture the needs of women receiving forensic services. Preliminary findings indicate that women engage in NLSH for two key reasons, determined by their history of self-harm. Women with a history of superficial, non-life-threatening self-harm appear to engage in NLSH in response to a significant life event such as family bereavement or sentencing.
For these women, suicide appears to be a realistic option to overcome their distress. This, however, differs from women with a lifetime history of NLSH, who engage in such behaviour in a bid to overcome the grief and shame associated with historical abuse. NLSH in these women reflects a lifetime of suicidality and indicates that they pose the greatest risk of completed suicide. Findings also indicate differences in method selection between forensic provisions. Restriction of means appears to play a role in method selection, and findings suggest it causes method substitution. Implications will be discussed relating to the screening of female forensic patients and improvements to current suicide prevention strategies.
Keywords: forensic mental health, method substitution, restriction of means, suicide
Procedia PDF Downloads 178
4341 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models
Authors: Anastasiia Yu. Timofeeva
Abstract:
Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline. The spline is represented as a piecewise-linear function, and for each linear portion an orthogonal regression is estimated. This algorithm is iterative. The second algorithm involves locally weighted regression estimation. When the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results have shown the advantage of the second algorithm under the assumption that the true smoothing parameter values are known. Nevertheless, using goodness-of-fit indexes for smoothing parameter selection gives similar results, with an oversmoothing effect.
Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression
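The per-segment fit in the first algorithm relies on orthogonal regression, which minimizes perpendicular rather than vertical distances and so handles error in both variables. A minimal total-least-squares line fit via SVD (a sketch, not the authors' implementation):

```python
import numpy as np

def orthogonal_line_fit(x, y):
    """Total-least-squares (orthogonal regression) fit of y = a + b*x.

    The line's normal vector is the right singular vector of the centered
    data matrix with the smallest singular value, which minimizes the sum
    of squared perpendicular distances.
    """
    X = np.column_stack([x - np.mean(x), y - np.mean(y)])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    nx, ny = vt[-1]             # normal vector of the best-fit line
    b = -nx / ny                # slope
    a = np.mean(y) - b * np.mean(x)
    return a, b

# Noisy points around y = 1 + 2x, with errors in both variables
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
a, b = orthogonal_line_fit(x + rng.normal(0, 0.1, x.size),
                           1 + 2 * x + rng.normal(0, 0.1, x.size))
print(round(a, 1), round(b, 1))  # close to the true intercept 1 and slope 2
```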
Procedia PDF Downloads 416
4340 Dynamics of an Invasive Insect Gut Microbiome When Facing Abiotic Stress
Authors: Judith Mogouong, Philippe Constant, Robert Lavallee, Claude Guertin
Abstract:
The emerald ash borer (EAB) is an exotic wood-boring insect native to China, associated with significant environmental and economic damage in North America. Beetles are known to be vectors of microbial communities related to their adaptive capacities. It is now established that environmental stress factors may induce physiological events in host trees, such as phytochemical changes, which may in turn affect the establishment behaviour of herbivorous insects. Considering the number of insects collected on ash trees (insect density) as an abiotic factor related to stress damage, the aim of our study was to explore the dynamics of the EAB gut microbial community genome (microbiome) in response to that factor and to monitor its diversity. Insects were trapped using specific green Lindgren traps. A gradient of the captured insect population along the St. Lawrence River was used to create three levels of insect density (low, intermediate, and high). After dissection, total DNA extracted from the insect guts at each level was sent for amplicon sequencing of the bacterial 16S rRNA gene and the fungal ITS2 region. The composition of the microbial communities among samples was highly diversified, with the Simpson index significantly different across the three density levels for bacteria. In addition, bacteria were represented by seven phyla and twelve classes, whereas fungi were represented by two phyla and seven known classes. Using principal coordinate analysis (PCoA) based on Bray-Curtis distances of the 16S rRNA sequences, we observed significant variation in the structure of the bacterial communities depending on insect density. Moreover, the analysis showed significant correlations between some bacterial taxa and the three classes of insect density.
This study is the first to present a complete overview of the bacterial and fungal communities associated with the gut of the EAB based on culture-independent methods, and to correlate those communities with a potential stress factor of the host trees.
Keywords: gut microbiome, DNA, 16S rRNA sequences, emerald ash borer
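The ordination step described above (PCoA on Bray-Curtis distances) can be sketched as classical multidimensional scaling on a small abundance table; the counts below are illustrative, not the study's data:

```python
import numpy as np

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    return np.abs(a - b).sum() / (a + b).sum()

def pcoa(d, k=2):
    """Principal coordinate analysis (classical MDS) on a distance matrix."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    g = -0.5 * j @ (d ** 2) @ j                  # double-centred Gower matrix
    w, v = np.linalg.eigh(g)
    idx = np.argsort(w)[::-1][:k]                # largest eigenvalues first
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Hypothetical OTU abundance table: rows = gut samples, columns = taxa
counts = np.array([[10, 0, 5], [8, 1, 6], [0, 12, 2]], float)
n = counts.shape[0]
d = np.array([[bray_curtis(counts[i], counts[j]) for j in range(n)]
              for i in range(n)])
coords = pcoa(d)  # 2-D ordination of the three samples
```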
Procedia PDF Downloads 403
4339 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning
Authors: Saahith M. S., Sivakami R.
Abstract:
In the realm of football analytics, particularly focusing on predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces an elaborate examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails the compilation of an extensive dataset comprising player attributes, conducting data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis within this study will involve delving into feature significance using methodologies like Select Best and Recursive Feature Elimination (RFE) to pinpoint pertinent attributes for predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), will be explored to develop predictive models. The evaluation of each model's performance utilizing metrics such as Mean Squared Error (MSE) and R-squared will be executed to gauge their efficacy in predicting player performance. Furthermore, this investigation will encompass a top player analysis to recognize the top-performing players based on the anticipated overall performance scores. Nationality analysis will entail scrutinizing the player distribution based on nationality and investigating potential correlations between nationality and player performance. Positional analysis will concentrate on examining the player distribution across various positions and assessing the average performance of players in each position. Age analysis will evaluate the influence of age on player performance and identify any discernible trends or patterns associated with player age groups. 
The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis
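The evaluation metrics named above (MSE and R-squared) can be illustrated with a minimal regression sketch; the attribute values below are hypothetical, and plain least squares stands in for the study's model suite:

```python
import numpy as np

def fit_linear(X, y):
    """Least-squares linear model (a stand-in for the study's regressors)."""
    Xb = np.column_stack([np.ones(len(X)), X])   # add intercept column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

def mse(y_true, y_pred):
    """Mean squared error of the predictions."""
    return float(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1 - ss_res / ss_tot)

# Hypothetical attributes (pace, passing, dribbling) vs. overall rating
X = np.array([[70, 65, 68], [85, 80, 82], [60, 72, 64], [90, 88, 91]], float)
y = np.array([67.0, 82.0, 65.0, 90.0])
coef = fit_linear(X, y)
pred = predict(coef, X)
print(mse(y, pred), r_squared(y, pred))
```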
Procedia PDF Downloads 38
4338 Determination of Optimum Strike Price of FX Option Call Spread with USD/IDR Volatility and Garman–Kohlhagen Model Analysis
Authors: Bangkit Adhi Nugraha, Bambang Suripto
Abstract:
In September 2016, Bank Indonesia (BI) released regulation no. 18/18/PBI/2016 permitting bank clients to use the FX option call spread on USD/IDR. Basically, this product combines buying an FX call option (paying a premium) and selling an FX call option (receiving a premium) to protect against currency depreciation while capping the potential upside at a cheap premium cost. BI classifies this product as a structured product, i.e., a combination of at least two financial instruments, either derivative or non-derivative. The call spread is the first structured product against IDR permitted by BI since 2009, in response to increased demand from Indonesian firms for FX hedging through derivatives to protect their foreign currency assets or liabilities against market risk. The share of hedging products on the Indonesian FX market increased from 35% in 2015 to 40% in 2016, with the majority in swap products (FX forward, FX swap, cross currency swap). Swap pricing is driven by the interest rate differential of the currency pair. The cost of a swap is around 7% for USD/IDR, with one-year USD/IDR volatility at 13%. That cost level makes swap products seem expensive to hedging buyers. Because the call spread costs around 1.5-3%, cheaper than a swap, most Indonesian firms use NDF FX call spreads on USD/IDR offshore, with an outstanding amount of around 10 billion USD. This cheaper cost is the call spread's main advantage for hedging buyers. The problem arises because the BI regulation requires the call spread buyer to perform dynamic hedging: if the buyer chooses strike price 1 and strike price 2 and the USD/IDR exchange rate surpasses strike price 2, the buyer must buy another call spread with strike price 1’ (strike price 1’ = strike price 2) and strike price 2’ (strike price 2’ > strike price 1‘).
This could double the premium cost of the call spread, or more, and defeat the hedging buyer's purpose of finding the cheapest hedging cost. It is therefore crucial for the buyer to choose the optimum strike prices before entering into the transaction. To help hedging buyers find the optimum strike prices and avoid expensive multiple premium costs, we examined ten years (2005-2015) of historical USD/IDR volatility data and compared it with the price movement of the USD/IDR call spread using the Garman–Kohlhagen model (a common formula for FX option pricing). We used statistical tools to analyse data correlation, understand the nature of the call spread's price movement over the ten years, and determine the factors affecting it. We selected a range of strike prices and tenors and calculated the probability of dynamic hedging occurring and how much it would cost. We found that the USD/IDR currency pair is too uncertain, making dynamic hedging riskier and more expensive. We validated this result using one year of data, which showed a small RMS error. The results could be used to understand the nature of the FX call spread and to determine optimum strike prices for a hedging plan.
Keywords: FX call spread USD/IDR, USD/IDR volatility statistical analysis, Garman–Kohlhagen Model on FX Option USD/IDR, Bank Indonesia Regulation no.18/18/PBI/2016
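The Garman–Kohlhagen model referenced above prices a European FX call from the spot rate, the domestic and foreign interest rates, and the volatility; the call spread premium is then the difference between two calls. A sketch with illustrative (not the study's) market parameters:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def gk_call(spot, strike, t, rd, rf, vol):
    """Garman-Kohlhagen price of a European FX call.

    rd: domestic (IDR) rate, rf: foreign (USD) rate, vol: annualised volatility.
    """
    d1 = (log(spot / strike) + (rd - rf + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * exp(-rf * t) * norm_cdf(d1) - strike * exp(-rd * t) * norm_cdf(d2)

def call_spread(spot, k1, k2, t, rd, rf, vol):
    """Buy a call at k1, sell a call at k2 > k1: the hedge in the abstract."""
    return gk_call(spot, k1, t, rd, rf, vol) - gk_call(spot, k2, t, rd, rf, vol)

# Hypothetical USD/IDR levels: spot 13000, strikes 13500/14500, 1y tenor,
# 13% volatility (the one-year figure quoted in the abstract), illustrative rates
premium = call_spread(13000, 13500, 14500, 1.0, 0.07, 0.01, 0.13)
print(round(premium, 1))  # premium in IDR per USD of notional
```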
Procedia PDF Downloads 380
4337 Enhancing Single Channel Minimum Quantity Lubrication through Bypass Controlled Design for Deep Hole Drilling with Small Diameter Tool
Authors: Yongrong Li, Ralf Domroes
Abstract:
Due to significant energy savings, the enablement of higher machining speeds, and environmentally friendly features, Minimum Quantity Lubrication (MQL) has been used efficiently for many machining processes. However, in deep hole drilling with small tool diameters (D < 5 mm) and long tools (length L > 25xD), a single channel MQL system is always a bottleneck. Single channel MQL, based on the Venturi principle, suffers from an insufficient oil quantity caused by the drop in pressure difference during the deep hole drilling process. In this paper, a system concept based on a bypass design is explored for its ability to dynamically reach the required pressure difference between the air inlet and the inside of the aerosol generator, so that the volume of oil demanded by deep hole drilling can be generated and delivered to the tool tips. The system concept was investigated in static and dynamic laboratory testing. In the static test, the oil volume with and without bypass control was measured, showing a potential oil quantity increase of up to 1000%. A spray pattern test demonstrated the differences in aerosol particle size, aerosol distribution and reaction time between the single channel and bypass controlled single channel MQL systems. A dynamic trial machining test of deep hole drilling (drill tool D = 4.5 mm, L = 40xD) was carried out with the proposed system on AlSi7Mg, a difficult-to-machine material. Tool wear along 100 m of drilling was tracked and analyzed. The results show that single channel MQL with bypass control can overcome the limitation and enhance deep hole drilling with a small tool. The optimized combination of inlet air pressure and bypass control results in high quality oil delivery to the tool tips with a uniform and continuous aerosol flow.
Keywords: deep hole drilling, green production, Minimum Quantity Lubrication (MQL), near dry machining
Procedia PDF Downloads 205
4336 Design and Implementation of a 10-bit SAR ADC with a Programmable Reference
Authors: Hasmayadi Abdul Majid, Yuzman Yusoff, Noor Shelida Salleh
Abstract:
This paper presents the development of a single-ended 38.5 kS/s 10-bit programmable reference SAR ADC realized in MIMOS’s 0.35 µm CMOS process. The design uses a resistive DAC, a dynamic comparator with a pre-amplifier and SAR digital logic to achieve 10 effective bits. Programmable reference circuitry allows the ADC to operate with different input ranges from 0.6 V to 2.1 V. The ADC consumes less than 7.5 mW from a 3 V supply.
Keywords: successive approximation register analog-to-digital converter, SAR ADC, resistive DAC, programmable reference
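The successive-approximation conversion at the heart of a SAR ADC is a binary search of the input voltage against an internal DAC, one bit per clock cycle. A behavioural sketch (the comparator, DAC and reference are idealized, and the range values below only mirror the abstract's 0.6 V setting):

```python
def sar_adc(vin: float, vref_low: float, vref_high: float, bits: int = 10) -> int:
    """Behavioural model of a SAR ADC: binary-search the input against a DAC.

    The programmable reference sets the conversion range [vref_low, vref_high],
    mirroring the 0.6 V - 2.1 V input ranges described in the abstract.
    """
    code = 0
    for bit in reversed(range(bits)):
        trial = code | (1 << bit)                  # tentatively set the next bit
        vdac = vref_low + (vref_high - vref_low) * trial / (1 << bits)
        if vin >= vdac:                            # comparator decision
            code = trial                           # keep the bit
    return code

# A mid-scale input with a 0.6 V reference range converts to the mid code
print(sar_adc(0.3, 0.0, 0.6))  # 512 (mid-scale of a 10-bit range)
```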
Procedia PDF Downloads 518
4335 Evaluation of Low-Global Warming Potential Refrigerants in Vapor Compression Heat Pumps
Authors: Hamed Jafargholi
Abstract:
Global warming presents an immense environmental risk, threatening ecological systems and coastal areas. Implementing efficient measures to minimize greenhouse gas emissions and the use of fossil fuels is essential to reducing it. Vapor compression heat pumps provide a practical method for harnessing energy from waste heat sources and reducing energy consumption. However, the traditional working fluids used in these heat pumps generally have a high global warming potential (GWP), which can cause severe greenhouse effects if they are released. The emphasis on low-GWP (below 150) refrigerants aims to advance vapor compression heat pump technology. A classification system for vapor compression heat pumps is offered, with boundaries based on the required heat temperature and advancements in heat pump technology: a heat pump can be classified as a low temperature heat pump (LTHP), medium temperature heat pump (MTHP), high temperature heat pump (HTHP), or ultra-high temperature heat pump (UHTHP). The HTHP/UHTHP boundary is 160 °C; the MTHP/HTHP and LTHP/MTHP boundaries are 100 °C and 60 °C, respectively. The refrigerant is one of the most important parts of a vapor compression heat pump system. Presently, the main criteria for choosing a refrigerant are ozone depletion potential (ODP) and GWP, with ODP required to be zero and GWP as low as possible. Pure low-GWP refrigerants, such as natural refrigerants (R718 and R744), hydrocarbons (R290, R600), hydrofluorocarbons (R152a and R161), hydrofluoroolefins (R1234yf, R1234ze(E)), and the hydrochlorofluoroolefin R1233zd(E), were selected as candidates for vapor compression heat pump systems based on these selection principles. The performance, characteristics, and potential applications of these low-GWP refrigerants in heat pump systems are investigated in this paper.
As vapor compression heat pumps with pure low-GWP refrigerants become more common, more low-grade heat can be recovered, reducing energy consumption. The results show that R718 is appropriate for UHTHP applications; R1233zd(E) for HTHP; R600, R152a, R161 and R1234ze(E) for MTHP; and R744, R290 and R1234yf for LTHP. The selection of an appropriate refrigerant should take into account both environmental and thermodynamic points of view; depending on the situation, a trade-off between the two should always be considered. Under current European Union regulations, the environmental approach is now far stronger than before. This will promote sustainable energy consumption and social development, in addition to helping reduce greenhouse gas emissions and manage global warming.
Keywords: vapor compression, global warming potential, heat pumps, greenhouse
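The temperature-band classification above can be expressed as a small helper; whether each boundary is inclusive or exclusive is not stated in the abstract, so the convention below is an assumption:

```python
def heat_pump_class(sink_temp_c: float) -> str:
    """Classify a vapor-compression heat pump by required heat temperature,
    using the abstract's boundaries of 60, 100 and 160 degrees Celsius.
    Boundary inclusivity (upper bound exclusive) is an assumed convention."""
    if sink_temp_c < 60:
        return "LTHP"
    if sink_temp_c < 100:
        return "MTHP"
    if sink_temp_c < 160:
        return "HTHP"
    return "UHTHP"

print([heat_pump_class(t) for t in (45, 80, 120, 170)])
# ['LTHP', 'MTHP', 'HTHP', 'UHTHP']
```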
Procedia PDF Downloads 34
4334 Approaches of Flight Level Selection for an Unmanned Aerial Vehicle Round-Trip in Order to Reach Best Range Using Changes in Flight Level Winds
Authors: Dmitry Fedoseyev
Abstract:
The ultimate success of unmanned aerial vehicles (UAVs) depends largely on the effective control of their flight, especially in variable wind conditions. This paper investigates different approaches to selecting the optimal flight level to maximize the range of UAVs. We propose to consider methods based on mathematical models of atmospheric conditions, as well as the use of sensor data and machine learning algorithms to automatically optimize the flight level in real-time. The proposed approaches promise to improve the efficiency and range of UAVs in various wind conditions, which may have significant implications for the application of these systems in various fields, including geodesy, environmental surveillance, and search and rescue operations.
Keywords: drone, UAV, flight trajectory, wind-searching, efficiency
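For an out-and-back route, a constant along-track wind always lengthens the round trip, since the effective round-trip speed is the harmonic mean of the ground speeds (V + w) and (V - w), which is below the airspeed V. A sketch of flight-level selection on this basis (the wind values and level names are hypothetical, and this ignores the vertical-wind-profile modelling the paper proposes):

```python
def round_trip_speed(airspeed: float, wind_along_track: float) -> float:
    """Effective ground speed for an out-and-back leg with a constant
    along-track wind: harmonic mean of (V+w) and (V-w) = (V^2 - w^2) / V."""
    v, w = airspeed, abs(wind_along_track)
    if w >= v:
        return 0.0                      # cannot make headway upwind
    return (v * v - w * w) / v

def best_flight_level(airspeed: float, winds_by_level: dict) -> int:
    """Pick the level maximizing round-trip range for a fixed endurance."""
    return max(winds_by_level,
               key=lambda lvl: round_trip_speed(airspeed, winds_by_level[lvl]))

# Hypothetical along-track wind (m/s) measured or forecast per flight level (m)
winds = {100: 12.0, 200: 6.0, 300: 15.0}
print(best_flight_level(20.0, winds))  # 200: the calmest level wins a round trip
```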
Procedia PDF Downloads 63
4333 Virucidal, Bactericidal and Fungicidal Efficiency of Dry Microfine Steam on Inanimate Surfaces
Authors: C. Recchia, M. Bourel, B. Recchia
Abstract:
Microorganisms (viruses, bacteria, fungi) are responsible for most communicable diseases, threatening human health. For domestic use, chemical agents are often criticized for their potential dangerousness, and natural alternatives are needed. The “dry microfine steam” (DMS) technology was tested on a selection of common pathogens (SARS-CoV-2, enterovirus EV-71, human coronavirus 229E, E. coli, S. aureus, C. albicans) on different inanimate surfaces, for 5 to 10 seconds. The remaining pathogens were quantified, and reduction rates ranged from 99.8% (S. aureus on plastic) to over 99.999%. DMS showed high efficacy in eliminating common microorganisms and could be considered a natural alternative to chemical agents for improving domestic hygiene.
Keywords: steam, SARS-CoV-2, bactericidal, virucidal, fungicidal, sterilization
Procedia PDF Downloads 163
4332 Low-Power Digital Filters Design Using a Bypassing Technique
Authors: Thiago Brito Bezerra
Abstract:
This paper presents a novel approach to reducing the power consumption of digital filters, based on dynamic bypassing of partial products in their multipliers. Bypassing elements incorporated into the multiplier hardware eliminate the redundant signal transitions that appear within the carry-save adders when a partial product is zero. This technique reduces power consumption by around 20%. The circuit was implemented in the AMS 0.18 µm technology. The bypassing technique applied to the circuits is outlined.
Keywords: digital filter, low-power, bypassing technique, low-pass filter
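The bypassing idea can be illustrated in software with a shift-and-add multiplier that skips zero partial products; each skipped stage corresponds to adder activity, and hence switching power, saved in hardware. A behavioural sketch, not the paper's carry-save circuit:

```python
def shift_add_multiply(a: int, b: int, width: int = 8):
    """Shift-and-add multiplier with zero-partial-product bypassing.

    When a multiplier bit is 0 the partial product is zero, so the adder
    stage is bypassed; in hardware this suppresses the redundant signal
    transitions that would otherwise waste power.
    """
    product, bypassed = 0, 0
    for i in range(width):
        if (b >> i) & 1:                 # non-zero partial product: add it
            product += a << i
        else:                            # zero partial product: bypass stage
            bypassed += 1
    return product, bypassed

product, skipped = shift_add_multiply(13, 0b00100010)
print(product, skipped)  # 442 (= 13 * 34), with 6 of 8 adder stages bypassed
```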
Procedia PDF Downloads 382
4331 Time Temperature Dependence of Long Fiber Reinforced Polypropylene Manufactured by Direct Long Fiber Thermoplastic Process
Authors: K. A. Weidenmann, M. Grigo, B. Brylka, P. Elsner, T. Böhlke
Abstract:
In order to reduce fuel consumption, the weight of automobiles has to be reduced. Fiber reinforced polymers offer the potential to reach this aim because of their high stiffness-to-weight ratio. Additionally, the use of fiber reinforced polymers in automotive applications has to allow for economic large-scale production. In this regard, long fiber reinforced thermoplastics made by direct processing offer both mechanical performance and processability in injection moulding and compression moulding. The work presented in this contribution deals with long glass fiber reinforced polypropylene directly processed in compression moulding (D-LFT). For use in automotive applications, both the temperature and time dependency of the material's properties have to be investigated to fulfill performance requirements during a crash and the demands of service temperatures ranging from -40 °C to 80 °C. To consider both influences, quasistatic tensile tests were carried out at different temperatures, complemented by high speed tensile tests at different strain rates. As expected, the increase in strain rate results in an increase of the elastic modulus, which correlates with the increase in stiffness observed with decreasing service temperature. The results are in good accordance with results determined by dynamic mechanical analysis within the range of 0.1 to 100 Hz. The experimental results from the different testing methods were grouped and interpreted using different time-temperature shift approaches: the Williams-Landel-Ferry approach and the kinetics-based Arrhenius approach. As the theoretical shift factor follows an arctan function, an empirical approach was also taken into consideration.
It could be shown that this empirical approach best describes the time-temperature superposition for glass fiber reinforced polypropylene manufactured by the D-LFT process.
Keywords: composite, dynamic mechanical analysis, long fibre reinforced thermoplastics, mechanical properties, time temperature superposition
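The two shift-factor models named above have standard closed forms: WLF, log10 a_T = -C1(T - Tref) / (C2 + T - Tref), and Arrhenius, log10 a_T = (Ea / 2.303R)(1/T - 1/Tref). A sketch using the "universal" WLF constants and an assumed activation energy (the study fits its own values to the D-LFT data):

```python
def wlf_log_shift(t: float, t_ref: float, c1: float = 17.44, c2: float = 51.6) -> float:
    """log10 of the WLF time-temperature shift factor a_T.

    c1, c2 default to the 'universal' WLF constants; temperatures in Celsius
    (only the difference t - t_ref enters the formula)."""
    return -c1 * (t - t_ref) / (c2 + (t - t_ref))

def arrhenius_log_shift(t_k: float, t_ref_k: float,
                        ea: float = 200e3, r: float = 8.314) -> float:
    """log10 a_T from an Arrhenius activation energy ea (J/mol); temperatures
    in Kelvin. The 200 kJ/mol default is an illustrative assumption."""
    return (ea / (2.303 * r)) * (1.0 / t_k - 1.0 / t_ref_k)

# Shifting data measured at 80 C onto a 23 C reference curve
print(round(wlf_log_shift(80.0, 23.0), 2))           # -9.15
print(round(arrhenius_log_shift(353.15, 296.15), 2))  # -5.69
```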
Procedia PDF Downloads 199
4330 In Search of Good Fortune: Individualization, Youth and the Spanish Labour Market within a Context of Crisis
Authors: Matthew Lee Turnbough
Abstract:
In 2007, Spain began to experience the effects of a deep economic crisis, which generated a situation characterised by instability and uncertainty. This has been an obstacle, especially acute for the youth of this country seeking to enter the workforce. As a result of the impact of COVID-19, the youth in Spain are now suffering the effects of a new crisis that has deepened an already fragile labour environment. In this paper, we analyse the discourses that have emerged from a precarious labour market, specifically from Job Today and CornerJob, two companies dedicated to operating job portals and job listings in Spain and two of those with the highest user rates in the country. These start-up businesses have developed mobile applications geared towards young adults in search of employment in the service sector. Utilizing a discourse analysis approach, we explore the impact of individualization and how the process of psychologization may contribute to an increasing reliance on individual solutions to social problems. As such, we seek to highlight the expectations and demands that are placed upon young workers and the type of subjectivity that this dynamic could foster, all within an unstable framework seemingly marked by chance, a context which is key for the emergence of individualization. Furthermore, we consider the extent to which young adults incorporate these discourses and the strategies they employ, basing our analysis on the VULSOCU (New Forms of Socio-Existential Vulnerability, Supports, and Care in Spain) research project, specifically the results of nineteen in-depth interviews and three discussion groups with young adults in this country.
Consequently, we seek to elucidate the argumentative threads rooted in the process of individualization and underline the implications of this dynamic for the young worker and his/her labour insertion while also identifying manifestations of the goddess of fortune as a representation of chance in this context. Finally, we approach this panorama of social change in Spain from the perspective of the individuals or young adults who find themselves immersed in this transition from one crisis to another.
Keywords: chance, crisis, discourses, individualization, work, youth
Procedia PDF Downloads 117
4329 Selection of Endophytic Fungi Isolated from Date Palm, Halotolerant and Producers of Secondary Metabolites
Authors: Fadila Mohamed Mahmoud., Derkaoui I., Krimi Z.
Abstract:
The date palm is a plant with very good adaptation to difficult environmental conditions, in particular drought and saline stress, even at high temperatures. This adaptation is related to the biology of the plant and to the presence of an endophytic microflora living inside its tissues. Fifteen endophytic fungi isolated from date palm were tested in vitro in the presence of various NaCl concentrations to select halotolerant isolates. The same endophytes were tested for their colonizing capacity through the production of secondary metabolites, particularly enzymes (pectinases, proteases, and phosphorylases), antibiotics and growth hormones. Significant differences were observed between the isolates in the tests carried out.
Keywords: date palm, halotolerance, endophyte, secondary metabolites
Procedia PDF Downloads 519
4328 Approach to Honey Volatiles' Profiling by Gas Chromatography and Mass Spectrometry
Authors: Igor Jerkovic
Abstract:
Biodiversity of flora provides many different nectar sources for the bees. Unifloral honeys possess distinctive flavours, mainly derived from their nectar sources (characteristic volatile organic components (VOCs)). Specific or nonspecific VOCs (chemical markers) could be used for unifloral honey characterisation as addition to the melissopalynologycal analysis. The main honey volatiles belong, in general, to three principal categories: terpenes, norisoprenoids, and benzene derivatives. Some of these substances have been described as characteristics of the floral source, and other compounds, like several alcohols, branched aldehydes, and furan derivatives, may be related to the microbial purity of honey processing and storage conditions. Selection of the extraction method for the honey volatiles profiling should consider that heating of the honey produce different artefacts and therefore conventional methods of VOCs isolation (such as hydrodistillation) cannot be applied for the honey. Two-way approach for the isolation of the honey VOCs was applied using headspace solid-phase microextraction (HS-SPME) and ultrasonic solvent extraction (USE). The extracts were analysed by gas chromatography and mass spectrometry (GC-MS). HS-SPME (with the fibers of different polarity such as polydimethylsiloxane/ divinylbenzene (PDMS/DVB) or divinylbenzene/carboxene/ polydimethylsiloxane (DVB/CAR/PDMS)) enabled isolation of high volatile headspace VOCs of the honey samples. Among them, some characteristic or specific compounds can be found such as 3,4-dihydro-3-oxoedulan (in Centaurea cyanus L. honey) or 1H-indole, methyl anthranilate, and cis-jasmone (in Citrus unshiu Marc. honey). USE with different solvents (mainly dichloromethane or the mixture pentane : diethyl ether 1 : 2 v/v) enabled isolation of less volatile and semi-volatile VOCs of the honey samples. Characteristic compounds from C. 
unshiu honey extracts were caffeine, 1H-indole, 1,3-dihydro-2H-indol-2-one, methyl anthranilate, and phenylacetonitrile. Sometimes, the selection of a solvent sequence was useful for more complete profiling, such as sequence I: pentane → diethyl ether, or sequence II: pentane → pentane/diethyl ether (1:2, v/v) → dichloromethane. The extracts with diethyl ether contained hydroquinone and 4-hydroxybenzoic acid as the major compounds, while (E)-4-(r-1’,t-2’,c-4’-trihydroxy-2’,6’,6’-trimethylcyclo-hexyl)but-3-en-2-one predominated in dichloromethane extracts of Allium ursinum L. honey. With this two-way approach, it was possible to obtain a more detailed insight into the honey volatile and semi-volatile compounds and to minimize the risks of compound discrimination due to partial extraction, which is of significant importance for complete honey profiling and the identification of chemical biomarkers that can complement the pollen analysis.
Keywords: honey chemical biomarkers, honey volatile compounds profiling, headspace solid-phase microextraction (HS-SPME), ultrasonic solvent extraction (USE)
Procedia PDF Downloads 202
4327 Genomics of Adaptation in the Sea
Authors: Agostinho Antunes
Abstract:
The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms also provide the framework to better understand the genetic makeup of such species and related ones, allowing researchers to explore the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection might have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.
Keywords: marine genomics, evolutionary bioinformatics, human genome sequencing, genomic analyses
Procedia PDF Downloads 611
4326 Implantology Failure: Epidemiological Survey among Tunisian Dentists
Authors: Faten Khanfir, Mohamed Tlili, Ali Medeb Hamrouni, Raki Selmi, M. S. Khalfi, Faten Ben Amor
Abstract:
Introduction: Dental implant failure is a major concern for the clinician and the patient. Objectives: The aim of our study is to investigate the way in which 100 Tunisian dentists carried out implant treatment for their patients, from the early phase of planning and selection of patients to the placement of the implant, in order to identify implant failure factors. Results: Significant correlations were found between failure rates > 5% and the corresponding factors: the number of implants placed (p = 0.001 < 0.05), smoking (p = 0.046 < 0.05), unbalanced diabetes (p = 0.03 < 0.05), the aseptic protocol (p = 0.004 < 0.05), and the drilling speed (p = 0.002 < 0.05). Conclusion: It seems that the number of implants placed, smoking, diabetes, the aseptic protocol, and the drilling speed may contribute to dental implant failure.
Keywords: failure, implants, survey, risk, osseointegration
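The reported associations (each risk factor tested against failure rates, with p-values compared to the 0.05 threshold) are the kind of result a 2x2 chi-square test of independence produces. A minimal sketch follows; the contingency counts below are invented for illustration and are not the survey's data.

```python
# Illustrative 2x2 chi-square test of independence, the kind of
# association test behind results such as "smoking vs. failure rate
# (p < 0.05)". The counts are invented, not taken from the survey.

#               high failure   low failure
table = [[18, 12],   # smokers
         [20, 50]]   # non-smokers

row_totals = [sum(r) for r in table]
col_totals = [sum(c) for c in zip(*table)]
grand_total = sum(row_totals)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (table[i][j] - expected) ** 2 / expected

# With 1 degree of freedom, chi2 > 3.841 corresponds to p < 0.05
print(round(chi2, 2), chi2 > 3.841)
```

The 3.841 critical value is the standard chi-square threshold for one degree of freedom at the 5% significance level.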
Procedia PDF Downloads 183
4325 Application of Artificial Intelligence in EOR
Authors: Masoumeh Mofarrah, Amir NahanMoghadam
Abstract:
Higher oil prices and increasing oil demand are the main reasons for the great attention paid to Enhanced Oil Recovery (EOR). Comprehensive research has been carried out to develop, appraise, and improve EOR methods and their application. Recently, Artificial Intelligence (AI) has gained popularity in the petroleum industry, as it can help petroleum engineers solve some fundamental petroleum engineering problems such as reservoir simulation, EOR project risk analysis, well log interpretation, and well test model selection. This study presents a historical overview of the most popular AI tools in the petroleum industry, including neural networks, genetic algorithms, fuzzy logic, and expert systems, and discusses two case studies representing the application of two of these AI methods for selecting an appropriate EOR method based on reservoir characterization in a feasible and effective way.
Keywords: artificial intelligence, EOR, neural networks, expert systems
Procedia PDF Downloads 488
4324 Reading and Writing Memories in Artificial and Human Reasoning
Authors: Ian O'Loughlin
Abstract:
Memory networks aim to integrate some of the recent successes in machine learning with a dynamic memory base that can be updated and deployed in artificial reasoning tasks. These models involve training networks to identify, update, and operate over stored elements in a large memory array in order, for example, to perform question-and-answer tasks that parse real-world and simulated discourses. This family of approaches still faces numerous challenges: the performance of these network models in simulated domains remains considerably better than in open, real-world domains; wide-context cues remain elusive in parsing words and sentences; and even moderately complex sentence structures remain problematic. This innovation, employing an array of stored and updatable ‘memory’ elements over which the system operates as it parses text input and develops responses to questions, is a compelling one for at least two reasons. First, it addresses one of the difficulties that standard machine learning techniques face by providing a way to store a large bank of facts, offering a way forward for the kinds of long-term reasoning that, for example, recurrent neural networks trained on a corpus have difficulty performing. Second, the addition of a stored long-term memory component in artificial reasoning seems psychologically plausible; human reasoning appears replete with invocations of long-term memory, and the stored but dynamic elements in the arrays of memory networks are deeply reminiscent of the way that human memory is readily and often characterized. However, this apparent psychological plausibility is belied by a recent turn in the study of human memory in cognitive science. In recent years, the very notion that there is a stored element which enables remembering, however dynamic or reconstructive it may be, has come under deep suspicion.
In the wake of constructive memory studies, amnesia and impairment studies, and studies of implicit memory—as well as following considerations from the cognitive neuroscience of memory and conceptual analyses from the philosophy of mind and cognitive science—researchers are now rejecting storage and retrieval, even in principle, and instead seeking and developing models of human memory wherein plasticity and dynamics are the rule rather than the exception. In these models, storage is entirely avoided: memory is modeled using a recurrent neural network designed to fit a preconceived energy function that attains zero values only for desired memory patterns, so that these patterns are the sole stable equilibrium points in the attractor network. Thus, although the array of long-term memory elements in memory networks seems psychologically appropriate for reasoning systems, it may actually incur difficulties theoretically analogous to those that older, storage-based models of human memory have demonstrated. The kind of emergent stability found in the attractor network models more closely fits our best understanding of human long-term memory than do the memory network arrays, despite appearances to the contrary.
Keywords: artificial reasoning, human memory, machine learning, neural networks
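The attractor-network account sketched above can be illustrated with a minimal Hopfield-style model, in which memories are the stable equilibrium points of the network dynamics rather than stored records that are retrieved. The sketch below uses the classic Hebbian construction rather than the specific zero-energy fitting procedure the text alludes to; the pattern count, network size, and noise level are illustrative assumptions.

```python
import numpy as np

# Minimal Hopfield-style attractor network. "Memories" are not stored
# records to be retrieved: they are the stable equilibrium points of the
# network's dynamics, and recall is relaxation into the nearest attractor.

rng = np.random.default_rng(0)
N = 64
patterns = rng.choice([-1, 1], size=(3, N))  # three +/-1 "memory" patterns

# Hebbian weight matrix: sum of outer products, zero self-connections
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def recall(cue, sweeps=5):
    """Asynchronous sign updates; the state settles into an attractor."""
    s = cue.copy()
    for _ in range(sweeps):
        for i in range(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 10 of 64 bits of the first pattern, then let the network relax
cue = patterns[0].copy()
flipped = rng.choice(N, size=10, replace=False)
cue[flipped] *= -1

recalled = recall(cue)
print("overlap before:", int(cue @ patterns[0]), "after:", int(recalled @ patterns[0]))
```

With the load well below capacity (3 patterns in 64 units), the corrupted cue falls back into the basin of the original pattern: stability here is emergent from the dynamics, not read from a storage slot.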
Procedia PDF Downloads 271
4323 Improved Accuracy of Ratio Multiple Valuation
Authors: Julianto Agung Saputro, Jogiyanto Hartono
Abstract:
Multiple valuation is widely used by investors and practitioners, but its accuracy is questionable. Multiple valuation inaccuracies are due to the unreliability of information used in valuation, inaccurate comparison group selection, and the use of individual multiple values. This study investigated the accuracy of valuation to examine factors that can increase the accuracy of ratio multiple valuation, namely discretionary accruals, the comparison group, and the composite of multiple valuations. The results indicate that the multiple value adjustment method with discretionary accruals provides better accuracy, and that the industry comparator group method, combined with company size and growth, also provides better accuracy. The composite of individual multiple valuations gives the best accuracy. If all of these factors are combined, ratio multiple valuation achieves its most accurate results.
Keywords: multiple, valuation, composite, accuracy
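The composite idea described above — combining the values implied by several individual multiples taken from a comparison group — can be sketched as follows. The peer multiples, the target company's figures, and the equal-weight averaging are all invented for illustration.

```python
# Composite multiple valuation sketch: value a target company from the
# peer-group medians of several multiples, then average the implied
# values. All figures are invented for illustration.

peers = [
    {"pe": 12.0, "pb": 1.8},
    {"pe": 15.0, "pb": 2.2},
    {"pe": 10.0, "pb": 1.6},
]

def median(values):
    xs = sorted(values)
    n = len(xs)
    return xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2

target = {"earnings": 50.0, "book_value": 400.0}  # e.g. in millions

# Implied value from each individual multiple
pe_value = median([p["pe"] for p in peers]) * target["earnings"]
pb_value = median([p["pb"] for p in peers]) * target["book_value"]

# Composite: simple average of the individual implied values
composite_value = (pe_value + pb_value) / 2
print(pe_value, pb_value, composite_value)
```

In practice the study's adjustments (discretionary accruals, industry/size/growth peer screening) would refine both the multiples and the peer set before this final averaging step.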
Procedia PDF Downloads 282
4322 Sharing and Developing Cultural Heritage Values through a Co-Creative Approach
Authors: Anna Marie Fisker, Daniele Sepe, Mette Bøgh Jensen, Daniela Rimei
Abstract:
In the space of just a few years, the European policy framework on cultural heritage has been completely overhauled, moving towards a people-centred and holistic approach, and eliminating the divisions between the tangible, intangible and digital dimensions. The European Union regards cultural heritage as a potential shared resource, highlighting that all stakeholders share responsibility for its transmission to future generations. This new framework will potentially change the way in which cultural institutions manage, protect and provide access to their heritage. It will change the way in which citizens and communities engage with their cultural heritage and naturally influence the way that professionals deal with it. Participating in the creation of cultural heritage awareness can lead to an increased perception of its value, be it economic, social, environmental or cultural. It can also strengthen our personal identity, sense of belonging and community citizenship. Open Atelier, a Creative Europe project, is built on this foundation, with the goal of developing, through co-creation, the use of, understanding of and engagement with our cultural heritage. The project aims to transform selected parts of the heritage into an “experience lab” – an interactive, co-creative, dynamic and participatory space, where cultural heritage is the point of departure for new interactions and experiences between the audience and the museum and its professionals. Through a workshop-based approach built on interdisciplinary collaboration and co-creative processes, Open Atelier has started to design, develop, test, and evaluate a set of Experiences. The first collaborative initiative was set in the discourse and knowledge of a highly creative period in Denmark, when a specific group of Scandinavian artists, the Skagen Painters, gathered in the village of Skagen, the northernmost part of Denmark, from the late 1870s until the turn of the century.
The Art Museums of Skagen have a large collection of photos from the period that has never been the subject of more thorough research. The photos display a variety of subjects: community scenes, family photos, reproductions of art works, costume parties, family gatherings, etc. They carry with them the energies of those people’s work and life and evoke instances of communication with the past. This paper is about how we in Open Atelier connect these special stories, this legacy, with another place, in another time, in another context and with another audience. The first Open Atelier Experience – the performance “Around the Lighthouse” – was an initiative that resulted from the collaboration between AMAT, an Italian creative organisation, and the Art Museums of Skagen. A group of Italian artists developed a co-creative investigation and reinterpretation of a selection of these historical photos: a poetic journey through videos and voices, aimed at exploring new perspectives on the museum and its heritage. It was an experiment in creating new ways to actively engage audiences in how cultural heritage is explored, interpreted, mediated, presented, and used to examine contemporary issues. This article is about this experiment and its findings, and about how different views and methodologies can be adopted to discuss cultural heritage in museums around Europe and their connection to the community.
Keywords: cultural heritage, community, innovation, museums
Procedia PDF Downloads 79
4321 Research and Development of Net-Centric Information Sharing Platform
Authors: Wang Xiaoqing, Fang Youyuan, Zheng Yanxing, Gu Tianyang, Zong Jianjian, Tong Jinrong
Abstract:
Compared with a traditional distributed environment, the net-centric environment poses more demanding challenges for information sharing, with its characteristics of ultra-large scale, strong distribution, dynamics, autonomy, heterogeneity, and redundancy. This paper realizes an information sharing model and a series of core services, which together provide an open, flexible and scalable information sharing platform.
Keywords: net-centric environment, information sharing, metadata registry and catalog, cross-domain data access control
Procedia PDF Downloads 570
4320 Mechanical Response Investigation of Wafer Probing Test with Vertical Cobra Probe via the Experiment and Transient Dynamic Simulation
Authors: De-Shin Liu, Po-Chun Wen, Zhen-Wei Zhuang, Hsueh-Chih Liu, Pei-Chen Huang
Abstract:
Wafer probing tests play an important role in semiconductor manufacturing procedures, in accordance with the yield and reliability requirements of the wafer after the back-end-of-line process. Accordingly, stable physical and electrical contact between the probe and the tested wafer during wafer probing is regarded as an essential issue in identifying the known good die. The probe card can be integrated with multiple probe needles, which are classified as vertical, cantilever, and micro-electro-mechanical systems (MEMS) types. Among all potential probe types, the vertical probe has several advantages compared with the others, including maintainability, high probe density, and feasibility for high-speed wafer testing. In the present study, the mechanical response of the wafer probing test with a vertical cobra probe on a 720 μm thick silicon (Si) substrate with a 1.4 μm thick aluminum (Al) pad is investigated by experiment and a transient dynamic simulation approach. Because the deformation mechanism of the vertical cobra probe is determined by both bending and buckling, the stable correlation between contact forces and overdrive (OD) length must be carefully verified. Moreover, a suitable OD length with the corresponding contact force contributes to piercing the native oxide layer of the Al pad while preventing probing-test-induced damage to the interconnect system. Accordingly, the scratch depth of the Al pad under various OD lengths is estimated by atomic force microscopy (AFM) and simulation. In the wafer probing test configuration, the contact between the probe needle and the tested object introduces large deformation and twisting of the mesh gridding, causing numerical divergence. For this reason, the arbitrary Lagrangian-Eulerian method is utilized in the present simulation work to overcome this issue.
The analytic results revealed a slight difference when the OD is 40 μm, while the simulated scratch depths are almost identical to the measured scratch depths of the Al pad at higher OD lengths up to 70 μm. This phenomenon can be attributed to the unstable contact of the probe at low OD lengths, where the scratch depth is below 30% of the Al pad thickness; the contact becomes stable once the scratch depth exceeds 30% of the pad thickness. The splash of the Al pad is observed by AFM, and the splashed Al debris accumulates on a specific side; this phenomenon is successfully reproduced in the transient dynamic simulation. Thus, the preferred testing OD lengths are found to be 45 μm to 70 μm, and the corresponding scratch depths on the Al pad are 31.4% and 47.1% of the Al pad thickness, respectively. The investigation approach demonstrated in this study contributes to analyzing the mechanical response of the wafer probing test configuration under large strain conditions and to assessing the geometric designs and material selections of probe needles, in order to meet the requirements of high-resolution, high-speed wafer-level probing tests for thinned wafer applications.
Keywords: wafer probing test, vertical probe, probe mark, mechanical response, FEA simulation
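The reported scratch depths, given as percentages of pad thickness, can be cross-checked as absolute values against the 1.4 μm Al pad stated in the abstract. A small sketch using only figures given above:

```python
# Cross-check of the reported scratch depths: the abstract gives a
# 1.4 um Al pad, with scratch depths of 31.4% and 47.1% of pad
# thickness at OD lengths of 45 um and 70 um respectively, and a
# 30%-of-thickness threshold for stable contact.

PAD_THICKNESS_UM = 1.4

def scratch_depth_um(percent_of_pad):
    """Convert a scratch depth given as % of pad thickness to microns."""
    return PAD_THICKNESS_UM * percent_of_pad / 100

for od_um, pct in [(45, 31.4), (70, 47.1)]:
    depth = scratch_depth_um(pct)
    stable = pct > 30  # stability criterion stated in the abstract
    print(f"OD {od_um} um -> scratch {depth:.2f} um "
          f"({pct}% of pad), stable contact: {stable}")
```

Both preferred OD lengths thus leave roughly 0.44 to 0.66 μm scratches, above the 30% stability threshold but well short of piercing through the full pad.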
Procedia PDF Downloads 57
4319 Identification of Stakeholders and Practices of Inclusive Education
Authors: Luis Javier Serrano-Tamayo
Abstract:
This paper focuses on the recent interest in the concept of inclusion from multiple areas of the social sciences, but particularly from academic studies on what scholars mean when they refer to inclusive education. The paper is based on a three-year systematic review of nearly two hundred peer-reviewed documents from the last two decades. The results illustrate some of the use, misuse, and abuse of inclusive education, and shed some light on the identification of the different stakeholders involved in the dynamic concept of inclusive education and their suggested practices.
Keywords: inclusion, inclusive education, inclusive practices, education stakeholders
Procedia PDF Downloads 237
4318 Gaming Tools for Efficient Low Cost Urban Planning Using Nature Based Solutions
Authors: Ioannis Kavouras, Eftychios Protopapadakis, Emmanuel Sardis, Anastasios Doulamis
Abstract:
In this paper, we investigate the appropriateness and usability of three different free and open-source rendering tools for urban planning visualizations. The process involves the selection of a map area, the 3D rendering transformation, the placement of nature-based solutions (NBS), and the evaluation and assessment of the suggested interventions. The manuscript uses a case study located at Dilaveri Coast, Piraeus region, Greece. Research outcomes indicate that a Blender-OSM implementation is an appropriate tool capable of supporting high-fidelity urban planning, with quick and accurate visibility of the related results for end users involved in NBS transformations.
Keywords: urban planning, nature based solution, 3D gaming tools, game engine, free and open source
Procedia PDF Downloads 112
4317 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes
Authors: Igor A. Krichtafovitch
Abstract:
Evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates may summarize the proposed hypothesis: biological evolution, as a natural origin and development of life, is a reality. Evolution is a coordinated and controlled process. One of evolution’s main development vectors is the growing computational complexity of living organisms and the biosphere’s intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere.
Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere’s global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems of current evolutionary theory: speciation, as a result of GM purposeful design; the evolutionary development vector, as a need for growing global intelligence; punctuated equilibrium, happening when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, happening when more intelligent species should replace outdated creatures.
Keywords: supercomputer, biological evolution, Darwinism, speciation
Procedia PDF Downloads 164
4316 Voice Quality in Italian-Speaking Children with Autism
Authors: Patrizia Bonaventura, Magda Di Renzo
Abstract:
This project aims to measure and assess voice quality in children with autism. Few previous studies have analyzed the voice quality of individuals with autism; abnormal voice characteristics have been found, such as high pitch, a wide pitch range, and a sing-song quality. Existing studies did not focus specifically on Italian-speaking children’s voices and analyzed only a few acoustic parameters. The present study aimed to gather more data and to perform acoustic analysis of the voices of children with autism in order to identify patterns of abnormal voice features that might shed some light on the causes of the dysphonia and possibly be used to create a pediatric assessment tool for early identification of autism. The participants were five native Italian-speaking boys with autism between the ages of 4 and 10 years (mean 6.8 ± SD 1.4). The children had a diagnosis of autism, were verbal, and had no other comorbid conditions (such as Down syndrome or ADHD). The voices of the children were recorded during the production of sustained vowels [ah] and [ih] and of sentences from the Italian version of the CAPE-V voice assessment test. The following voice parameters, representative of normal quality, were analyzed by acoustic spectrography in Praat: speaking fundamental frequency, F0 range, average intensity, and dynamic range. The results showed that the pitch parameters (speaking fundamental frequency and F0 range), as well as the intensity parameters (average intensity and dynamic range), were significantly different from the corresponding normal reference thresholds. Variability among children was also found, confirming a tendency, revealed in previous studies, toward individual variation in these aspects of voice quality. The results indicate a general pattern of abnormal voice quality characterized by high pitch and large variations in pitch and intensity.
These acoustic voice characteristics found in Italian-speaking autistic children match those found in children speaking other languages, indicating that autism symptoms affecting voice quality might be independent of the children's native language.
Keywords: autism, voice disorders, speech science, acoustic analysis of voice
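Measures of the kind extracted here in Praat — speaking fundamental frequency and intensity — can be approximated with a simple autocorrelation pitch estimator. The sketch below runs on a synthetic vowel-like tone; the signal, sampling rate, and pitch search range are illustrative assumptions, not the study's data or Praat's actual algorithm.

```python
import numpy as np

# Rough acoustic sketch: estimate fundamental frequency (F0) by picking
# the autocorrelation peak in a plausible pitch range, and mean intensity
# in dB. Synthetic two-harmonic "vowel" with an illustrative 300 Hz F0.

SR = 16000
t = np.arange(0, 0.2, 1 / SR)
f0_true = 300.0  # a high, child-like pitch (illustrative)
signal = (0.6 * np.sin(2 * np.pi * f0_true * t)
          + 0.2 * np.sin(2 * np.pi * 2 * f0_true * t))

def estimate_f0(x, sr, fmin=75, fmax=600):
    """Autocorrelation pitch estimate within [fmin, fmax] Hz."""
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def intensity_db(x, ref=1.0):
    """Mean intensity as dB relative to a reference amplitude."""
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(rms / ref)

f0_est = estimate_f0(signal, SR)
print(round(f0_est, 1), "Hz,", round(intensity_db(signal), 1), "dB")
```

On real recordings, F0 range and dynamic range would then be the spread of these per-frame estimates over the utterance, which is where the large pitch and intensity variations reported above would show up.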
Procedia PDF Downloads 71
4315 Internal Migration and Poverty Dynamic Analysis Using a Bayesian Approach: The Tunisian Case
Authors: Amal Jmaii, Damien Rousseliere, Besma Belhadj
Abstract:
We explore the relationship between internal migration and poverty in Tunisia. We present a methodology combining the potential outcomes approach with multiple imputation to highlight the effect of internal migration on poverty states. We find that the probability of being poor decreases when people leave the poorest regions (the western areas) for the richer regions (Greater Tunis and the eastern regions).
Keywords: internal migration, potential outcomes approach, poverty dynamics, Tunisia
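The combination of the potential outcomes approach with multiple imputation can be sketched in miniature: each person's unobserved counterfactual poverty state is imputed repeatedly, and the imputed effects are pooled. All numbers below are invented toy data, not the Tunisian survey, and the imputation model is deliberately simplistic.

```python
import random
import statistics

# Toy potential-outcomes sketch with multiple imputation of the missing
# counterfactual: each person is observed either as a migrant or a
# stayer, and the unobserved potential outcome is imputed from the
# opposite group's poverty rate. Data are invented for illustration.

random.seed(1)
# (migrated, poor) observations: migrants are poor less often here
data = [(1, 0)] * 70 + [(1, 1)] * 30 + [(0, 0)] * 50 + [(0, 1)] * 50

p_poor_migrant = statistics.mean(y for m, y in data if m == 1)  # 0.30
p_poor_stayer = statistics.mean(y for m, y in data if m == 0)   # 0.50

effects = []
for _ in range(50):  # 50 imputations of the missing potential outcome
    diffs = []
    for m, y in data:
        # Impute the counterfactual state from the other group's rate
        rate = p_poor_stayer if m else p_poor_migrant
        counterfactual = int(random.random() < rate)
        # Effect of migration on poverty for this person
        diffs.append((y - counterfactual) if m else (counterfactual - y))
    effects.append(statistics.mean(diffs))

ate = statistics.mean(effects)  # pooled estimate across imputations
print(round(ate, 2))  # negative: migration lowers the probability of being poor
```

The real analysis conditions the imputations on covariates within a Bayesian model rather than on a single group rate, but the pooling logic is the same.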
Procedia PDF Downloads 312
4314 Induction Heating Process Design Using Comsol® Multiphysics Software Version 4.2a
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
Induction heating computer simulation is a powerful tool for process design and optimization, induction coil design, and equipment selection, as well as for education and business presentations. The authors share their extensive experience in the practical use of computer simulation for different induction heating and heat treating processes. This paper deals with the mathematical modeling and numerical simulation of induction heating furnaces with axisymmetric geometries. For the numerical solution, we propose finite element methods (FEM) combined with boundary element techniques for the electromagnetic model, using COMSOL® Multiphysics software. Some numerical results for an industrial furnace operating at high frequency are shown.
Keywords: numerical methods, induction furnaces, induction heating, finite element method, Comsol multiphysics software
Procedia PDF Downloads 449