Search results for: combined wavelet-artificial neural network

1299 Binderless Naturally-extracted Metal-free Electrocatalyst for Efficient NOₓ Reduction

Authors: Hafiz Muhammad Adeel Sharif, Tian Li, Changping Li

Abstract:

Recently, the emission of nitrogen and sulphur oxides (NOₓ, SO₂) has become a global issue, posing serious threats to health and the environment. Catalytic reduction of NOₓ and SOₓ into environmentally friendly gases is considered one of the best approaches. However, catalyst regeneration, the high bond-dissociation energy of NOₓ (150.7 kcal/mol), the escape of the intermediate gas N₂O (a greenhouse gas) with the treated flue gas, and the limited activity of the catalyst remain great challenges. Here, a cheap, binderless, naturally-extracted bass-wood thin carbon electrode (TCE) is presented, which shows excellent catalytic activity towards NOₓ reduction. The bass wood was carbonized at 900 ℃ and then thermally activated in the presence of CO₂ gas at 750 ℃. The thermal activation increased the epoxy groups on the surface of the TCE and enhanced the surface area as well as the degree of graphitization. The TCE features a unique, strongly interconnected 3D network of hierarchical micro/meso/macro pores that provides a large electrode/electrolyte interface. Owing to these characteristics, the TCE exhibited excellent catalytic efficiency towards NOₓ (~83.3%) under ambient conditions, an enhanced catalytic response under pH variation and sulphite exposure, and excellent stability up to 168 hours. Moreover, a temperature-dependent activity trend was found: the highest catalytic activity was achieved at 80 ℃, beyond which electrolyte evaporation caused a decrease in performance. The designed electrocatalyst is highly cost-effective, green, and sustainable and shows great potential for effective NOₓ reduction.

Keywords: electrocatalyst, NOx-reduction, bass-wood electrode, integrated wet-scrubbing, sustainable

Procedia PDF Downloads 77
1298 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware keeps rising to handle more complex processing and operations. However, the development of operating systems, which give computers their soul, stalled for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), many personal computer owners could only give up. The Disk Operating System (DOS) is too simple and leaves little room for innovation, so it is not a good choice. MacOS is a dedicated operating system for Apple computers and cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems and is composed of many modules, giving it a relatively powerful core architecture. The Linux system supports all Internet protocols, so it has very good networking functionality. Linux supports multiple users, and users have no influence on each other's files. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system: users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is constantly upgraded and improved, and many different versions have been issued, suitable for both community and commercial use. The Linux system has good security because it relies on its file permission system. However, given constantly emerging vulnerabilities and hazards, the security of the operating system still needs close attention. This article focuses on the analysis and discussion of Linux security issues.

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 108
1297 Stem Cell Fate Decision Depending on TiO2 Nanotubular Geometry

Authors: Jung Park, Anca Mazare, Klaus Von Der Mark, Patrik Schmuki

Abstract:

In the clinical application of TiO2 implants for tooth and hip replacement, the migration, adhesion and differentiation of neighboring mesenchymal stem cells onto implant surfaces are critical steps for successful bone regeneration. Over the last decade, growing attention has been paid to nanoscale electrochemical surface modification of the TiO2 layer for improving bone-TiO2 surface integration. We generated, on titanium surfaces, self-assembled layers of vertically oriented TiO2 nanotubes with defined diameters between 15 and 100 nm, and here we show that mesenchymal stem cells finely sense the TiO2 nanotubular geometry and quickly decide their fate, either differentiating into osteoblasts or undergoing programmed cell death (apoptosis), on the TiO2 nanotube layers. These cell fate decisions depend critically on nanotube diameter (15-100 nm), sensed through integrin clustering. We further demonstrate that nanoscale topography sensing is not limited to mesenchymal stem cells but rather appears to be a generalized nanoscale microenvironment-cell interaction mechanism in several cell types composing the bone tissue network, including osteoblasts, osteoclasts, endothelial cells and hematopoietic stem cells. Additionally, we discuss the synergistic effect of simultaneous stimulation by nanotube-bound growth factors and nanoscale topographic cues on enhanced bone regeneration.

Keywords: TiO2 nanotube, stem cell fate decision, nano-scale microenvironment, bone regeneration

Procedia PDF Downloads 432
1296 Subway Stray Current Effects on Gas Pipelines in the City of Tehran

Authors: Mohammad Derakhshani, Saeed Reza Allahkarama, Michael Isakhani-Zakaria, Masoud Samadian, Hojjat Sharifi Rasaey

Abstract:

In order to investigate the effects of stray current from DC traction systems (subway) on cathodically protected gas pipelines, the subway and gas network maps of the city of Tehran were superimposed and a comprehensive map was prepared. 213 intersections and about 100150 meters of pipeline sections running parallel to the railway right of way were identified and specified for field measurements. The potential data were logged for one hour at each test point, and 24-hour potential monitoring was carried out at selected test points as well. Results showed that dynamic stray current from the subway appears as fluctuations in the pipeline's static potential, visible in the diagrams during night periods. These fluctuations can cause the pipeline potential to exit the safe zone and lead to corrosion or overprotection. In this study, a maximum potential shift of 100 mV in the pipe-to-soil potential was considered the criterion for an effective presence of dynamic stray current. Results showed that potential fluctuations ranging from 100 mV to 3 V exist at the measured points on the pipelines, which exceed the proposed criterion and need to be investigated. Corrosion rates influenced by stray currents were calculated using coupons. Results showed that a coupon connected to the pipeline at one of the locations in region 1 of the city of Tehran had a corrosion rate of 4.2 mpy (with cathodic protection and under the influence of stray currents), which is about 1.5 times higher than the free corrosion rate of 2.6 mpy.

Keywords: stray current, DC traction, subway, buried pipelines, cathodic protection

Procedia PDF Downloads 822
1295 The Exploitation of the MOSES Project Outcomes on Supply Chain Optimisation

Authors: Reza Karimpour

Abstract:

Ports play a decisive role in the EU's external and internal trade, as about 74% of imports and exports and 37% of exchanges go through ports. Although ports, especially Deep Sea Shipping (DSS) ports, are integral nodes within multimodal logistic flows, Short Sea Shipping (SSS) and inland waterways are not so well integrated. The Automated Vessels and Supply Chain Optimisation for Sustainable Short Sea Shipping (MOSES) project aims to enhance the short sea shipping component of the European supply chain by addressing the vulnerabilities and strains related to the operation of large containerships. The MOSES concept can be shortly described as follows: a large containership (mother vessel) approaches a DSS port (or a large container terminal). Upon her arrival, a combined intelligent mega-system, consisting of the MOSES autonomous tugboat swarm for manoeuvring and the MOSES adapted AutoMoor system, handles her docking and mooring. Then, container handling processes are ready to start moving containers to their destination via hinterland connections (trucks and/or rail) or to be shipped to destinations near small ports (on the mainland or islands). In the first case, containers are stored in a dedicated port area (storage area), waiting to be moved via trucks and/or rail. In the second case, containers are stacked by existing port equipment near dedicated berths of the DSS port. They are then loaded onto the MOSES Innovative Feeder Vessel, which is equipped with the MOSES Robotic Container-Handling System that provides (semi-)autonomous loading and unloading of the feeder. The Robotic Container-Handling System is remotely monitored through a Shore Control Centre. When the MOSES innovative feeder vessel approaches the small port, where docking is achieved without tugboats, she automatically unloads the containers using the Robotic Container-Handling System onto the quay or directly onto trucks. As a result, ports with minimal or no available infrastructure may be effectively integrated into the container supply chain. The MOSES innovative feeder vessel then continues her voyage to the next small port or returns to the DSS port. The MOSES exploitation activity mainly aims to exploit research outcomes beyond the project, facilitate utilisation of the pilot results by others, and continue the pilot service after the project ends. At the project's mid-lifetime, the exploitation plan introduces the reader to the MOSES project and its key exploitable results and provides a plan for delivering the MOSES innovations to the market as part of the overall exploitation plan.

Keywords: automated vessels, exploitation, shortsea shipping, supply chain

Procedia PDF Downloads 110
1294 Monitoring Synthesis of Biodiesel through Online Density Measurements

Authors: Arnaldo G. de Oliveira, Jr, Matthieu Tubino

Abstract:

The transesterification of triglycerides with alcohols that occurs during biodiesel synthesis causes continuous changes in several physical properties of the reaction mixture, such as refractive index, viscosity and density. Among them, density can be a useful parameter to monitor the reaction, in order to predict the composition of the reacting mixture and to verify the conversion of the oil into biodiesel. In this context, a system was constructed to continuously determine changes in the density of the reacting mixture containing soybean oil, methanol and sodium methoxide (30% w/w solution in methanol), stirred at 620 rpm at room temperature (about 27 °C). A polyethylene pipe network connected to a peristaltic pump was used to collect the mixture and pump it through a coil fixed on the plate of an analytical balance. The collected mass values were used to trace a curve correlating the mass of the system with the reaction time. The density variation profile versus time clearly shows three different steps: 1) the dispersion of methanol in the oil causes a decrease in the system mass, due to the lower density of the alcohol, followed by stabilization; 2) the addition of the catalyst (sodium methoxide) causes a larger decrease in mass than the first step because of the conversion of the oil into biodiesel; 3) the final stabilization, denoting the end of the reaction. This density variation profile provides information that was used to predict the composition of the mixture over time and the reaction rate. Precise knowledge of the duration of the synthesis means saving time and resources in a production-scale system. This kind of monitoring offers several interesting features, such as continuous measurement without the need to collect aliquots.
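A minimal sketch of how such online balance readings might be converted to density and screened for the three stages described above; the coil volume, log-file layout, and plateau threshold are illustrative assumptions rather than details reported by the authors.

```python
import numpy as np

# Assumed acquisition parameters and file layout (not from the study)
COIL_VOLUME_ML = 25.0                                 # internal volume of the coil on the balance, mL
data = np.loadtxt("balance_log.csv", delimiter=",")   # columns: time [s], mass [g]
time_s, mass_g = data[:, 0], data[:, 1]

# Apparent density of the circulating mixture (g/mL)
density = mass_g / COIL_VOLUME_ML

# Rate of change of density; near-zero stretches mark the plateaus that end
# step 1 (methanol dispersion) and step 3 (completion of the reaction)
d_density = np.gradient(density, time_s)
plateau = np.abs(d_density) < 1e-5                    # example threshold, tune to the noise level

print(f"Fraction of samples on a plateau: {plateau.mean():.2f}")
```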

Keywords: biodiesel, density measurements, online continuous monitoring, synthesis

Procedia PDF Downloads 575
1293 New Two-Dimensional Hardy Type Inequalities on Time Scales via the Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood and Polya were among the first significant contributions in this field; their work presented fundamental ideas, results and techniques and has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated in terms of operators; in 1989, weighted Hardy inequalities were obtained for integration operators. Weighted estimates were then obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. They were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy–Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have since been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some results have appeared involving Copson and Hardy inequalities on time scales, yielding new special versions of them. A time scale is an arbitrary nonempty closed subset of the real numbers. Dynamic inequalities on time scales have received a lot of attention in the literature and have become a major field in pure and applied mathematics, with many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on Hardy and Copson inequalities, using the Steklov operator on time scales in double integrals to obtain special cases of time-scale inequalities of Hardy and Copson in higher dimensions. The advantage of this study is that it uses the one-dimensional classical Hardy inequality to obtain higher-dimensional time-scale versions that can be applied in the solution of the Cauchy problem for the wave equation. In addition, the obtained inequalities have various applications involving discontinuous domains, such as bug populations, phytoremediation of metals, wound healing, and maximization problems. The proofs can be carried out by introducing restrictions on the operator in several cases. Concepts of time-scale calculus will be used, which allow many problems from the theories of differential and difference equations to be unified and extended. In addition, the proofs use the chain rule, some properties of multiple integrals on time scales, Fubini-type theorems, and Hölder's inequality.
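For reference, the one-dimensional classical Hardy inequality on which this study builds can be stated as follows (for p > 1 and a non-negative measurable function f); the time-scale and Steklov-operator versions derived in the paper are not reproduced here:

$$\int_0^\infty \left(\frac{1}{x}\int_0^x f(t)\,dt\right)^{p} dx \;\le\; \left(\frac{p}{p-1}\right)^{p}\int_0^\infty f(x)^{p}\,dx,$$

where the constant (p/(p-1))^p is the best possible.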

Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator

Procedia PDF Downloads 95
1292 Evaluation of Cardiac Rhythm Patterns after Open Surgical Maze-Procedures from Three Years' Experiences in a Single Heart Center

Authors: J. Yan, B. Pieper, B. Bucsky, H. H. Sievers, B. Nasseri, S. A. Mohamed

Abstract:

In order to optimize the efficacy of medications, regular follow-up with long-term continuous monitoring of heart rhythm patterns has been facilitated since the clinical introduction of cardiac implantable electronic monitoring devices (CIMD). Extensive analysis of the circadian properties of rhythm can disclose the distribution of arrhythmic events, which may support appropriate medication according to a rate-control or rhythm-control strategy and minimize consequent afflictions. 348 patients (69 ± 0.5 years, male 61.8%) with predisposed atrial fibrillation (AF), undergoing primary ablation therapy combined with coronary or valve operations and secondary implantation of CIMDs, were involved and divided into 3 groups: PAAF (paroxysmal AF) (n=99, male 68.7%), PEAF (persistent AF) (n=94, male 62.8%), and LSPEAF (long-standing persistent AF) (n=155, male 56.8%). All patients participated in a three-year ambulatory follow-up (3, 6, 9, 12, 18, 24, 30 and 36 months). The burden of atrial fibrillation recurrence was assessed using the cardiac monitoring devices, whereby attack frequencies and their circadian patterns were systematically analyzed. Anticoagulants and regular anti-arrhythmic medications were evaluated, and the latter were classified into rate-control and rhythm-control regimens. Patients in the PEAF group showed the least AF burden after the surgical ablation procedures compared with the other two subtypes (p < 0.05). The recurrent AF episodes were predominantly shorter than one hour, mostly within 10 minutes (p < 0.05), regardless of AF subtype. Concerning the circadian distribution of the recurrence attacks, frequent AF attacks were mostly recorded in the morning in the PAAF group (p < 0.05), while patients with predisposed PEAF reported fewer attack-induced discomforts in the latter half of the night, and those with LSPEAF only when they were not physically active after the primary surgical ablation. Different AF subtypes presented distinct therapeutic efficacies after appropriate surgical ablation procedures and distinct recurrence properties in terms of circadian distribution. Optimization of the medical regimen and drug dosages to maintain therapeutic success requires more attention to detailed assessment during long-term follow-up. The rate-control strategy plays a much more important role than rhythm control in the ongoing follow-up examinations.

Keywords: atrial fibrillation, CIMD, MAZE, rate-control, rhythm-control, rhythm patterns

Procedia PDF Downloads 156
1291 An Exploratory Study on the Effect of a Fermented Dairy Product on Self-Reported Gut Complaints in US Recreational Athletes

Authors: Kersch-Counet C., Fransen K. H. S., Broyd M., Nyakayiru J. D. O. A., Schoemaker M. H., Mallee L. F., Bovee-Oudenhoven I. M. J.

Abstract:

Background: Around one third of people, including athletes, suffer from feelings of gut discomfort. Fermentation of dairy is a process that has been associated with products that can improve gut health. However, insight into the (potential) health benefits of most fermented foods is limited to chemical analyses and in-vitro models. Objective: The aim of this open-label, single-arm explorative trial was to investigate, in a real-life setting, the effect of consumption of a fermented whey product for 3 weeks on self-perceived physical and mental wellbeing and digestive issues in 150 US recreational athletes (20-50 years of age) with self-reported gut complaints at enrolment. Methods: Participants living on the West Coast of the US received for 3 weeks a daily 15 g powder of Biotis™ Fermentis to be mixed in water using a supplied shaker. Weekly questionnaires were conducted by MMR Research to study the effect on physical/mental health issues and self-perceived gut complaints. Non-parametric tests (e.g., the Friedman test) were used to assess statistical differences over time, while the Kruskal-Wallis and Wilcoxon signed-rank tests were used for sub-group analyses. Results: Bloating, stress and anxiety were the top 3 issues of the US recreational athletes. Satisfaction with physical wellbeing increased significantly throughout the 3 weeks of fermented whey product consumption (p<0.0005). Combined digestive issues decreased significantly after 2 and 3 weeks of product consumption, with bloating showing a significant reduction (p<0.05). There was a trend towards reduced self-reported stress levels after 3 weeks, and participants reported feeling significantly more active, energetic, and vital (p<0.05). Subgroup analysis showed that gender and habitual protein supplement consumption were associated with specific health issues and modulated the response to the fermented dairy product. Conclusion: Daily consumption of the fermented Biotis™ Fermentis product is associated with a reduction in self-perceived gastrointestinal symptoms and improved overall wellbeing and mood state in US recreational athletes. This large nutrition and health consumer study brings valuable insights into the self-reported gut complaints of recreational athletes in the US and their response to a fermented dairy product. A controlled clinical trial in a targeted population is recommended to scientifically substantiate the product effect observed in this explorative study.
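A minimal sketch of how the non-parametric comparisons named above might be run on weekly questionnaire scores; the array shapes, score scale, and sub-group variable are illustrative assumptions, not the study's actual data.

```python
import numpy as np
from scipy import stats

# Hypothetical example: symptom scores for 150 participants at weeks 0-3
rng = np.random.default_rng(0)
scores = rng.integers(1, 8, size=(150, 4))    # rows: participants, columns: weekly scores

# Friedman test: do scores change over the repeated weekly measurements?
_, p_friedman = stats.friedmanchisquare(*[scores[:, week] for week in range(4)])

# Wilcoxon signed-rank test: paired comparison of baseline vs. week 3
_, p_wilcoxon = stats.wilcoxon(scores[:, 0], scores[:, 3])

# Kruskal-Wallis test: compare week-3 scores between two sub-groups (e.g. gender)
group = rng.integers(0, 2, size=150)          # hypothetical sub-group labels
_, p_kruskal = stats.kruskal(scores[group == 0, 3], scores[group == 1, 3])

print(p_friedman, p_wilcoxon, p_kruskal)
```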

Keywords: real-life study, digestive health, fermented whey, sports

Procedia PDF Downloads 269
1290 The Spatial Analysis of Wetland Ecosystem Services Valuation on Flood Protection in Tone River Basin

Authors: Tingting Song

Abstract:

Wetlands are significant ecosystems that provide a variety of ecosystem services for humans, such as providing water and food resources, purifying water quality, regulating climate, protecting biodiversity, and providing cultural, recreational, and educational resources. Wetlands also provide benefits such as the reduction of flood, storm and soil-erosion damage. The flood protection ecosystem services of wetlands are often ignored. Due to climate change, floods caused by extreme weather have occurred frequently in recent years. Floods have a great impact on people's production and livelihoods, with ever-increasing economic losses. The study area is the Tone River basin in the Kanto area, Japan. The Tone is the second-longest river in Japan and has the largest basin area in the country, and it still suffers heavy economic losses from floods. The Tone River basin is one of the sources that provide water for Tokyo and has an important impact on economic activities in Japan. The purpose of this study was to investigate land-use changes of wetlands in the Tone River basin and whether there are spatial differences in the value of wetland functions in mitigating the economic losses caused by floods. This study analyzed the land-use change of wetlands in the Tone River basin based on Landsat data from 1980 to 2020. Combined with flood economic loss, wetland area, GDP, population density, and other socio-economic data, a geographically weighted regression model was constructed to analyze the spatial differences in wetland ecosystem service value. At present, flood protection mainly relies on hard engineering such as dams and reservoirs, but excessive dependence on hard engineering places huge financial pressure on the government and has a large impact on the ecological environment. Natural wetlands, however, can also play a role in flood management while providing diverse ecosystem services, and the construction and maintenance cost of natural wetlands is lower than that of hard engineering. Although it is not easy to say which is more effective in terms of flood management, when the marginal value of a wetland is greater than the economic loss caused by flood per unit area, relying on the flood storage capacity of wetlands to reduce the impact of floods may be considered. This can promote the sustainable development of wetland ecosystems. On the other hand, spatial analysis of wetland values can provide a more effective strategy for flood management in the Tone River basin.
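For context, a geographically weighted regression estimates a separate set of coefficients at every location (u_i, v_i); a generic form is shown below, where y_i could be the flood-related economic loss at location i and the x_ik covariates such as wetland area, GDP, and population density (the exact specification used in the study is not given in the abstract):

$$y_i = \beta_0(u_i, v_i) + \sum_{k=1}^{m}\beta_k(u_i, v_i)\,x_{ik} + \varepsilon_i,$$

with the local coefficients estimated by weighted least squares using a spatial kernel centred on each location.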

Keywords: wetland, geographically weighted regression, ecosystem services, environmental valuation

Procedia PDF Downloads 101
1289 Effects of Ubiquitous 360° Learning Environment on Clinical Histotechnology Competence

Authors: Mari A. Virtanen, Elina Haavisto, Eeva Liikanen, Maria Kääriäinen

Abstract:

Rapid technological development and digitalization have also affected higher education. During the last twenty years, multiple electronic and mobile learning (e-learning, m-learning) platforms have been developed and have become prevalent in many universities and in all fields of education. Ubiquitous learning (u-learning) is not as widely known or used. Ubiquitous learning environments (ULE) are the new era of computer-assisted learning. They are based on ubiquitous technology and computing that fuse the learner seamlessly into the learning process by using sensing technologies such as tags, badges or barcodes and smart devices like smartphones and tablets. A ULE combines real-life learning situations with virtual aspects and can be used flexibly anytime and anyplace. The aim of this study was to assess the effects of a ubiquitous 360° learning environment on higher education students' clinical histotechnology competence. A quasi-experimental study design was used. 57 students in a biomedical laboratory science degree program were assigned voluntarily to an experimental group (n=29) and a control group (n=28). The experimental group studied via the ubiquitous 360° learning environment and the control group via a traditional web-based learning environment (WLE) in an 8-week educational intervention. The ubiquitous 360° learning environment (ULE) combined an authentic learning environment (histotechnology laboratory), a digital environment (virtual laboratory), a virtual microscope, multimedia learning content, interactive communication tools, an electronic library and quick-response barcodes placed in the authentic laboratory. The web-based learning environment contained equal content and components, with the exception of the use of a mobile device, the interactive communication tools and the quick-response barcodes. Competence in clinical histotechnology was assessed using a knowledge test and a self-report developed for this study. Data were collected electronically before and after the clinical histotechnology course and analysed using descriptive statistics. Within-group differences were identified using the Wilcoxon test and between-group differences using the Mann-Whitney U-test. Statistically significant within-group differences were identified in both groups (p<0.001): competence scores in the post-test were higher than in the pre-test in both groups. Differences between the groups were very small and not statistically significant. In this study, a learning environment based on 360° technology was developed and successfully implemented in a higher education context, and students' competence increased when the ubiquitous learning environment was used. In the future, ULEs can be used as learning management systems for any learning situation in the health sciences. More studies are needed to show differences between ULE and WLE.

Keywords: competence, higher education, histotechnology, ubiquitous learning, u-learning, 360°

Procedia PDF Downloads 286
1288 NANCY: Combining Adversarial Networks with Cycle-Consistency for Robust Multi-Modal Image Registration

Authors: Mirjana Ruppel, Rajendra Persad, Amit Bahl, Sanja Dogramadzi, Chris Melhuish, Lyndon Smith

Abstract:

Multimodal image registration is a profoundly complex task, which is why deep learning has been used widely to address it in recent years. However, two main challenges remain: firstly, the lack of ground truth data calls for an unsupervised learning approach, which leads to the second challenge of defining a feasible loss function that can compare two images of different modalities to judge their level of alignment. To avoid this issue altogether, we implement a generative adversarial network consisting of two registration networks G_AB, G_BA and two discrimination networks D_A, D_B connected by spatial transformation layers. G_AB learns to generate a deformation field which registers an image of modality B to an image of modality A. To do that, it uses the feedback of the discriminator D_B, which learns to judge the quality of alignment of the registered image B. G_BA and D_A learn a mapping from modality A to modality B. Additionally, a cycle-consistency loss is implemented. For this, both registration networks are employed twice, resulting in images Â, B̂, which were registered to B̃, Ã, which in turn were registered to the initial image pair A, B. Thus, the resulting and initial images of the same modality can be easily compared. A dataset of liver CT and MRI was used to evaluate the quality of our approach and to compare it against learning-based and non-learning-based registration algorithms. Our approach achieves Dice scores of up to 0.80 ± 0.01 and is therefore comparable to, and slightly more successful than, algorithms like SimpleElastix and VoxelMorph.
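A minimal PyTorch-style sketch of how the adversarial and cycle-consistency terms described above could be wired together; the module call signature (moving image first, fixed image second), the least-squares adversarial form, and the loss weighting are illustrative assumptions rather than the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def generator_loss(G_AB, G_BA, D_A, D_B, img_A, img_B, lambda_cyc=10.0):
    """Generator-side loss: adversarial terms plus cycle consistency (sketch)."""
    # Each registration network warps its first argument into alignment with its second
    reg_B = G_AB(img_B, img_A)   # modality-B image registered to img_A
    reg_A = G_BA(img_A, img_B)   # modality-A image registered to img_B

    # Least-squares adversarial terms: the discriminators judge alignment quality
    adv = torch.mean((D_B(reg_B) - 1.0) ** 2) + torch.mean((D_A(reg_A) - 1.0) ** 2)

    # Cycle consistency: apply both registration networks twice so the result can be
    # compared directly with the initial image of the same modality
    cyc_B = G_BA(reg_B, img_B)
    cyc_A = G_AB(reg_A, img_A)
    cyc = F.l1_loss(cyc_A, img_A) + F.l1_loss(cyc_B, img_B)

    return adv + lambda_cyc * cyc
```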

Keywords: cycle consistency, deformable multimodal image registration, deep learning, GAN

Procedia PDF Downloads 131
1287 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data

Authors: M. Kharrat, G. Moreau, Z. Aboura

Abstract:

The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the applied loading. The AE technique is a well-established tool for discriminating between damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE features are then extracted from the recorded signals, followed by data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation set-up was used in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, digital image correlation, acoustic emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for discriminating between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. The latter is useful for constructing a supervised classifier that can be used for automatic recognition of AE signals. Several materials with different ingredients were tested under various loadings in order to feed and enrich the learning database. The methodology presented in this work helped refine the damage threshold for the new-generation materials, and the damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of loading, the identified classes are reproducible and only slightly disturbed. The supervised classifier constructed from the learning database was able to predict the labels of the classified signals.
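A minimal scikit-learn sketch of the two-stage workflow described above, i.e. unsupervised clustering of AE features followed by a supervised classifier trained on the labelled clusters; the feature file, number of clusters, and classifier choice are illustrative assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical AE feature matrix: rows = hits, columns = features such as
# amplitude, duration, rise time, counts, energy, peak frequency
X = np.load("ae_features.npy")
X_std = StandardScaler().fit_transform(X)

# Stage 1: unsupervised clustering without a priori knowledge of the classes
n_classes = 4                                   # e.g. matrix cracking, debonding, noise, ...
labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(X_std)

# Stage 2: the clustered (and expert-labelled) data form a learning database
# used to train a supervised classifier for automatic recognition of new signals
X_tr, X_te, y_tr, y_te = train_test_split(X_std, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```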

Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition

Procedia PDF Downloads 155
1286 Contribution of the Corn Milling Industry to a Global and Circular Economy

Authors: A. B. Moldes, X. Vecino, L. Rodriguez-López, J. M. Dominguez, J. M. Cruz

Abstract:

The concept of the circular economy focuses on the importance of providing goods and services sustainably. Thus, in the future it will be necessary to respond to environmental contamination, and to turn to the use of renewable substrates, by moving to a more restorative economic system that drives the utilization and revalorization of residues to obtain valuable products. During its evolution, our industrial economy has hardly moved beyond one major characteristic, established in the early days of industrialization, based on a linear model of resource consumption. However, this industrial consumption system cannot be maintained for a long time. On the other hand, there are many industries, like the corn milling industry, that, although they do not consume high amounts of non-renewable substrates, produce valuable streams which, treated properly, could provide additional economic and environmental benefits through the extraction of commercially interesting renewable products that can replace some of the substances obtained by chemical synthesis from non-renewable substrates. From this point of view, the use of streams from the corn milling industry to obtain surface-active compounds will decrease the utilization of non-renewable sources for obtaining this kind of compound, contributing to a circular and global economy. However, the success of the circular economy depends on the interest of the industrial sectors in the revalorization of their streams by developing relevant and new business models. Thus, it is necessary to invest in research on new alternatives that reduce the consumption of non-renewable substrates. In this study, the utilization of a corn milling industry stream to obtain an extract with surfactant capacity is proposed. Once the biosurfactant is extracted, the corn milling stream can be commercialized as a nutritional medium in biotechnological processes or as an animal feed supplement. Usually this stream is combined with other ingredients to obtain a product named corn gluten feed, or it may be sold separately as a liquid protein source for beef and dairy feeding or as a nutritional pellet binder. Following the production scheme proposed in this work, the corn milling industry will obtain a biosurfactant extract that could be incorporated into its production process, replacing the chemical detergents used at some point in its production chain, or that could be commercialized as a new product of corn manufacture. The biosurfactants obtained from the corn milling industry could replace chemical surfactants in many formulations and uses, and this exemplifies the potential that many industrial streams offer for obtaining valuable products when they are managed properly.

Keywords: biosurfactants, circular economy, corn, sustainability

Procedia PDF Downloads 261
1285 Passing-On Cultural Heritage Knowledge: Entrepreneurial Approaches for a Higher Educational Sustainability

Authors: Ioana Simina Frincu

Abstract:

As institutional initiatives often fail to provide good practices when it comes to heritage management, or to adapt to the changing environment in which they function and to the audiences they address, private actions represent viable strategies for sustainable knowledge acquisition. Passing information on to future generations is one of the key aspects of preserving cultural heritage and is feasible even in the absence of original artifacts. Combined with the (re)discovery of the natural landscape, open-air exploratory approaches (archeoparks), as opposed to an enclosed, monodisciplinary, rigid framework (traditional museums), are more likely to 'speak the language' of a larger number of people, belonging to a variety of categories, ages, and professions. Interactive sites are efficient ways of stimulating heritage awareness and increasing the number of visitors to non-interactive/static cultural institutions owning original pieces of history, delivering specialized information, and making continuous efforts to preserve historical evidence (relics, manuscripts, etc.). It is high time entrepreneurs took over the role of promoting cultural heritage, be it in a more commercial yet more attractive form (business). Inclusive, participatory activities conceived by experts from different domains/fields (history, anthropology, tourism, sociology, business management, integrative sustainability, etc.) have a better chance of ensuring long-term cultural benefits for both adults and children, especially when and where the educational discourse fails. These unique self-experience leisure activities, which offer everyone the opportunity to recreate history themselves and to relive their ancestors' ways of living, surviving and exploring, should be regarded not as pseudo-scientific approaches but as important pre-steps to museum experiences. In order to support this theory, focus is laid on two different examples: one dynamic, in the outdoors (the Boario Terme archeopark in Italy), and one experimental, held indoors (the reconstruction of the Neolithic sanctuary of Parta, Romania, as part of a transdisciplinary academic course), and their impact on young generations. The conclusion of this study is that the declining engagement of youth (students) in discovering and understanding history, archaeology, and heritage can be revived by entrepreneurial projects.

Keywords: archeopark, educational tourism, open air museum, Parta sanctuary, prehistory

Procedia PDF Downloads 139
1284 Evaluation of Dry Matter Yield of Panicum maximum Intercropped with Pigeonpea and Sesbania Sesban

Authors: Misheck Musokwa, Paramu Mafongoya, Simon Lorentz

Abstract:

Seasonal shortage of fodder during the dry season is a major constraint for smallholder livestock farmers in South Africa. To mitigate the shortage of fodder, legume trees can be intercropped with pastures, which can diversify the sources of feed and increase the amount of protein available to grazing animals. The objective was to evaluate the dry matter yield of Panicum maximum and land productivity under different fodder production systems during the 2016/17-2017/18 seasons at Empangeni (28.6391° S and 31.9400° E). A randomized complete block design, replicated three times, was used; the treatments were sole Panicum maximum, Panicum maximum + Sesbania sesban, Panicum maximum + pigeonpea, sole Sesbania sesban, and sole pigeonpea. Three-month-old S. sesban seedlings were transplanted, whilst pigeonpea was direct-seeded at a spacing of 1 m x 1 m. P. maximum seeds were drilled at a rate of 7.5 kg/ha with an inter-row spacing of 0.25 m; in between the tree rows, P. maximum seeds were drilled in the same way. Harvests of dry matter yield were separated by six-month intervals. A 0.25 m² quadrat, randomly placed at 3 points in each plot, was used as the sampling area when harvesting P. maximum. There were significant differences (P < 0.05) across the 3 harvests and in total dry matter. P. maximum had a higher dry matter yield compared to both intercrops at the first harvest and in total; at the second and third harvests there was no significant difference from the pigeonpea intercrop. The results were in this order for all 3 harvests: P. maximum (541.2c, 1209.3b and 1557b kg ha⁻¹) ≥ P. maximum + pigeonpea (157.2b, 926.7b and 1129b kg ha⁻¹) > P. maximum + S. sesban (36.3a, 282a and 555a kg ha⁻¹). Total accumulated dry matter yield: P. maximum (3307c kg ha⁻¹) > P. maximum + pigeonpea (2212 kg ha⁻¹) ≥ P. maximum + S. sesban (874 kg ha⁻¹). There was a significant difference (P < 0.05) in seed yield for the trees: pigeonpea (1240.3 kg ha⁻¹) ≥ pigeonpea + P. maximum (862.7 kg ha⁻¹) > S. sesban (391.9 kg ha⁻¹) ≥ S. sesban + P. maximum. The Land Equivalent Ratio (LER) was in the following order: P. maximum + pigeonpea (1.37) > P. maximum + S. sesban (0.84) > pigeonpea (0.59) ≥ S. sesban (0.57) > P. maximum (0.26). The results indicate that it is beneficial to intercrop P. maximum with pigeonpea because of the higher land productivity, as illustrated by the calculation below. Planting grass with pigeonpea was more beneficial than S. sesban with grass or sole cropping in terms of easing the shortage of arable land. P. maximum + pigeonpea saves a substantial amount of land (37%), which can subsequently be used for other crop production. Pigeonpea is recommended as an intercrop with P. maximum due to its higher LER and the combined production of livestock feed, human food, and firewood. Panicum grass is low in crude protein though high in carbohydrates, so there is a need to intercrop it with legume trees. A farmer who buys concentrates can reduce costs by combining P. maximum with pigeonpea; this will provide a balanced diet at low cost.
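As a check on the reported land productivity, the Land Equivalent Ratio of the P. maximum + pigeonpea system can be recomputed from the yields quoted above (partial LER = intercrop yield divided by the corresponding sole-crop yield); this is a worked example, not additional data.

```python
# Partial LERs from the reported dry matter and seed yields (kg ha^-1)
ler_grass = 2212 / 3307          # P. maximum yield intercropped with pigeonpea vs. sole grass
ler_pigeonpea = 862.7 / 1240.3   # pigeonpea seed yield intercropped vs. sole pigeonpea

ler_total = ler_grass + ler_pigeonpea
print(f"LER = {ler_total:.2f}")  # about 1.36, consistent with the reported 1.37
```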

Keywords: fodder, livestock, productivity, smallholder farmers

Procedia PDF Downloads 149
1283 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques

Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet

Abstract:

5G and 6G technologies offer enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in the 5G/6G architecture. In this paper, we propose the integration of free-space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, free-space optical communications have been gaining popularity as an effective alternative technology to the limited availability of radio frequency (RF) spectrum. FSO is attractive due to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to realize the full potential of 5G/6G technology, offering speeds of 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. We therefore propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple fiber Bragg grating (FBG) sensors using only the FBG spectra obtained from a real experiment.
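A minimal sketch of the kind of spectrum-to-peak-wavelength regression described above, using a generic multilayer perceptron; the file names, array shapes, and network size are illustrative assumptions, not the authors' actual model or data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical data: each row of the spectra array is a measured composite FBG
# spectrum (intensity samples over wavelength); targets are the FBG peak wavelengths
spectra = np.load("fbg_spectra.npy")           # shape (n_samples, n_wavelength_bins)
peaks = np.load("fbg_peak_wavelengths.npy")    # shape (n_samples, n_fbg_sensors)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, peaks, test_size=0.2, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(256, 128), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"RMSE on held-out spectra: {rmse:.4f} nm")
```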

Keywords: optical sensor, artificial intelligence, Internet of Things, free-space optics

Procedia PDF Downloads 63
1282 Evotrader: Bitcoin Trading Using Evolutionary Algorithms on Technical Analysis and Social Sentiment Data

Authors: Martin Pellon Consunji

Abstract:

Due to the rise in popularity of Bitcoin and other crypto assets as a store of wealth and a speculative investment, there is an ever-growing demand for automated trading tools, such as bots, in order to gain an advantage over the market. Traditionally, trading in the stock market was done by professionals with years of training who understood patterns and exploited market opportunities in order to gain a profit. However, nowadays a larger portion of market participants are at minimum aided by market-data-processing bots, which can generally generate more stable signals than the average human trader. The rise in trading bot usage can be attributed to the inherent advantages that bots have over humans in terms of processing large amounts of data, their lack of emotions such as fear or greed, and their ability to predict market prices using past data and artificial intelligence; hence, a growing number of approaches have been brought forward to tackle this task. However, the general limitation of these approaches comes down to the fact that limited historical data does not always determine the future and that many market participants are still emotion-driven human traders. Moreover, developing markets such as the cryptocurrency space have even less historical data to interpret than most other well-established markets. Because of this, some human traders have gone back to tried-and-tested traditional technical analysis tools for exploiting market patterns and simplifying the broader spectrum of data involved in making market predictions. This paper proposes a method which uses neuroevolution techniques on both sentiment data and the more traditionally human-consumed technical analysis data in order to gain a more accurate forecast of future market behavior and account for the way both automated bots and human traders affect the market prices of Bitcoin and other cryptocurrencies. This study's approach uses evolutionary algorithms to automatically develop increasingly improved populations of bots which, by using the latest inflows of market analysis and sentiment data, evolve to efficiently predict future market price movements. The effectiveness of the approach is validated by testing the system in a simulated historical trading scenario and a live Bitcoin trading scenario, and by testing its robustness in other cryptocurrency and stock market scenarios. Experimental results over a 30-day period show that this method outperformed the buy-and-hold strategy by over 260% in terms of net profit, even when taking standard trading fees into consideration.
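A minimal sketch of the evolutionary loop such an approach implies (fitness evaluation, selection, crossover, mutation over a population of parameterized traders); the feature set, fitness definition, placeholder data, and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inputs: per-day feature matrix (technical indicators plus sentiment
# scores) and daily returns used to score candidate traders (placeholder data)
features = rng.normal(size=(365, 6))
returns = rng.normal(scale=0.02, size=365)

def fitness(weights):
    """Cumulative return of a long/flat strategy driven by a linear signal."""
    signal = features @ weights > 0          # go long when the weighted signal is positive
    return np.prod(1 + returns[signal]) - 1

pop = rng.normal(size=(50, features.shape[1]))          # initial population of traders
for generation in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]              # keep the 10 fittest individuals
    # Crossover: average random pairs of parents; mutation: add Gaussian noise
    idx = rng.integers(0, len(parents), size=(len(pop), 2))
    pop = parents[idx].mean(axis=1) + rng.normal(scale=0.1, size=pop.shape)

best = pop[np.argmax([fitness(w) for w in pop])]
print("Best evolved cumulative return:", fitness(best))
```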

Keywords: neuro-evolution, Bitcoin, trading bots, artificial neural networks, technical analysis, evolutionary algorithms

Procedia PDF Downloads 123
1281 Glutamine Supplementation and Resistance Training on Anthropometric Indices, Immunoglobulins, and Cortisol Levels

Authors: Alireza Barari, Saeed Shirali, Ahmad Abdi

Abstract:

Introduction: Exercise has contradictory effects on the immune system, and glutamine supplementation may increase the resistance of the immune system in athletes. Glutamine is one of the most recognized immune nutrients: it is a fuel source and a substrate in the synthesis of nucleotides and amino acids, and it is also known to be part of the antioxidant defense. Several studies have shown that improving glutamine levels in plasma and tissues can have beneficial effects on the function of immune cells such as lymphocytes and neutrophils. This study aimed to investigate the effects of resistance training, and of training combined with glutamine supplementation, on the levels of cortisol and immunoglobulins in untrained young men. Previous research shows that physical training can increase cytokines in the athlete's body; glutamine may counteract the negative effects of resistance training on immune function and the stability of the mast cell membrane. Materials and methods: This semi-experimental study was conducted on 30 male non-athletes. They were randomly divided into three groups: control (no exercise), resistance training, and resistance training with glutamine supplementation. Resistance training was applied for 4 weeks, with glutamine supplementation at 0.3 g/kg/day after training. The resistance-training program consisted of eight exercises (leg press, lat pull, chest press, squat, seated row, abdominal crunch, shoulder press, biceps curl and triceps press-down) four times per week. Participants performed 3 sets of 10 repetitions at 60–75% of 1-RM. Anthropometric indices (weight, body mass index, and body fat percentage), maximal oxygen uptake (VO2max), cortisol and immunoglobulin (IgA, IgG, IgM) levels were evaluated pre- and post-test. Results: Four weeks of resistance training, with and without glutamine, caused a significant increase in body weight and BMI and a significant decrease (P < 0.001) in body fat. VO2max also increased in both the exercise (P < 0.05) and exercise-plus-glutamine (P < 0.001) groups, and a significant reduction in IgG (P < 0.05) was observed in both groups. No significant differences were observed in the levels of cortisol, IgA or IgM in any of the groups, no significant change was observed in any parameter in the control group, and no significant differences were observed between the groups. Discussion: Alterations in hormonal and immunological parameters can be used to assess the effect of training overload on the body, whether acute or chronic. The plasma concentration of glutamine has been associated with the functionality of the immune system in individuals submitted to intense physical training. Resistance training has destructive effects on the immune system, and glutamine supplementation could not neutralize the damaging effects of resistance exercise on the immune system.

Keywords: glutamine, resistance training, immunoglobulins, cortisol

Procedia PDF Downloads 479
1280 Flow Field Optimization for Proton Exchange Membrane Fuel Cells

Authors: Xiao-Dong Wang, Wei-Mon Yan

Abstract:

The flow field design in the bipolar plates affects the performance of the proton exchange membrane (PEM) fuel cell. This work adopted a combined optimization procedure, including a simplified conjugate-gradient method and a completely three-dimensional, two-phase, non-isothermal fuel cell model, to look for an optimal flow field design for a single serpentine fuel cell of size 9×9 mm with five channels. For the direct solution, the two-fluid method was adopted to incorporate heat effects using energy equations for the entire cell. The model assumes that the system is steady, the inlet reactants are ideal gases, the flow is laminar, and the porous layers such as the diffusion layer, catalyst layer and PEM are isotropic. The model includes continuity, momentum and species equations for the gaseous species, liquid water transport equations in the channels, gas diffusion layers and catalyst layers, a water transport equation in the membrane, and electron and proton transport equations. The Butler-Volmer equation was used to describe the electrochemical reactions in the catalyst layers. The cell output power density Pcell is maximized subject to an optimal set of channel heights, H1-H5, and channel widths, W2-W5. The basic case with all channel heights and widths set at 1 mm yields Pcell = 7260 W m⁻². The optimal design displays a tapered characteristic for channels 1, 3 and 4, and a diverging characteristic in height for channels 2 and 5, producing Pcell = 8894 W m⁻², an increment of about 22.5%. The reduced heights of channels 2-4 significantly increase the sub-rib convection, and the reduced widths are effective for removing liquid water and enhancing oxygen transport in the gas diffusion layer. The final diverging channel minimizes the leakage of fuel to the outlet via sub-rib convection from channel 4 to channel 5. A near-optimal design that is easily manufactured without a huge loss in cell performance was also tested. The use of a straight final channel of 0.1 mm height led to a 7.37% power loss, while the design with all channel widths set to 1 mm and the optimal channel heights obtained above yields only a 1.68% loss of current density. The presence of a final diverging channel has a greater impact on cell performance than the fine adjustment of channel width under the simulation conditions studied herein.
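For reference, the Butler-Volmer relation mentioned above is commonly written in the following form (the exchange current densities and transfer coefficients actually used in the model are not given in the abstract):

$$i = i_0\left[\exp\!\left(\frac{\alpha_a F\,\eta}{RT}\right) - \exp\!\left(-\frac{\alpha_c F\,\eta}{RT}\right)\right],$$

where i_0 is the exchange current density, α_a and α_c the anodic and cathodic transfer coefficients, η the activation overpotential, F Faraday's constant, R the gas constant, and T the temperature.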

Keywords: optimization, flow field design, simplified conjugate-gradient method, serpentine flow field, sub-rib convection

Procedia PDF Downloads 296
1279 Global Historical Distribution Range of Brown Bear (Ursus arctos)

Authors: Tariq Mahmood, Faiza Lehrasab, Faraz Akrim, Muhammad Sajid Nadeem, Muhammad Mushtaq, Unza Waqar, Ayesha Sheraz, Shaista Andleeb

Abstract:

The brown bear (Ursus arctos), a member of the family Ursidae, is distributed across a wide range of habitats in North America, Europe and Asia. The global distribution range of brown bears is suspected to be decreasing at the moment due to various factors. The species is categorized as 'Least Concern' globally by the IUCN Red List of Threatened Species. However, there are some small, fragmented populations that are on the verge of extinction, as in Pakistan, where the species is listed as 'Critically Endangered', with a declining population trend. Importantly, the global historical distribution range of the brown bear is undocumented. Therefore, in the current study, we reconstructed and estimated the historical distribution range of the brown bear using QGIS software and also analyzed the network of protected areas in the past and current ranges of the species. Results showed that the brown bear was more widely distributed in historic times, encompassing a 52.6 million km² area compared to the current distribution of 38.8 million km², amounting to a total range contraction of up to approximately 28%. In the past, a total of N = 62,234 protected areas, covering approximately 3.89 million km², were present in the distribution range of the species, while now a total of N = 33,313 protected areas, covering approximately 2.75 million km², are present in the current distribution range of the brown bear. The brown bear distribution range within protected areas has thus contracted by 1.15 million km², a total reduction of 29% in protected-area coverage.

Keywords: brown bear, historic distribution, range contraction, protected areas

Procedia PDF Downloads 60
1278 Genetic Diversity and Variation of Nigerian Pigeon (Columba livia domestica) Populations Based on the Mitochondrial Coi Gene

Authors: Foluke E. Sola-Ojo, Ibraheem A. Abubakar, Semiu F. Bello, Isiaka H. Fatima, Sule Bisola, Adesina M. Olusegun, Adeniyi C. Adeola

Abstract:

The domesticated pigeon, Columba livia domestica, has many valuable characteristics, including high nutritional value and a fast growth rate. There is a lack of information on its genetic diversity in Nigeria; thus, the genetic variability in mitochondrial cytochrome oxidase subunit I (COI) sequences of 150 domestic pigeons from four different locations was examined. Three haplotypes (HT) were identified in the Nigerian populations; the most common haplotype, HT1, was shared with wild and domestic pigeons from Europe, America, and Asia, while HT2 and HT3 were unique to Nigeria. The overall haplotype diversity was 0.052 ± 0.025, and nucleotide diversity was 0.026 ± 0.068 across the four investigated populations. The phylogenetic tree showed significant clustering and a genetic relationship of Nigerian domestic pigeons with other global pigeons. The median-joining network showed a star-like pattern suggesting population expansion. AMOVA results indicated that genetic variation in Nigerian pigeons mainly occurred within populations (99.93%), while the neutrality test results suggested that the Nigerian domestic pigeon population experienced a recent expansion. This study showed low genetic diversity and population differentiation among Nigerian domestic pigeons, consistent with a relatively conserved COI sequence with few polymorphic sites. Furthermore, the COI gene could serve as a candidate molecular marker to investigate the genetic diversity and origin of pigeon species. The current data are insufficient for further conclusions; therefore, more research evidence from multiple molecular markers is required.
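For reference, haplotype (gene) diversity of the kind reported above is conventionally estimated with Nei's formula; the abstract does not state the exact estimator used, so this is the standard form:

$$H_d = \frac{n}{n-1}\left(1 - \sum_{i} p_i^{2}\right),$$

where n is the number of sequences sampled and p_i the frequency of the i-th haplotype.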

Keywords: Nigeria pigeon, COI, genetic diversity, genetic variation, conservation

Procedia PDF Downloads 195
1277 Role of Grey Scale Ultrasound Including Elastography in Grading the Severity of Carpal Tunnel Syndrome - A Comparative Cross-sectional Study

Authors: Arjun Prakash, Vinutha H., Karthik N.

Abstract:

BACKGROUND: Carpal tunnel syndrome (CTS) is a common entrapment neuropathy with an estimated prevalence of 0.6-5.8% in the general adult population. It is caused by compression of the median nerve (MN) at the wrist as it passes through a narrow osteofibrous canal. Presently, the diagnosis is established by clinical symptoms and physical examination, and nerve conduction study (NCS) is used to assess its severity. However, NCS is considered painful, time-consuming and expensive, with a false-negative rate between 16 and 34%. Ultrasonography (USG) is now increasingly used as a diagnostic tool in CTS due to its non-invasive nature, greater accessibility and relatively low cost. Elastography is a newer USG modality which helps to assess the stiffness of tissues; however, there is limited literature on its application to peripheral nerves. OBJECTIVES: Our objectives were to measure the cross-sectional area (CSA) and elasticity of the MN at the carpal tunnel using grey-scale ultrasonography (USG), strain elastography (SE) and shear wave elastography (SWE). We also attempted to independently evaluate the role of grey-scale USG, SE and SWE in grading the severity of CTS, keeping NCS as the gold standard. MATERIALS AND METHODS: After approval from the Institutional Ethics Review Board, we conducted a comparative cross-sectional study over a period of 18 months. The participants were divided into two groups: Group A consisted of 54 patients with clinically diagnosed CTS who underwent NCS, and Group B consisted of 50 controls without any clinical symptoms of CTS. All ultrasound examinations were performed on a SAMSUNG RS 80 EVO ultrasound machine with a 2-9 MHz linear probe. In both groups, the CSA of the MN was measured on grey-scale USG, and its elasticity was measured at the carpal tunnel (in terms of strain ratio and shear modulus). The variables were compared between the groups using the independent t-test, and subgroup analyses were performed using one-way analysis of variance. Receiver operating characteristic curves were used to evaluate the diagnostic performance of each variable. RESULTS: The mean CSA of the MN was 13.60 ± 3.201 mm² and 9.17 ± 1.665 mm² in Group A and Group B, respectively (p < 0.001). The mean SWE was 30.65 ± 12.996 kPa and 17.33 ± 2.919 kPa in Group A and Group B, respectively (p < 0.001), and the mean strain ratio was 7.545 ± 2.017 and 5.802 ± 1.153 in Group A and Group B, respectively (p < 0.001). CONCLUSION: The combined use of grey-scale USG, SE and SWE is extremely useful in grading the severity of CTS and can be used as a painless and cost-effective alternative to NCS. Early diagnosis and grading of CTS and effective treatment are essential to avoid permanent nerve damage and functional disability.
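A minimal sketch of the group comparison and ROC analysis described above; the file names and the use of CSA as the test variable are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical arrays: median-nerve CSA (mm^2) for patients (Group A) and controls (Group B)
csa_patients = np.loadtxt("csa_group_a.txt")
csa_controls = np.loadtxt("csa_group_b.txt")

# Independent-samples t-test between the two groups
t_stat, p_value = stats.ttest_ind(csa_patients, csa_controls)

# ROC analysis of CSA as a diagnostic variable (1 = CTS, 0 = control)
labels = np.concatenate([np.ones_like(csa_patients), np.zeros_like(csa_controls)])
values = np.concatenate([csa_patients, csa_controls])
fpr, tpr, thresholds = roc_curve(labels, values)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, AUC = {roc_auc_score(labels, values):.3f}")
```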

Keywords: carpal tunnel, ultrasound, elastography, nerve conduction study

Procedia PDF Downloads 101
1276 Detection of Alzheimer's Protein on Nano Designed Polymer Surfaces in Water and Artificial Saliva

Authors: Sevde Altuntas, Fatih Buyukserin

Abstract:

Alzheimer’s disease is responsible for irreversible neural damage in parts of the brain. One of the disease markers is the Amyloid-β 1-42 protein, which accumulates in the brain in the form of plaques. The basic problem in detecting the protein is that the low amounts present cannot be detected properly in body fluids such as blood, saliva or urine. To solve this problem, tests like ELISA or PCR are proposed, which are expensive, require specialized personnel and can involve complex protocols. Therefore, Surface-Enhanced Raman Spectroscopy (SERS) is a good candidate for detection of the Amyloid-β 1-42 protein, because this spectroscopic technique can potentially allow even single-molecule detection from liquid and solid surfaces. In addition, the SERS signal can be improved by using nanopatterned surfaces and is specific to the target molecules. In this context, our study proposes to fabricate diagnostic test models that utilize Au-coated nanopatterned polycarbonate (PC) surfaces modified with Thioflavin-T to detect low concentrations of the Amyloid-β 1-42 protein in water and artificial saliva media by enhancement of the protein SERS signal. The nanopatterned PC surface used to enhance the SERS signal was fabricated using Anodic Alumina Membranes (AAM) as a template. It is possible to produce AAMs with different column structures and varying thicknesses depending on voltage and anodization time. After the fabrication process, the pore diameter of the AAMs can be adjusted by treatment with a dilute acid solution. In this study, two different column structures were prepared. After a surface modification to decrease their surface energy, the AAMs were treated with a PC solution. Following solvent evaporation, nanopatterned PC films with tunable pillared structures were peeled off from the membrane surface. The PC film was then modified with Au and Thioflavin-T for the detection of the Amyloid-β 1-42 protein. The protein detection studies were first conducted in water via this biosensor platform. The same measurements were then conducted in artificial saliva to detect the presence of the Amyloid-β 1-42 protein. SEM, SERS and contact angle measurements were carried out for the characterization of the different surfaces and further demonstration of protein attachment. SERS enhancement factor calculations were also completed from the experimental results. As a result, our research group fabricated diagnostic test models that utilize Au-coated nanopatterned polycarbonate (PC) surfaces modified with Thioflavin-T to detect low concentrations of Alzheimer’s Amyloid-β protein in water and artificial saliva media. This work was supported by The Scientific and Technological Research Council of Turkey (TUBITAK) Grant No: 214Z167.
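
The abstract mentions SERS enhancement-factor calculations; a commonly used definition is EF = (I_SERS/N_SERS)/(I_ref/N_ref). The minimal sketch below assumes this standard formula, with purely hypothetical intensities and molecule counts, and is not the authors' exact calculation.

```python
# Hedged sketch of a standard SERS enhancement-factor estimate.
# All numerical values are hypothetical placeholders.
def sers_enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """EF = (I_SERS / N_SERS) / (I_ref / N_ref)."""
    return (i_sers / n_sers) / (i_ref / n_ref)

# Hypothetical peak intensities and estimated molecule numbers under the laser spot
print(f"EF ~ {sers_enhancement_factor(5.0e4, 1.0e6, 2.0e3, 1.0e10):.2e}")
```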

Keywords: alzheimer, anodic aluminum oxide, nanotopography, surface enhanced Raman spectroscopy

Procedia PDF Downloads 291
1275 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated against 10 virtual sediment (VS) samples with known source contributions, using the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%) and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique to quantify the sources of sediment in catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
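
A minimal sketch of a GLUE-style evaluation of a three-source mixing model is given below. The tracer signatures, likelihood measure and behavioural threshold are illustrative assumptions and do not reproduce the study's geochemical data or its exact GLUE configuration.

```python
# Hedged sketch: GLUE-style Monte Carlo evaluation of a three-source tracer mixing model.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical mean tracer signatures: rows = western, central, eastern sub-basins; cols = 4 tracers
sources = np.array([[1.2, 0.8, 3.1, 0.4],
                    [0.9, 1.5, 2.2, 0.7],
                    [2.0, 0.6, 4.0, 0.3]])
target = np.array([1.7, 0.75, 3.6, 0.36])   # hypothetical target sediment sample

n = 100_000
props = rng.dirichlet(np.ones(3), size=n)    # random source proportions summing to 1
pred = props @ sources                       # predicted mixture signatures
rmse = np.sqrt(((pred - target) ** 2).mean(axis=1))

likelihood = 1.0 / rmse                                     # simple inverse-error likelihood
behavioural = likelihood > np.quantile(likelihood, 0.95)    # keep the best 5% as "behavioural"
w = likelihood[behavioural] / likelihood[behavioural].sum()
mean_contrib = (props[behavioural] * w[:, None]).sum(axis=0)
print("Mean source contributions (W, C, E):", mean_contrib.round(2))
```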

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 74
1274 Graph-Oriented Summary for Optimized Resource Description Framework Graph Stream Processing

Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais

Abstract:

Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of the RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within the streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) in order to extract their needs and group them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the most expected approximate query results on summarized graphs compared to the source ones.
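
The sketch below illustrates, under simplifying assumptions, how closeness centrality can be used to rank RDF nodes for inclusion in a summary graph. The Turtle snippet, the undirected projection of the triples and the centrality threshold are hypothetical and are not the authors' implementation or their extended measure.

```python
# Hedged sketch: rank RDF nodes by closeness centrality to pick "informative" parts
# of a streamed graph. Toy data and a plain (non-extended) centrality measure.
import networkx as nx
from rdflib import Graph

turtle = """
@prefix ex: <http://example.org/> .
ex:sensor1 ex:locatedIn ex:station1 .
ex:sensor1 ex:measures ex:temperature .
ex:sensor2 ex:locatedIn ex:station1 .
ex:station1 ex:partOf ex:network1 .
"""
rdf = Graph().parse(data=turtle, format="turtle")

# Project subjects/objects onto an undirected graph; predicates label the edges
g = nx.Graph()
for s, p, o in rdf:
    g.add_edge(s, o, predicate=p)

closeness = nx.closeness_centrality(g)
# Keep only nodes above a chosen centrality threshold in the summary
summary_nodes = {n for n, c in closeness.items() if c >= 0.5}
print(sorted(str(n) for n in summary_nodes))
```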

Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query

Procedia PDF Downloads 203
1273 Applying Concurrent Development Process for the Web Using Aspect-Oriented Approach

Authors: Hiroaki Fukuda

Abstract:

This paper presents a concurrent development process for modern web applications, called Rich Internet Applications (RIAs), and describes its effect using a non-trivial application development. In recent years, RIA technologies such as Ajax and Flex have become popular, driven mainly by high-speed networks. An RIA provides sophisticated interfaces and user experiences; therefore, the development of an RIA requires two kinds of engineer: a developer who implements the business logic, and a designer who designs the interface and experiences. Although collaborative work is becoming important for the development of RIAs, shared resources such as source code make it difficult. For example, if the design of an interface is modified after developers have finished the business logic implementation, they need to repeat the same implementations and also the tests that verify the application's behavior. MVC architecture and object-oriented programming (OOP) enable dividing an application into modules such as interfaces and logic; however, developers and/or designers have to write pieces of code (e.g., event handlers) that make these modules work together as an application. On the other hand, aspect-oriented programming (AOP) is expected to address the complexity of modern application software development. AOP provides methods to separate crosscutting concerns, which appear as pieces of code scattered across primary concerns. In this paper, we provide a concurrent development process for RIAs by introducing the AOP concept. This process makes it possible to reduce the resources shared between developers and designers, so that they can perform their tasks concurrently. In addition, we describe our experience developing a practical application with the proposed development process to demonstrate its applicability.
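
As a language-agnostic illustration of the idea (the paper targets RIA toolchains such as Flex, not Python), the following decorator-based sketch shows how event wiring, a crosscutting concern, can be woven around pure business logic so that developers and designers touch separate artifacts. The registry name and event identifiers are hypothetical.

```python
# Hedged sketch: AOP-flavoured separation of UI event wiring from business logic.
from functools import wraps

EVENT_BINDINGS = {}  # hypothetical registry the "designer side" can edit freely

def bind_event(event_name):
    """Aspect: register a logic function as the handler for a named UI event."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(f"[aspect] handling UI event '{event_name}'")  # advice before the join point
            return func(*args, **kwargs)
        EVENT_BINDINGS[event_name] = wrapper
        return wrapper
    return decorator

# Developer side: pure business logic, unaware of how the UI is wired
@bind_event("searchButton.click")
def run_search(query):
    return [item for item in ("alpha", "beta", "gamma") if query in item]

# Designer/runtime side: dispatch an event by name only
print(EVENT_BINDINGS["searchButton.click"]("gam"))
```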

Keywords: aspect-oriented programming, concurrent, development process, rich internet application

Procedia PDF Downloads 301
1272 Expression Profiling and Immunohistochemical Analysis of Squamous Cell Carcinoma of Head and Neck (Tumor, Transition Zone, Normal) by Whole Genome Scale Sequencing

Authors: Veronika Zivicova, Petr Broz, Zdenek Fik, Alzbeta Mifkova, Jan Plzak, Zdenek Cada, Herbert Kaltner, Jana Fialova Kucerova, Hans-Joachim Gabius, Karel Smetana Jr.

Abstract:

The possibility to determine genome-wide expression profiles of cells and tissues opens a new level of analysis in the quest to define dysregulation in malignancy and thus identify new tumor markers. Toward this long-term aim, we here address two issues on this level for head and neck cancer specimens: i) defining profiles in different regions, i.e., the tumor, the transition zone and the normal control, and ii) comparing complete data sets for seven individual patients. Special focus in the flanking immunohistochemical part is given to adhesion/growth-regulatory galectins that upregulate chemo- and cytokine expression in an NF-κB-dependent manner, to these regulators, and to markers of differentiation, i.e., keratins. The detailed listing of up- and down-regulations, also available in printed form (1), not only served to unveil new candidates for testing as markers but also made the impact of the tumor on the transition zone apparent. The extent of interindividual variation raises a strong cautionary note against assuming uniformity of regulatory events, to be noted when considering therapeutic implications. Thus, a combination of test targets (and a network analysis for galectins and their downstream effectors) is advised prior to reaching conclusions on further perspectives.
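
A minimal sketch of the region-wise comparison step is shown below: log2 fold changes of a few genes across tumor, transition zone and normal samples. The gene symbols, counts and pseudocount are illustrative placeholders, not the study's expression data or its actual pipeline.

```python
# Hedged sketch: log2 fold-change comparison across tumor, transition zone and normal regions.
import numpy as np
import pandas as pd

expr = pd.DataFrame(
    {"tumor": [950, 40, 300], "transition": [400, 90, 280], "normal": [120, 200, 260]},
    index=["LGALS1", "KRT10", "ACTB"],  # e.g. a galectin, a keratin, a housekeeping gene
)
pseudo = 1.0  # pseudocount to avoid log of zero
fc = pd.DataFrame({
    "tumor_vs_normal": np.log2((expr["tumor"] + pseudo) / (expr["normal"] + pseudo)),
    "transition_vs_normal": np.log2((expr["transition"] + pseudo) / (expr["normal"] + pseudo)),
})
print(fc.round(2))
```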

Keywords: galectins, genome scale sequencing, squamous cell carcinoma, transition zone

Procedia PDF Downloads 239
1271 Application of Neutron Stimulated Gamma Spectroscopy for Soil Elemental Analysis and Mapping

Authors: Aleksandr Kavetskiy, Galina Yakubova, Nikolay Sargsyan, Stephen A. Prior, H. Allen Torbert

Abstract:

Determining soil elemental content and distribution (mapping) within a field are key features of modern agricultural practice. While traditional chemical analysis is a time-consuming and labor-intensive multi-step process (e.g., sample collection, transport to the laboratory, physical preparation, and chemical analysis), neutron-gamma soil analysis can be performed in situ. This analysis is based on the registration of gamma rays emitted from nuclei upon interaction with neutrons. Soil elements such as Si, C, Fe, O, Al, K, and H (moisture) can be assessed with this method. Data received from the analysis can be used directly for creating soil elemental distribution maps (based on ArcGIS software) suitable for agricultural purposes. The neutron-gamma analysis system developed for field application consisted of an MP320 neutron generator (Thermo Fisher Scientific, Inc.), 3 sodium iodide gamma detectors (SCIONIX, Inc.) with a total volume of 7 liters, 'split electronics' (XIA, LLC), a power system, and an operational computer. Paired with GPS, this system can be used in scanning mode to acquire gamma spectra while traversing a field. Using the acquired spectra, soil elemental content can be calculated. These data can be combined with geographical coordinates in a geographical information system (i.e., ArcGIS) to produce elemental distribution maps suitable for agricultural purposes. Special software has been developed that acquires gamma spectra, processes and sorts the data, calculates soil elemental content, and combines these data with measured geographic coordinates to create soil elemental distribution maps. For example, 5.5 hours was needed to acquire the data necessary for creating a carbon distribution map of an 8.5 ha field. This paper will briefly describe the physics behind the neutron-gamma analysis method, the physical construction of the measurement system, and its main characteristics and modes of operation when conducting field surveys. Soil elemental distribution maps resulting from field surveys will be presented and discussed. Maps created by neutron-gamma analysis were similar to maps created on the basis of chemical analysis and to soil moisture maps derived from soil electrical conductivity measurements, and they were reproducible as well. Based on these facts, it can be asserted that neutron-stimulated soil gamma spectroscopy paired with a GPS system is fully applicable for agricultural soil elemental field mapping.
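
The data-handling step described above (pairing element estimates derived from gamma spectra with GPS coordinates for import into ArcGIS) could look roughly like the following sketch. The column names, coordinates and element values are hypothetical placeholders, and the authors' in-house software is not reproduced here.

```python
# Hedged sketch: merge gamma-spectroscopy-derived element estimates with GPS coordinates
# into a point table that ArcGIS can ingest as XY data (CSV). Values are placeholders.
import pandas as pd

spectra = pd.DataFrame({
    "timestamp": ["2023-05-01T10:00:05", "2023-05-01T10:00:15"],
    "carbon_pct": [1.8, 2.1],    # hypothetical carbon content derived from each spectrum
    "silicon_pct": [28.5, 27.9],
})
gps = pd.DataFrame({
    "timestamp": ["2023-05-01T10:00:05", "2023-05-01T10:00:15"],
    "lat": [32.4301, 32.4303],
    "lon": [-85.7100, -85.7102],
})

points = spectra.merge(gps, on="timestamp")                 # pair each spectrum with its position
points.to_csv("soil_carbon_points.csv", index=False)        # import into ArcGIS for interpolation
```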

Keywords: ArcGIS mapping, neutron gamma analysis, soil elemental content, soil gamma spectroscopy

Procedia PDF Downloads 134
1270 Body Fluids Identification by Raman Spectroscopy and Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry

Authors: Huixia Shi, Can Hu, Jun Zhu, Hongling Guo, Haiyan Li, Hongyan Du

Abstract:

The identification of human body fluids during forensic investigations is a critical step in determining key details and presenting strong evidence in a criminal case. With the widespread use of DNA profiling and improved detection technology, several questions must still be resolved: whether a suspect's DNA derived from saliva or semen, or from menstrual or peripheral blood; how to establish that a red substance or an aged trace found at the scene is blood; and how to determine who contributed which fluid in mixed stains. In recent years, molecular approaches based on mRNA, miRNA, DNA methylation and microbial markers have been developed increasingly, but they are expensive, time-consuming, and destructive. Physicochemical methods such as scanning electron microscopy/energy-dispersive spectroscopy and X-ray fluorescence are also used frequently, but their results show only one or two characteristics of a body fluid itself and fail for unknown or mixed body fluid stains. This paper focuses on using two chemical methods, Raman spectroscopy and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, to discriminate peripheral blood, menstrual blood, semen, saliva, vaginal secretions, urine and sweat. Firstly, the non-destructive, confirmatory, convenient and fast Raman spectroscopy method, combined with the more accurate matrix-assisted laser desorption/ionization time-of-flight mass spectrometry method, can fully distinguish one body fluid from the others. Secondly, 11 spectral signatures and specific metabolic molecules were obtained from the analysis of 70 samples. Thirdly, the Raman results showed that peripheral and menstrual blood, and saliva and vaginal secretions, have highly similar spectroscopic features; advanced statistical analysis of the multiple Raman spectra is therefore required to classify one from another. On the other hand, it appears that lactic acid can differentiate peripheral from menstrual blood when detected by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, but as it is not a specific metabolic molecule, more specific ones will be analyzed in a follow-up study. These results demonstrate the great potential of the developed chemical methods for forensic applications, although more work is needed for method validation.
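
The "advanced statistical analysis" of Raman spectra mentioned above is commonly implemented as a chemometric pipeline such as PCA followed by a linear classifier. The sketch below assumes such a generic workflow with synthetic spectra; it is not the authors' actual analysis.

```python
# Hedged sketch: generic chemometric classification of spectra (PCA + LDA) with synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_per_class, n_wavenumbers = 10, 500
classes = ["peripheral_blood", "menstrual_blood", "saliva", "vaginal_secretion"]
# Synthetic stand-ins for preprocessed Raman spectra (one row per sample)
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_wavenumbers))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

model = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```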

Keywords: body fluids, identification, Raman spectroscopy, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry

Procedia PDF Downloads 137