Search results for: algorithm techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9867

2877 Classification of Barley Varieties by Artificial Neural Networks

Authors: Alper Taner, Yesim Benal Oztekin, Huseyin Duran

Abstract:

In this study, an Artificial Neural Network (ANN) was developed to classify barley varieties. For this purpose, the physical properties of the barley varieties were determined and ANN techniques were used. The physical properties of 8 barley varieties grown in Turkey, namely thousand kernel weight, geometric mean diameter, sphericity, kernel volume, surface area, bulk density, true density, porosity and colour parameters of grain, were determined, and these properties were found to be statistically significant with respect to variety. Three ANN models, N-1, N-2 and N-3, were constructed, and their performances were compared. The best-fit model was determined to be N-1. The N-1 model was designed with 11 neurons in the input layer, 2 hidden layers and 1 neuron in the output layer. Thousand kernel weight, geometric mean diameter, sphericity, kernel volume, surface area, bulk density, true density, porosity and colour parameters of grain were used as input parameters, and variety as the output parameter. R2, Root Mean Square Error and Mean Error for the N-1 model were found to be 99.99%, 0.00074 and 0.009%, respectively. All results obtained by the N-1 model were quite consistent with the real data. With this model, it would be possible to construct automation systems for classification and cleaning in flour mills.
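The classification pipeline described above can be sketched with a small multilayer perceptron. The synthetic data, layer sizes, and library choice below are illustrative assumptions, not the authors' actual N-1 implementation:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for the 11 measured properties (thousand kernel weight,
# geometric mean diameter, ..., colour parameters) of 8 barley varieties.
n_per_variety, n_features, n_varieties = 30, 11, 8
centers = rng.normal(0.0, 5.0, size=(n_varieties, n_features))
X = np.vstack([c + rng.normal(0.0, 0.5, size=(n_per_variety, n_features))
               for c in centers])
y = np.repeat(np.arange(n_varieties), n_per_variety)

# An 11-input, two-hidden-layer, single-output-layer topology like N-1.
scaler = StandardScaler().fit(X)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X), y)
acc = clf.score(scaler.transform(X), y)
```

On well-separated synthetic clusters such a model fits almost perfectly; the near-100% R2 reported above suggests similarly separable physical measurements.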

Keywords: physical properties, artificial neural networks, barley, classification

Procedia PDF Downloads 178
2876 Exploring the Connectedness of Ad Hoc Mesh Networks in Rural Areas

Authors: Ibrahim Obeidat

Abstract:

Reaching a fully-connected network of mobile nodes in rural areas has received great attention among network researchers. This attention arose from the complexity and high cost of setting up the needed infrastructure for these networks, in addition to the low transmission range these nodes have. Terranet technology, as an example, employs an ad-hoc mesh network where each node has a transmission range not exceeding one kilometer; this means that two nodes are able to communicate with each other only if they are within one kilometer of each other, otherwise a third party plays the role of a "relay". In Terranet, as an idea to reduce network setup cost, every node in the network acts as a router responsible for forwarding data between other nodes, which results in a decentralized collaborative environment. Most research on Terranet presents ideas on how to encourage mobile nodes to become more cooperative by keeping their devices in the "ON" state as long as possible while accepting to play the role of relay (router). This research addresses the issue of finding the percentage of nodes in an ad-hoc mesh network within rural areas that should play the role of relay at every time slot, relative to the actual area coverage of the nodes, in order for the network to reach full connectivity. To the best of our knowledge, no prior research has discussed this issue. The research is carried out with an implementation that builds an adjacency matrix as an indicator of the connectivity between network members. This matrix is continually updated until each value in it refers to the number of hops that must be followed to reach from one node to another.
After repeating the algorithm on different area sizes, different coverage percentages for each size, and different relay percentages several times, the extracted results show that for area coverage below 5% we need 40% of the nodes to be relays, whereas 10% is enough for areas with node coverage greater than 5%.
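The repeatedly-updated adjacency matrix described above can be sketched as a hop-count matrix computed by breadth-first search from each node; this is a minimal illustration, not the authors' simulator:

```python
import numpy as np
from collections import deque

def hop_matrix(adj):
    """Given a 0/1 adjacency matrix, return a matrix whose (i, j) entry is
    the number of hops on a shortest path from i to j (np.inf if j is
    unreachable from i)."""
    n = len(adj)
    hops = np.full((n, n), np.inf)
    for s in range(n):
        hops[s, s] = 0
        q = deque([s])
        while q:                      # breadth-first expansion from s
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and hops[s, v] == np.inf:
                    hops[s, v] = hops[s, u] + 1
                    q.append(v)
    return hops

def fully_connected(adj):
    """The network is fully connected when every hop count is finite."""
    return bool(np.isfinite(hop_matrix(adj)).all())

# A 4-node chain: node 0 reaches node 3 in 3 hops via two relay nodes.
chain = [[0, 1, 0, 0],
         [1, 0, 1, 0],
         [0, 1, 0, 1],
         [0, 0, 1, 0]]
```

Repeating this computation over randomly placed nodes at varying relay percentages yields the connectivity statistics reported above.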

Keywords: ad-hoc mesh networks, network connectivity, mobile ad-hoc networks, Terranet, adjacency matrix, simulator, wireless sensor networks, peer to peer networks, vehicular Ad hoc networks, relay

Procedia PDF Downloads 282
2875 The Backlift Technique among South African Cricket Players

Authors: Habib Noorbhai

Abstract:

This study primarily aimed to investigate the batting backlift technique (BBT) among semi-professional, professional and current international cricket players. A key question was whether the lateral batting backlift technique (LBBT) is more common at the highest levels of the game. The study sample (n = 130) comprised South African semi-professional players (SP) (n = 69), professional players (P) (n = 49) and South African international professional players (SAI) (n = 12). Biomechanical and video analyses were performed on all participant groups. Classifiers were utilised to identify the batting backlift technique type (BBTT) employed by all batsmen. All statistics and wagon wheels (scoring areas of the batsmen on a cricket field) were sourced online. This study found that an LBBT is more common at the highest levels of cricket batsmanship, with the percentages of the LBBT at the various levels of cricket as follows: SP = 37.7%; P = 38.8%; SAI = 75%; p = 0.001. This study also found that SAI batsmen who used the LBBT were more proficient at scoring runs in various areas around the cricket field (according to the wagon wheel analysis). Cricket coaches should pay attention to the direction of the backlift with players, especially when correlating the backlift to various scoring areas on the cricket field. Further in-depth research is required to fully investigate the change in batting backlift techniques among cricket players over a long-term period.

Keywords: cricket batting, biomechanical analysis, backlift, performance

Procedia PDF Downloads 260
2874 Evaluation and Provenance Studies of Heavy Mineral Deposits in Recent Sediments of Ologe Lagoon, Southwestern Nigeria

Authors: Mayowa Philips Ibitola, Akinade-Solomon Olorunfemi, Abe Oluwaseun Banji

Abstract:

Heavy mineral studies were carried out on eighteen sediment samples from Ologe Lagoon, located in the Lagos Barrier complex, with the aim of evaluating the heavy mineral deposits and determining the provenance of the sediments. The samples were subjected to grain-size analysis techniques in order to collect the finest grain-size fraction. Separation of heavy minerals from the samples was done with the aid of bromoform to enable petrographic analysis of the heavy mineral suite under the polarising microscope. The data obtained from the heavy mineral analysis were used in preparing histograms and pie charts, from which the individual heavy mineral percentage distributions and the ZTR index were derived. The percentage composition of the individual heavy minerals analyzed is: opaque minerals 63.92%, zircon 12.43%, tourmaline 5.79%, rutile 13.44%, garnet 1.74% and staurolite 3.52%. The calculated zircon-tourmaline-rutile (ZTR) index varied between 76.13% and 92.15%; the average garnet-zircon index (GZI), average rutile-zircon index (RuZI) and average staurolite-zircon index values across all the stations are 16.18%, 54.33% and 25.11%, respectively. The mean ZTR index of 85.17% indicates that the sediments within the lagoon are mineralogically mature. The high percentage of zircon, rutile and tourmaline indicates an acid igneous rock source for the sediments. However, the low percentage of staurolite, rutile and garnet occurrence indicates a metamorphic rock source input.
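As a worked example of the maturity index above, the ZTR index is the zircon + tourmaline + rutile share of all transparent (non-opaque) heavy minerals. Applying this to the bulk percentages reported here reproduces a value close to the stated 85.17% mean (the small difference reflects per-station averaging):

```python
def ztr_index(zircon, tourmaline, rutile, other_transparent):
    """ZTR maturity index: zircon + tourmaline + rutile as a percentage of
    all transparent (non-opaque) heavy minerals."""
    ztr = zircon + tourmaline + rutile
    return 100.0 * ztr / (ztr + other_transparent)

# Bulk percentages reported above; garnet and staurolite are the other
# transparent heavies, and opaques are excluded from the index.
ztr = ztr_index(12.43, 5.79, 13.44, other_transparent=1.74 + 3.52)
# ztr is approximately 85.75%, i.e. mineralogically mature sediment
```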

Keywords: lagoon, provenance, heavy mineral, ZTR index

Procedia PDF Downloads 174
2873 Experimental Parameters' Effects on Micro-Electrical Discharge Machining (µEDM) Performance

Authors: Asmae Tafraouti, Yasmina Layouni, Pascal Kleimann

Abstract:

The growing market for Microsystems (MST) and Micro-Electromechanical Systems (MEMS) is driving the search for manufacturing techniques that are alternatives to microelectronics-based technologies, which are generally expensive and time-consuming. Hot embossing and micro-injection molding of thermoplastics appear to be industrially viable processes. However, both require the use of master models, usually made of hard materials such as steel. These master models cannot be fabricated using standard microelectronics processes. Thus, other micromachining processes are used, such as laser machining or micro-electrical discharge machining (µEDM). In this work, µEDM has been used. The principle of µEDM is based on the use of a thin cylindrical micro-tool that erodes the workpiece surface. The two electrodes are immersed in a dielectric with a distance of a few micrometers between them (the gap). When an electrical voltage is applied between the two electrodes, electrical discharges are generated, which machine the material. In order to produce master models with high resolution and smooth surfaces, it is necessary to control the discharge mechanism well. However, several problems are encountered, such as the randomness of the electrical discharge process, the fluctuation of the discharge energy, the inversion of the electrodes' polarity, and the wear of the micro-tool. The effect of different parameters, such as the applied voltage, the working capacitor, the micro-tool diameter and the initial gap, has been studied. This analysis helps to improve machining performance, such as the workpiece surface condition and the lateral gap of the craters.

Keywords: craters, electrical discharges, micro-electrical discharge machining (µEDM), microsystems

Procedia PDF Downloads 96
2872 Ultra-Rapid and Efficient Immunomagnetic Separation of Listeria Monocytogenes from Complex Samples in High-Gradient Magnetic Field Using Disposable Magnetic Microfluidic Device

Authors: L. Malic, X. Zhang, D. Brassard, L. Clime, J. Daoud, C. Luebbert, V. Barrere, A. Boutin, S. Bidawid, N. Corneau, J. Farber, T. Veres

Abstract:

The incidence of infections caused by foodborne pathogens such as Listeria monocytogenes (L. monocytogenes) poses a great potential threat to public health and safety. These issues are further exacerbated by legal repercussions due to “zero tolerance” food safety standards adopted in developed countries. Unfortunately, a large number of related disease outbreaks are caused by pathogens present in extremely low counts currently undetectable by available techniques. The development of highly sensitive and rapid detection of foodborne pathogens is therefore crucial, and requires robust and efficient pre-analytical sample preparation. Immunomagnetic separation is a popular approach to sample preparation. Microfluidic chips combined with external magnets have emerged as viable high throughput methods. However, external magnets alone are not suitable for the capture of nanoparticles, as very strong magnetic fields are required. Devices that incorporate externally applied magnetic field and microstructures of a soft magnetic material have thus been used for local field amplification. Unfortunately, very complex and costly fabrication processes used for integration of soft magnetic materials in the reported proof-of-concept devices would prohibit their use as disposable tools for food and water safety or diagnostic applications. We present a sample preparation magnetic microfluidic device implemented in low-cost thermoplastic polymers using fabrication techniques suitable for mass-production. The developed magnetic capture chip (M-chip) was employed for rapid capture and release of L. monocytogenes conjugated to immunomagnetic nanoparticles (IMNs) in buffer and beef filtrate. The M-chip relies on a dense array of Nickel-coated high-aspect ratio pillars for capture with controlled magnetic field distribution and a microfluidic channel network for sample delivery, waste, wash and recovery. 
The developed nickel-coating process and passivation allow the generation of switchable local perturbations within the uniform magnetic field generated by a pair of permanent magnets placed at the opposite edges of the chip. This leads to a strong and reversible trapping force, wherein high local magnetic field gradients allow efficient capture of IMNs conjugated to L. monocytogenes flowing through the microfluidic chamber. The experimental optimization of the M-chip was performed using commercially available magnetic microparticles and fabricated silica-coated iron-oxide nanoparticles. The fabricated nanoparticles were optimized to achieve the desired magnetic moment, and their surface functionalization was tailored to allow efficient capture-antibody immobilization. The integration, validation and further optimization of the capture and release protocol are demonstrated using both dead and live L. monocytogenes through fluorescence microscopy and the plate-culture method. The capture efficiency of the chip was found to vary as a function of the Listeria-to-nanoparticle concentration ratio. A maximum capture efficiency of 30% was obtained, and the 24-hour plate-culture method allowed the detection of an initial sample concentration of only 16 cfu/ml. The device was also very efficient in concentrating the sample from a 10 ml initial volume. Specifically, 280% concentration efficiency was achieved in only 17 minutes, demonstrating the suitability of the system for food safety applications. In addition, the flexible design and low-cost fabrication process will allow rapid sample preparation for applications beyond food and water safety, including point-of-care diagnosis.

Keywords: array of pillars, bacteria isolation, immunomagnetic sample preparation, polymer microfluidic device

Procedia PDF Downloads 281
2871 Development and Characterization of Synthetic Non-Woven for Sound Absorption

Authors: P. Sam Vimal Rajkumar, K. Priyanga

Abstract:

Acoustics is the scientific study of sound, which includes the effects of reflection, refraction, absorption, diffraction and interference. Sound can be considered a wave phenomenon. A sound wave is a longitudinal wave in which particles of the medium are temporarily displaced in a direction parallel to energy transport and then return to their original position. The vibration in a medium produces alternating waves of relatively dense and sparse particles, compression and rarefaction respectively. The resultant variation to normal ambient pressure is translated by the ear and perceived as sound. Today much importance is given to the acoustical environment. Noise sources increase day by day, and annoying levels are strongly exceeded in different locations by traffic, sound systems, and industries. There is ample evidence that high noise levels cause sleep disturbance, hearing loss, decreases in productivity, learning disability, lower scholastic performance and increases in stress-related hormones and blood pressure. Therefore, achieving a pleasing and noise-free environment is one of the endeavours of many research groups. This can be obtained using various techniques, one of which is the use of suitable materials with good sound-absorbing properties. The conventionally used materials that possess sound-absorbing properties are rock wool or glass wool. In this work, an attempt is made to use synthetic material in both fibrous and sheet form for the manufacture of non-wovens for sound absorption.

Keywords: acoustics, fibre, non-woven, noise, sound absorption properties, sound absorption coefficient

Procedia PDF Downloads 301
2870 Entomological Study of Pests of Olive Trees in the Region of Batna - Algeria

Authors: Smail Chafaa, Abdelkrim Si Bachir

Abstract:

Our work aims to study insect diversity in relation to the bioclimatic levels of pests in olive groves (Olea europaea L.) in the area of Batna (arid and semi-arid northeastern Algeria) during the period from January 2011 to May 2011. Several sampling techniques were used: hunting on sight, visual inspection, pitfall traps, colored traps, the Japanese umbrella and the sweep net. We identified, in total, 2311 individuals, resulting in an inventory of 206 species divided into 74 families and 11 orders, of which the order Coleoptera is quantitatively the most represented, with 47.1%. The most dominant diet in our inventory is phytophagy. Among the herbivorous insects that we have listed, the main pests of olive cultivation are the olive fly (Bactrocera oleae), the violet olive scale (Parlatoria oleae), the olive psyllid (Euphyllura olivina) and the olive thrips (Liothrips oleae). The distribution of species between stations shows that the Boumia station has the highest number of species (113) compared to the other stations, and beetles are also well represented in all three groves. Total species richness is highest at the Boumia station compared with the other stations. The values of H' exceeding 3.9 bits for all the stations studied indicate a high species richness and a diversity of ecological niches among the insect species. The values of equitability are near unity, suggesting a balance between the numbers of the insect populations sampled in the various stations.
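The Shannon diversity H' (in bits) and the equitability reported above can be computed as follows; the species counts here are hypothetical, for illustration only:

```python
import math

def shannon_index(counts, base=2):
    """Shannon diversity H' in bits (base 2): -sum(p_i * log(p_i))."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n, base) for c in counts if c > 0)

def equitability(counts, base=2):
    """Pielou's evenness J' = H' / log(S) for S observed species;
    values near 1 indicate balanced populations."""
    s = sum(1 for c in counts if c > 0)
    return shannon_index(counts, base) / math.log(s, base)

# Hypothetical abundances for five species at one station.
counts = [30, 25, 20, 15, 10]
h = shannon_index(counts)   # diversity in bits
j = equitability(counts)    # evenness in (0, 1]
```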

Keywords: entomology, olive grove, Batna, Algeria

Procedia PDF Downloads 343
2869 Effects of Cacao Agroforestry and Landscape Composition on Farm Biodiversity and Household Dietary Diversity

Authors: Marlene Yu Lilin Wätzold, Wisnu Harto Adiwijoyo, Meike Wollni

Abstract:

Land-use conversion from tropical forests to cash crop production in the form of monocultures has drastic consequences for biodiversity. Meanwhile, high dependence on cash crop production is often associated with a decrease in other food crop production, thereby affecting household dietary diversity. Additionally, deforestation has been found to reduce households' dietary diversity, as forests often offer various food sources. Agroforestry systems are seen as a potential solution to improve local biodiversity as well as to provide a range of provisioning ecosystem services, such as timber and other food crops. While a number of studies have analyzed the effects of agroforestry on biodiversity, as well as on household livelihood indicators, little is understood about potential trade-offs or synergies between the two. This interdisciplinary study aims to fill this gap by assessing cacao agroforestry's role in enhancing local bird diversity, as well as farm household dietary diversity. Additionally, we will take a landscape perspective and investigate in what ways the landscape composition, such as the proximity to forests and forest patches, is able to contribute to local bird diversity, as well as to households' dietary diversity. Our study will take place in two agro-ecological zones in Ghana, based on household surveys of 500 cacao farm households. Using a subsample of 120 cacao plots, we will assess the degree of shade tree diversity and density using drone flights and a computer vision tree detection algorithm. Bird density and diversity will be assessed using sound recordings that will be kept in the cacao plots for 24 hours. Landscape composition will be assessed via remote sensing images. The results of our study are of high importance, as they will allow us to understand the effects of agroforestry and landscape composition in simultaneously improving ecosystem services.

Keywords: agroforestry, biodiversity, landscape composition, nutrition

Procedia PDF Downloads 113
2868 Magneto-Thermo-Mechanical Analysis of Electromagnetic Devices Using the Finite Element Method

Authors: Michael G. Pantelyat

Abstract:

The fundamentals of pure and applied research in the area of magneto-thermo-mechanical numerical analysis and the design of innovative electromagnetic devices (modern induction heaters, novel thermoelastic actuators, rotating electrical machines, induction cookers, electrophysical devices) are elaborated. Mathematical models of magneto-thermo-mechanical processes in electromagnetic devices, taking into account the main interactions between the interrelated phenomena, are developed. In addition, a graphical representation of the coupled (multiphysics) phenomena under consideration is proposed, and numerical techniques for the solution of nonlinear problems are developed. On this basis, effective numerical algorithms for the solution of actual problems of practical interest are proposed, validated and implemented in the applied 2D and 3D computer codes developed. Many applied problems of practical interest regarding modern electrical engineering devices are numerically solved. Investigations of the influence of various interrelated physical phenomena (temperature dependence of material properties, thermal radiation, conditions of convective heat transfer, contact phenomena, etc.) on the accuracy of the electromagnetic, thermal and structural analyses are conducted. Important practical recommendations on the choice of rational structures, materials and operation modes for the electromagnetic devices under consideration are proposed and implemented in industry.
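As one tiny illustration of the nonlinear numerical techniques such coupled analyses require, the sketch below solves a 1-D steady heat-conduction problem with temperature-dependent conductivity by Picard (fixed-point) iteration on a finite-difference grid; it is a generic textbook example, not the paper's 2D/3D codes:

```python
import numpy as np

n = 21                                   # grid nodes on x in [0, 1]
T = np.linspace(0.0, 100.0, n)           # initial guess; BCs T(0)=0, T(1)=100
k0, beta = 1.0, 0.01                     # conductivity k(T) = k0 * (1 + beta*T)

for _ in range(20000):
    k = k0 * (1.0 + beta * T)            # freeze k at the previous iterate
    k_face = 0.5 * (k[:-1] + k[1:])      # conductivity at the cell faces
    T_new = T.copy()
    # Jacobi update of the interior nodes for d/dx(k dT/dx) = 0
    T_new[1:-1] = (k_face[:-1] * T[:-2] + k_face[1:] * T[2:]) \
                  / (k_face[:-1] + k_face[1:])
    if np.max(np.abs(T_new - T)) < 1e-12:
        T = T_new
        break
    T = T_new
# Because k grows with T, the converged profile bows above the linear one.
```

The same freeze-and-resolve idea extends to the temperature dependence of magnetic and mechanical properties in fully coupled solvers.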

Keywords: electromagnetic devices, multiphysics, numerical analysis, simulation and design

Procedia PDF Downloads 386
2867 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, so that they can be incorporated in analysis and design for a given prospect evaluation, would be a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence, rooted in statistics, which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. Subsequently, the stages of the model workflow methodology are described. To train the data models and deploy the LoK models, an ML platform has been implemented: IBM Watson Studio, a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
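A three-dimensional risk matrix with an LoK axis could look like the following sketch. The scoring formula, factor values, and class thresholds are illustrative assumptions, since the paper's actual scheme is not reproduced here:

```python
# Likelihood and consequence on 1-5 scales form the classic 2-D matrix;
# the third axis, level of knowledge (LoK), inflates the score when
# knowledge is poor, so poorly understood hazards are escalated.
LOK_FACTOR = {"high": 1.0, "medium": 1.25, "low": 1.5}

def risk_score(likelihood, consequence, lok):
    """likelihood and consequence on 1-5 scales; lok keys LOK_FACTOR."""
    return likelihood * consequence * LOK_FACTOR[lok]

def risk_class(score):
    """Map a numeric score to a risk class (thresholds are assumptions)."""
    if score < 5:
        return "low"
    if score < 12:
        return "moderate"
    if score < 20:
        return "high"
    return "extreme"

# Same hazard, different knowledge: low LoK escalates the class.
well_known = risk_class(risk_score(3, 3, "high"))   # 9.0
poorly_known = risk_class(risk_score(3, 3, "low"))  # 13.5
```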

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 274
2866 Community Interpreting in the Process of Asylum Seeking in Brazil

Authors: Fernanda Garcia

Abstract:

With the recent growth in the number of refugees in the world, there has been an exponential increase in requests for asylum in Brazil. When asylum seekers arrive in the country, the government initiates a process to evaluate the case, which serves as grounds to determine the refugee status of the asylum seekers. During this process, an interview takes place where the migrants have the chance to tell their story. The aim of this article is to analyse how community interpreting is conducted in Brazil with regard to asylum seeking, as well as to analyse the role of the interpreter in the context of these official interviews to request refuge in Brazil. We investigate how the presence of an interpreter influences the interview, and more specifically, we study some of the linguistic techniques used by the interpreter to make the interaction more effective, as well as the challenges and difficulties they encounter during the interview. To do so, surveys with the interpreters took place, in addition to on-site observations. The interpreters involved in this research are volunteers in an extracurricular extension programme of the University of Brasilia, in Brazil. Community interpreting is a somewhat new field in Brazil, still facing several obstacles, such as the lack of professional community interpreters. This research illustrates some of these issues and thus has the potential to foster Brazilian literature on the matter, as well as to help understand the role of the interpreter in interviews to seek asylum in Brazil. The refugees' situation in the world is certainly a pressing matter, and the language barrier is an issue of great importance. Hence, translation and interpretation studies have a fundamental role in this area when it comes to contributing to a more inclusive world for those in need.

Keywords: asylum seeking, community interpreting, interviews, refugees

Procedia PDF Downloads 137
2865 Duality of Leagility and Governance: A New Normal Demand Network Management Paradigm under Pandemic

Authors: Jacky Hau

Abstract:

The prevalence of emerging technologies disrupts various industries as well as consumer behavior. Data collection is now at one's fingertips, enabled by Internet-of-Things (IoT) devices. Big data analytics (BDA) becomes possible and allows real-time demand network management (DNM) through a leagile supply chain. To further enhance its resilience and predictability, governance is examined as a means to promote supply chain transparency and trust in an efficient manner. Leagility combines lean thinking and agile techniques in supply chain management. It aims at reducing costs and waste, as well as maintaining responsiveness to volatile consumer demand, by adjusting the decoupling point where the product flow changes from push to pull. Leagility can only be successful when a collaborative planning, forecasting, and replenishment (CPFR) process, or the like, is in place throughout the supply chain's business entities. Governance and procurement of the supply chain, however, are crucial and challenging for the execution of CPFR, as every entity has to walk the talk generously for the sake of the overall benefits of supply chain performance, not to mention the complexity of exercising the policies both within and across the various supply chain business entities on account of organizational behavior and mutual trust. Empirical survey results showed that the effective timespan of demand forecasting has drastically shortened from a planning horizon of months to one of weeks; thus agility should come first, preferably followed by the lean approach, in a timely manner.

Keywords: governance, leagility, procure-to-pay, source-to-contract

Procedia PDF Downloads 111
2864 Hydrogeophysical Investigations of Groundwater Resources and Demarcation of Saltwater-Freshwater Interface in Kilwa Kisiwani Island, SE Tanzania

Authors: Simon R. Melchioly, Ibrahimu C. Mjemah, Isaac M. Marobhe

Abstract:

The main objective of this research was to identify new potential sources of groundwater using geophysical methods, and also to demarcate the saltwater-freshwater interface. Kilwa Kisiwani Island is geologically covered mostly by Quaternary alluvial sediments, sand, and gravel. The geophysical techniques employed during the research include Vertical Electrical Sounding (VES), Earth Resistivity Tomography (ERT), and Transient Electromagnetics (TEM). Two-dimensionally interpolated geophysical results show that there exist freshwater lens formations that are potential aquifers on the island, with resistivity values ranging from 11.68 Ωm to 46.71 Ωm. These freshwater lenses are underlain by a formation with brackish water, in which the resistivity values vary between 1.6 Ωm and 3.89 Ωm. Saltwater with resistivity less than 1 Ωm is found at the bottom, overlain by the brackish saturated formation. The VES resistivity results show that 89% (16 out of 18) of the VES sites are promising for groundwater drilling, while the TEM results indicate that 75% (12 out of 16) of the TEM sites are promising for groundwater borehole drilling. The recommended drilling depths for potential sites on Kilwa Kisiwani Island range from a minimum of 10 m to a maximum of 25 m below ground surface. The aquifer structure on Kilwa Kisiwani Island is a shallow, unconfined freshwater lens floating above the seawater; the maximum thickness of the aquifer is 25 m and the minimum thickness 10 m for a few selected VES and TEM sites.
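The resistivity ranges reported above suggest a simple water-type classifier. The boundary values below are taken directly from the abstract, and readings falling between the brackish and freshwater ranges are deliberately left undecided:

```python
def classify_groundwater(resistivity_ohm_m):
    """Crude water-type classification from apparent resistivity, using the
    ranges reported for Kilwa Kisiwani (boundaries are approximate)."""
    if resistivity_ohm_m < 1.0:
        return "saltwater"
    if resistivity_ohm_m <= 3.89:
        return "brackish"
    if 11.68 <= resistivity_ohm_m <= 46.71:
        return "freshwater lens (potential aquifer)"
    return "indeterminate"
```

Applied down a sounding curve, such a rule reproduces the reported stacking: freshwater lens over brackish water over saltwater.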

Keywords: groundwater, hydrogeophysical, Kilwa Kisiwani, freshwater, saltwater, resistivity

Procedia PDF Downloads 200
2863 Enhancement of Underwater Haze Image with Edge Reveal Using Pixel Normalization

Authors: M. Dhana Lakshmi, S. Sakthivel Murugan

Abstract:

As light passes from source to observer in the water medium, it is scattered by suspended particulate matter. This scattering effect plagues the captured images with non-uniform illumination, blurred details, halo artefacts, weak edges, etc. To overcome this, pixel normalization with an Amended Unsharp Mask (AUM) filter is proposed to enhance the degraded image. To validate the robustness of the proposed technique irrespective of atmospheric light, the considered datasets were collected at two locations. For these images, the maximum and minimum pixel intensity values are computed and the image is normalized; then the AUM filter is applied to strengthen the blurred edges. Finally, the enhanced image is obtained with good illumination and contrast. Thus, the proposed technique removes the effect of scattering (de-hazing) and restores the perceptual information with enhanced edge detail. Both qualitative and quantitative analyses are performed for the images from both locations using the standard no-reference metrics underwater image sharpness measure (UISM) and underwater image quality measure (UIQM), which assess color, sharpness, and contrast. It is observed that the proposed technique shows overwhelming performance compared to other deep-learning-based enhancement networks and traditional techniques, in an adaptive manner.
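A minimal NumPy sketch of the normalization-plus-unsharp-mask idea follows; the AUM filter's exact amendments are not specified in the abstract, so a plain box-blur unsharp mask stands in for it here:

```python
import numpy as np

def normalize(img):
    """Min-max pixel normalization to [0, 1], the first de-hazing step."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img, dtype=float)

def box_blur(img, k=3):
    """Simple mean filter used as the low-pass stage of the unsharp mask."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0, k=3):
    """Normalize, then add back the high-frequency residual to strengthen
    the blurred edges."""
    img = normalize(img.astype(float))
    return np.clip(img + amount * (img - box_blur(img, k)), 0.0, 1.0)
```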

Keywords: underwater drone imagery, pixel normalization, thresholding, masking, unsharp mask filter

Procedia PDF Downloads 195
2862 Microdiamond and Moissanite Inclusions in Garnets from Pohorje Mountains, Eastern Alps, Slovenia

Authors: Mirijam Vrabec, Marian Janak, Bojan Ambrozic, Angelja K. Surca, Nastja Rogan Smuc, Nina Zupancic, Saso Sturm

Abstract:

Natural microdiamonds and moissanite (SiC) can form during the orogenic events under ultrahigh-pressure metamorphic conditions (UHP), when parts of Earth’s crust are subducted to extreme depths. So far, such processes were identified only in few places on the Earth, and therefore, represent unique opportunity to study the evolution of the Earth’s deep interior. An important discovery of microdiamonds and moissanite was reported from Pohorje, (Slovenia), where they occurred as single or polyphase inclusions in garnets. Metasedimentary rocks from Pohorje are predominantly gneisses representing parts of the Austroalpine metamorphic units of the Eastern Alps. During Cretaceous orogeny, (ca. 95–92 Ma) continental crustal rocks were deeply subducted to the mantle depths (below 100 km) and metamorphosed at pressures exceeding 3.5 GPa and temperatures between 800–850 °C. Microstructural and phase analysis of the inclusions as well as detailed elemental analysis of host garnets were carried out combining several analytical techniques: optical microscope in plane polarized transmitted light, electron probe microanalysis (EPMA) with wavelength-dispersive x-ray spectrometry (WDS) and field-emission scanning microscope (FEG-SEM) with energy-dispersive x-ray spectroscopy (EDS). Micro-Raman analysis revealed sharp, first order diamond bands sometimes accompanied by graphite bands implying that transformation of diamond back to graphite occurred. To study the chemical and crystallographic relationship between microdiamonds and co-inclusions, advanced techniques of transmission electron microscopy (TEM) were applied, which included high-angle annular dark-field scanning transmission electron microscopy (HAADF-STEM), combined with EDS and electron energy-loss spectroscopy (EELS). To prepare electron transparent TEM lamellae selectively a dual-beam Focused Ion Beam/SEM (FIB/SEM) was employed. 
Detailed study of a TEM lamella cross-sectioned from a highly faceted inclusion body located within the host garnet crystal matrix revealed a rich and rather complex internal structure. The negative crystal facets of the main inclusion body were typically decorated with an up to 1 μm thick amorphous layer reflecting the general garnet composition, with slight variations in Fe/Ca content. Within these layers, ELNES analysis revealed the presence of a 28–30 nm thick layer of amorphous carbon. The very last section of this layer corresponds to the composition of SiO2. Within the inclusion, besides diamond and moissanite, an aluminosilicate mineral with a pronounced layered structure, iron sulfides, and chlorine were identified under TEM, and CO2 and CH4 were identified using Raman spectroscopy. Moissanite is found either as single crystals or composed of numerous highly textured nano-crystals with an average size of 10 nm. Moissanite inclusions were found embedded inside the amorphous crust, implying that moissanite crystallized well before the deposition of the amorphous layer. From the microstructural, crystallographic, and chemical observations so far, we can deduce that the polyphase inclusions in diamond-bearing garnets from Pohorje most probably crystallized from reduced supercritical fluids. Based on the layered interface structure of the host mineral, a multiphase crystallization process is possible. The presence of microdiamonds and moissanite in rocks from Pohorje demonstrates that these parts of the Eastern Alps were subducted to extreme depths and subsequently exhumed back to the Earth's surface without complete breakdown of the UHP mineral phases, allowing a rare and exceptional opportunity to study them in situ.

Keywords: diamond, fluid inclusions, moissanite, TEM, UHP metamorphism

Procedia PDF Downloads 304
2861 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules

Authors: Michael Naderhirn, Marco Pavone

Abstract:

Problem statement, motivation, and aim of work: So far, control algorithms have been developed by control engineers and shown to fit a specification by testing. When it comes to the certification of an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing properties such as safety and real-time executability. What if this demanding problem could be solved by combining formal verification and system theory? The aim of this work is to present a workflow that does exactly that. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into system specifications for an autonomous car. The language-based specifications are used to define system functions and interfaces, and from them a formal model is developed that correctly captures the specifications. In parallel, a mathematical model describing the system's dynamics is used to calculate the system's reachability set, which in turn determines the system's input boundaries. A motion planning algorithm is then applied inside these boundaries to find an optimized trajectory that satisfies the formal specification model. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and discuss a potential certification.
Originality, significance, and benefit: To the authors' best knowledge, this is the first automated workflow that combines a specification in an English-like language and a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications such as autonomous driving.
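The reachability step described above can be illustrated with a minimal sketch. The matrices, horizon, and input bound below are hypothetical stand-ins (a simple position-velocity model), not the authors' vehicle model; the sketch only shows how axis-aligned interval bounds on the reachable set of a discrete-time linear system x[k+1] = A x[k] + B u[k] with bounded input |u| ≤ u_max can be propagated.

```python
import numpy as np

# Hypothetical discrete-time linear model x[k+1] = A x[k] + B u[k]
# (a position-velocity double integrator, dt = 0.1 s), with |u| <= u_max.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
u_max = 2.0

lo = np.array([0.0, 0.0])  # initial set: the single point x = 0
hi = np.array([0.0, 0.0])
for _ in range(20):        # propagate interval bounds over 20 steps
    center = A @ ((lo + hi) / 2.0)
    # interval arithmetic: |A| maps the half-widths, |B| u_max adds input effect
    radius = np.abs(A) @ ((hi - lo) / 2.0) + np.abs(B).flatten() * u_max
    lo, hi = center - radius, center + radius
# [lo, hi] now over-approximates every state reachable in 20 steps.
```

Such interval bounds are a deliberately simple over-approximation; zonotope or support-function methods would give tighter sets, but the propagation pattern is the same.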

Keywords: formal system verification, reachability, real time controller, hybrid system

Procedia PDF Downloads 241
2860 Fabrication of Highly Stable Low-Density Self-Assembled Monolayers by Thiol-Yne Click Reaction

Authors: Leila Safazadeh, Brad Berron

Abstract:

Self-assembled monolayers have a tremendous impact in interfacial science due to the unique opportunity they offer to tailor surface properties. Low-density self-assembled monolayers are an emerging class of monolayers in which the environment-interfacing portion of the adsorbate has a greater level of conformational freedom than in traditional monolayer chemistries. This greater range of motion and increased spacing between surface-bound molecules offer new opportunities for tailoring adsorption phenomena in sensing systems. In particular, we expect low-density surfaces to offer a unique opportunity to intercalate surface-bound ligands into the secondary structure of proteins and other macromolecules. Additionally, as many conventional sensing surfaces are built upon gold (SPR or QCM), these surfaces must be compatible with gold substrates. Here, we present the first stable method of generating low-density self-assembled monolayer surfaces on gold for the analysis of their interactions with protein targets. Our approach is based on the 2:1 addition of thiol-yne chemistry to develop a new class of Y-shaped adsorbates on gold, in which the environment-interfacing group is spaced laterally from neighboring chemical groups. The technique involves the initial deposition of a crystalline monolayer of 1,10-decanedithiol on the gold substrate, followed by the grafting of a loosely packed monolayer through a photoinitiated thiol-yne reaction. The orthogonality of the thiol-yne chemistry (commonly referred to as a click chemistry) allows the preparation of low-density monolayers with a variety of functional groups; to date, carboxyl-, amine-, alcohol-, and alkyl-terminated monolayers have been prepared using this core technology. Results from surface characterization techniques such as FTIR, contact angle goniometry, and electrochemical impedance spectroscopy confirm the proposed low chain-chain interactions of the environment-interfacing groups.
Reductive desorption measurements suggest a higher stability for the click-LDMs compared to traditional SAMs, along with an equivalent packing density at the substrate interface, which confirms the proposed stability of the monolayer-gold interface. In addition, contact angle measurements change in the presence of an applied potential, supporting our description of a surface structure that allows the alkyl chains to freely orient themselves in response to different environments. We are studying the differences in protein adsorption phenomena between well-packed and our loosely packed surfaces, and we expect this data will be ready to present at the GRC meeting. This work aims to contribute to biotechnology in the following manner: molecularly imprinted polymers are a promising recognition mode with several advantages over natural antibodies in the recognition of small molecules. However, because of their bulk polymer structure, they are poorly suited for the rapid diffusion desired for the recognition of proteins and other macromolecules. Molecularly imprinted monolayers are an emerging class of materials in which only the surface is imprinted, so there is no bulk material to impede mass transfer. Further, the short distance between the binding site and the signal transduction material improves many modes of detection. My dissertation project is to develop a new chemistry for protein-imprinted self-assembled monolayers on gold, for incorporation into SPR sensors. Our unique contribution is the spatial imprinting of not only physical cues (seen in current imprinted monolayer techniques) but also complementary chemical cues. This is accomplished through photo-click grafting of preassembled ligands around a protein template. This conference is important for my development as a graduate student, broadening my appreciation of sensor development beyond surface chemistry.

Keywords: low-density self-assembled monolayers, thiol-yne click reaction, molecular imprinting

Procedia PDF Downloads 226
2859 Detecting Nitrogen Deficiency and Potato Leafhopper (Hemiptera, Cicadellidae) Infestation in Green Bean Using Multispectral Imagery from Unmanned Aerial Vehicle

Authors: Bivek Bhusal, Ana Legrand

Abstract:

Detection of crop stress is one of the major applications of remote sensing in agriculture. Multiple studies have demonstrated the capability of remote sensing using Unmanned Aerial Vehicle (UAV)-based multispectral imagery for the detection of plant stress, but none so far on nitrogen (N) stress and potato leafhopper (PLH) feeding stress in green beans. In view of its wide host range, geographical distribution, and damage potential, the potato leafhopper, Empoasca fabae (Harris), has been emerging as a key pest in several countries. Monitoring methods for PLH damage, as well as the laboratory techniques for detecting nitrogen deficiency, are time-consuming and not always easily affordable. A study was initiated to demonstrate whether a multispectral sensor attached to a drone can detect PLH stress and N deficiency in beans. Small-plot trials were conducted in the summer of 2023, where cages were used to manipulate PLH infestation in green beans (Provider cultivar) at their first-trifoliate stage. Half of the bean plots were infested with PLH, and the others were kept insect-free. Half of these plots were grown with the recommended amount of N, and the others were grown without N. Canopy reflectance was captured using a five-band multispectral sensor. Our findings indicate that drone imagery can detect stress due to a lack of N and PLH damage in beans.
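The abstract does not state which vegetation index was derived from the canopy reflectance; a common choice with multispectral sensors of this kind is NDVI, computed from the red and near-infrared bands. The sketch below is illustrative only, and the reflectance values in the usage comments are invented.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so stressed canopies (e.g. N-deficient or PLH-damaged) tend to show
    lower NDVI. Division by zero is mapped to NaN.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    total = nir + red
    return (nir - red) / np.where(total == 0.0, np.nan, total)

# Illustrative per-pixel reflectances (hypothetical values):
healthy = ndvi(0.5, 0.1)   # high NIR, low red -> high NDVI
stressed = ndvi(0.4, 0.2)  # lower contrast -> lower NDVI
```

In practice the same function would be applied to whole orthomosaic band rasters, and plot-level means compared across the caged treatments.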

Keywords: potato leafhopper, nitrogen, remote sensing, spectral reflectance, beans

Procedia PDF Downloads 60
2858 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in the electroencephalogram (EEG), has been analyzed and used for diagnosis with various techniques. Among them, measures of complexity, nonlinearity, disorder, and unpredictability play a vital role, owing to the nonlinear interconnections between functional and anatomical subsystems that emerge in the brain, both in the healthy state and during disease. Alcohol abuse has serious social and economic consequences, including impairments of memory, decision-making, and concentration. Alcoholism not only damages the brain's white and gray matter but is also associated with emotional, behavioral, and cognitive impairments. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of long-range temporal correlations in EEG time series of alcoholic and control subjects acquired from the University of California machine learning repository, and the results are compared with Multiscale Sample Entropy (MSE). Using MPE, a coarse-grained series is first generated, and the permutation entropy (PE) is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. The results computed for each electrode using MPE give more significant values than MSE, as well as correspondingly larger mean rank differences. Likewise, the ROC and the area under the ROC curve also give better separation for each electrode using MPE in comparison to MSE.
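The two-step MPE procedure described above (coarse-graining, then permutation entropy at each scale) can be sketched as follows, assuming the standard Bandt-Pompe formulation with entropy normalized by log(m!); the exact embedding order, delay, and scale range used by the authors are not stated in the abstract.

```python
import math
from itertools import permutations

import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D series."""
    counts = {p: 0 for p in permutations(range(order))}
    x = np.asarray(x, dtype=float)
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1  # ordinal pattern of the window
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(order)))

def multiscale_permutation_entropy(x, max_scale=5, order=3, delay=1):
    """PE of each coarse-grained series for scales 1..max_scale."""
    return [permutation_entropy(coarse_grain(x, s), order, delay)
            for s in range(1, max_scale + 1)]
```

A monotone series yields PE of 0 (a single ordinal pattern), while white noise yields a value near 1, which is the intuition behind using MPE as a complexity measure for EEG channels.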

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), mann whitney test (MMT), receiver operator curve (ROC), complexity measure

Procedia PDF Downloads 495
2857 Software Vulnerability Markets: Discoverers and Buyers

Authors: Abdullah M. Algarni, Yashwant K. Malaiya

Abstract:

Some key aspects of vulnerability (discovery, dissemination, and disclosure) have received attention recently. However, the role of the interaction between vulnerability discoverers and vulnerability acquirers has not yet been adequately addressed. Our study suggests that a major percentage of discoverers, a majority in some cases, are unaffiliated with the software developers and thus are free to disseminate the vulnerabilities they discover in any way they like. As a result, multiple vulnerability markets have emerged. In some of these markets the exchange is regulated, but in others there is little or no regulation. In the recent vulnerability discovery literature, the discoverers have remained anonymous individuals. Although there has been an attempt to model the level of their efforts, information regarding their identities, modes of operation, and what they do with the discovered vulnerabilities has not been explored. Reports of the buying and selling of vulnerabilities are now appearing in the press; however, the existence of such markets requires validation, and the nature of these markets needs to be analysed. To address this need, we have attempted to collect detailed information. We have identified the most prolific vulnerability discoverers of the past decade and examined their motivations and methods. A large percentage of these discoverers are located in Eastern and Western Europe and in the Far East. We have contacted several of them to collect first-hand information regarding their techniques, motivations, and involvement in the vulnerability markets. We examine why many of the discoverers appear to retire after a highly successful vulnerability-finding career. The paper identifies the actual vulnerability markets, rather than the hypothetical ideal markets that are often examined. The emergence of worldwide government agencies as vulnerability buyers has significant implications. We discuss potential factors that can impact the risk to society and the need for detailed exploration.

Keywords: risk management, software security, vulnerability discoverers, vulnerability markets

Procedia PDF Downloads 253
2856 Assessing Suitability of Earthbag Technology for Temporary Housing: Sustainability Challenge

Authors: S. M. Amin Hosseini, Ana Blanco, Albert De La Fuente, Sergio Cavalaro

Abstract:

In emergency situations, it is fundamental to provide safe shelter to the affected population. However, the lack of resources and the short time available often represent a barrier that is difficult to overcome. A sustainable, rapid, and low-cost construction technique is earthbag construction. This technique has spread as an alternative for the construction of emergency shelters, social housing, and even ecovillages. Earthbag construction consists of introducing soil into degradable bags that are stacked to form adobe structures. The present study aims to assess the characteristics of the earthbag construction technique against sustainability requirements and the features of other methods used for temporary housing. After defining the sustainability criteria and the necessities of emergency situations, the study compares earthbag construction with other types of prefabricated temporary housing. Finally, the most suitable conditions for applying this technique are identified, based on particular local properties and second-life scenarios of superadobe temporary housing. The results of the study contribute to promoting the earthbag and superadobe techniques as sustainable alternatives for temporary housing. However, the sustainability index of this technology depends strongly on the affected local conditions and characteristics. Consequently, in order to achieve a high sustainability index, emergency managers need to decide on this technology based on the results highlighted in this study, paying attention to the importance of specific local conditions and the subsequent functions of the temporary housing.

Keywords: temporary housing, temporary shelter, earthbag, superadobe, sustainability, emergency

Procedia PDF Downloads 229
2855 Comparison of Physicochemical Properties of DNA-Ionic Liquids Complexes

Authors: Ewelina Nowak, Anna Wisla-Swider, Gohar Khachatryan, Krzysztof Danel

Abstract:

Complexes of ionic liquids with different heterocyclic rings were synthesized by ion exchange reactions with pure salmon DNA. Ionic liquids (ILs) such as 1-hexyl-3-methylimidazolium chloride, 1-butyl-4-methylpyridinium chloride, and 1-ethyl-1-methylpyrrolidinium bromide were used. Incorporation of the ILs into the helical structure was confirmed by IR spectroscopy. Patterns of the UV-Vis, photoluminescence, IR, and CD spectra indicated inclusion of the small molecules into the DNA structure. Molecular weights and radii of gyration of the ILs-DNA complex chains were established by the HPSEC-MALLS-RI method. Modification of DNA with 1-ethyl-1-methylpyrrolidinium bromide gave a more uniform material and eliminated high-molecular-weight chains, whereas incorporation of both 1-hexyl-3-methylimidazolium chloride and 1-butyl-4-methylpyridinium chloride into the DNA double helix yielded higher molecular weights. Scanning electron microscopy images indicate the formation of nanofibre structures in all DNA complexes. Fluorescence depends strongly on the environment in which the chromophores are inserted and, simultaneously, on the molecular interactions with the biopolymer matrix. The most intense emission was observed for the DNA-imidazole ring complex. The decrease in the intensity of the UV-Vis absorption peak is a consequence of a reduction in the spatial order of the polynucleotide strands and indicates a different π–π stacking structure. The changes in optical properties, confirmed by spectroscopic methods, make DNA-ILs complexes candidates for biosensor applications.

Keywords: biopolymers, biosensors, cationic surfactant, DNA, DNA-gels

Procedia PDF Downloads 183
2854 Effects of Process Parameters on the Yield of Oil from Coconut Fruit

Authors: Ndidi F. Amulu, Godian O. Mbah, Maxwel I. Onyiah, Callistus N. Ude

Abstract:

The properties of coconut (Cocos nucifera) and its oil were evaluated in this work using standard analytical techniques. The analyses carried out include the proximate composition of the fruit, extraction of oil from the fruit using different process parameters, and physicochemical analysis of the extracted oil. The results showed the percentage (%) moisture, crude lipid, crude protein, ash, and carbohydrate contents of the coconut as 7.59, 55.15, 5.65, 7.35, and 19.51, respectively. The oil from the coconut fruit was an odourless, yellowish liquid at room temperature (30 °C). The treatment combinations used (leaching time, leaching temperature, and solute:solvent ratio) showed significant differences (P < 0.05) in the yield of oil from coconut flour. The oil yield ranged between 36.25% and 49.83%. Lipid indices of the coconut oil indicated the acid value (AV) as 10.05 mg NaOH/g of oil, free fatty acid (FFA) as 5.03%, saponification value (SV) as 183.26 mg KOH/g of oil, iodine value (IV) as 81.00 I2/g of oil, peroxide value (PV) as 5.00 ml/g of oil, and viscosity (V) as 0.002. The statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and also to generate various plots such as single-effect plots, interaction-effect plots, and contour plots. The response, the yield of oil from the coconut flour, was used to develop a mathematical model that correlates the yield with the process variables studied. The conditions that gave the highest yield of coconut oil were a leaching time of 2 h, a leaching temperature of 50 °C, and a solute/solvent ratio of 0.05 g/ml.
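The form of the regression model fitted in Minitab is not given in the abstract; a common choice for this kind of three-factor study is a full quadratic response surface fitted by least squares. The sketch below illustrates that approach with entirely hypothetical design points, coefficients, and yields, invented only to make the example runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DoE over leaching time (h), temperature (C), solute:solvent (g/ml).
X = rng.uniform([1.0, 30.0, 0.05], [3.0, 70.0, 0.20], size=(30, 3))

def design_matrix(X):
    """Full quadratic model: intercept, linear, interaction, and squared terms."""
    t, T, r = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), t, T, r,
                            t * T, t * r, T * r, t ** 2, T ** 2, r ** 2])

# Illustrative yields (%) generated from an assumed quadratic response plus noise.
true_beta = np.array([10.0, 8.0, 0.9, 40.0, 0.05, -5.0, -0.3, -2.0, -0.008, -60.0])
y = design_matrix(X) @ true_beta + rng.normal(0.0, 0.5, len(X))

# Least-squares fit of the response surface and its coefficient of determination.
beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
resid = y - design_matrix(X) @ beta
r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
```

Single-effect, interaction, and contour plots then simply evaluate the fitted surface while holding the remaining factors at their centre values.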

Keywords: coconut, oil-extraction, optimization, physicochemical, proximate

Procedia PDF Downloads 354
2853 A Case Study of Clinicians’ Perceptions of Enterprise Content Management at Tygerberg Hospital

Authors: Temitope O. Tokosi

Abstract:

Healthcare is a human right. The sensitivity of health issues has necessitated the introduction of Enterprise Content Management (ECM) at district hospitals in the Western Cape Province of South Africa. The objective is to understand clinicians' perception of ECM at their workplace. The study is a descriptive case study within a constructivist paradigm. It employed a phenomenological data analysis method using a pattern-matching, deduction-based analytical procedure. Purposive and snowball sampling techniques were applied in selecting participants. Clinicians expressed concerns and frustrations with ECM, such as: non-integration with other hospital systems; inadequate access points to ECM; incorrect labelling of notes and bar-coding, causing more time to be wasted finding information; system features and/or functions (such as search and edit) that are not available; a lack of constant interaction and discussion between hospital management and clinicians; and unacceptably long information turnaround times. Resolving these problems would require a positive working relationship between hospital management and clinicians. In addition, prioritising the problems faced by clinicians according to relevance can ensure that problem-solving meets clinicians' expectations and the hospital's objectives. Clinicians' perceptions should draw the attention of hospital management with regard to technology use. The study's results can be generalised across clinician groupings exposed to ECM at various district hospitals because of professional and hospital homogeneity.

Keywords: clinician, electronic content management, hospital, perception, technology

Procedia PDF Downloads 233
2852 Optimization of Manufacturing Process Parameters: An Empirical Study from Taiwan's Tech Companies

Authors: Chao-Ton Su, Li-Fei Chen

Abstract:

Parameter design is crucial to improving the uniformity of a product or process. In the product design stage, parameter design aims to determine the optimal settings for the parameters of each element in the system, thereby minimizing the functional deviations of the product. In the process design stage, parameter design aims to determine the operating settings of the manufacturing processes so that non-uniformity in manufacturing can be minimized. Parameter design, which seeks to minimize the influence of noise on the manufacturing system, plays an important role in high-tech companies. Taiwan has many well-known high-tech companies that play key roles in the global economy. Quality remains the most important factor enabling these companies to sustain their competitive advantage. In Taiwan, however, many high-tech companies face various quality problems. A common challenge relates to root causes and defect patterns: in the R&D stage, root causes are often unknown and defect patterns are difficult to classify. Additionally, data collection is not easy, and even when high-volume data can be collected, data interpretation is difficult. To overcome these challenges, high-tech companies in Taiwan use more advanced quality improvement tools. In addition to traditional statistical methods and quality tools, the new trend is the application of powerful tools such as neural networks, fuzzy theory, data mining, industrial engineering, operations research, and innovation skills. In this study, several examples of optimizing the parameter settings for manufacturing processes in Taiwan's tech companies are presented to illustrate the effectiveness of the proposed approach. Finally, a discussion of using traditional experimental design versus the proposed approach for process optimization is given.

Keywords: quality engineering, parameter design, neural network, genetic algorithm, experimental design

Procedia PDF Downloads 145
2851 Comparative Study on Inhibiting Factors of Cost and Time Control in Nigerian Construction Practice

Authors: S. Abdulkadir, I. Y. Moh’d, S. U. Kunya, U. Nuruddeen

Abstract:

The basis of any contract formation between the client and the contractor is the budgeted cost and the estimated duration of the project. These variables are of paramount importance to a project's sponsors and in assessing the success or viability of construction projects. Despite the availability of various techniques for cost and time control, many projects fail to achieve their initial estimated cost and time. This paper evaluates the inhibiting factors of cost and time control in Nigerian construction practice and compares the results with United Kingdom practice as identified by an earlier researcher. The population of the study comprises construction professionals within Bauchi and Gombe states, Nigeria; judgmental sampling was employed in determining the number of respondents. Descriptive statistics were used to analyze the data in SPSS. Design change, project fraud and corruption, and financing and payment for completed work were found to be common among the top five inhibiting factors of cost and time control in the study area. Furthermore, the results showed broad agreement, with slight contrasts, with United Kingdom practice. The study recommends adopting the mitigation measures developed in the UK and then assessing their effectiveness, as well as developing mitigation measures for the other top factors not covered by those developed in United Kingdom practice. It also recommends a wider comparative assessment of the inhibiting factors of cost and time control revealed by this study, covering almost all parts of Nigeria.

Keywords: comparison, cost, inhibiting factor, United Kingdom, time

Procedia PDF Downloads 440
2850 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China but runs a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent deficit, but the empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data are utilized for as long as the time series data exist. The export and import functions are estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time series covers 1975 to 2022, a sample of 48 years, the longest period ever utilized in any previous study. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, IMF Direction of Trade Statistics, and several others. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful for reducing such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
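The estimation details are not given in the abstract; a common textbook starting point is a log-log export/import demand specification whose estimated price elasticities are summed to check the ML condition (|η_x| + |η_m| > 1). The sketch below uses entirely synthetic data and plain OLS purely for illustration; the "true" elasticities of 0.8 and -0.6, and the income series, are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 48  # annual observations, mirroring 1975-2022

# Synthetic illustrative data: log real exchange rate and log income series.
ln_rer = 0.5 * np.sin(np.arange(n) / 4.0) + rng.normal(0.0, 0.02, n)
ln_y_us = np.linspace(0.0, 1.5, n)   # log U.S. real income (trend, assumed)
ln_y_cn = np.linspace(0.0, 2.5, n)   # log Chinese real income (trend, assumed)

# Assumed "true" elasticities, used only to generate the illustration.
ln_exports = 0.8 * ln_rer + 1.2 * ln_y_cn + rng.normal(0.0, 0.05, n)
ln_imports = -0.6 * ln_rer + 1.5 * ln_y_us + rng.normal(0.0, 0.05, n)

def ols(y, regressors):
    """OLS with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(regressors))
    return np.linalg.lstsq(X, y, rcond=None)[0]

eta_x = ols(ln_exports, [ln_rer, ln_y_cn])[1]  # export price elasticity
eta_m = ols(ln_imports, [ln_rer, ln_y_us])[1]  # import price elasticity
ml_holds = eta_x + abs(eta_m) > 1.0            # Marshall-Lerner condition
```

In actual work, cointegration and dynamic specifications (and the diagnostic tests the abstract mentions) would replace this static OLS, and the J-curve would be examined through the short-run dynamics.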

Keywords: exports, imports, marshall-lerner condition, j-curve hypothesis, united states, china

Procedia PDF Downloads 64
2849 Size Optimization of Microfluidic Polymerase Chain Reaction Devices Using COMSOL

Authors: Foteini Zagklavara, Peter Jimack, Nikil Kapur, Ozz Querin, Harvey Thompson

Abstract:

The invention and development of Polymerase Chain Reaction (PCR) technology have revolutionised molecular biology and molecular diagnostics. There is an urgent need to optimise the performance of these devices while reducing the total construction and operation costs. The present study proposes a CFD-enabled optimisation methodology for continuous-flow (CF) PCR devices with a serpentine-channel structure, which enables the trade-offs between the competing objectives of DNA amplification efficiency and pressure drop to be explored. This is achieved using a surrogate-enabled optimisation approach that accounts for the geometrical features of a CF μPCR device by performing a series of simulations at a relatively small number of Design of Experiments (DoE) points with COMSOL Multiphysics 5.4. The values of the objectives are extracted from the CFD solutions, and response surfaces are created using polyharmonic splines and neural networks. After creating the respective response surfaces, a genetic algorithm and a multi-level coordinate search optimisation function are used to locate the optimum design parameters. Both optimisation methods produced similar results for both the neural network and the polyharmonic spline response surfaces. The results indicate the possibility of improving the DNA efficiency by ∼2% in one PCR cycle when doubling the width of the microchannel to 400 μm while maintaining the height at the value of the original design (50 μm). Moreover, the increase in the width of the serpentine microchannel is combined with a decrease in its total length in order to obtain the same residence times in all the simulations, resulting in a smaller total substrate volume (a 32.94% decrease). A multi-objective optimisation is also performed using a Pareto front plot. Such knowledge will enable designers to maximise the amount of DNA amplified or to minimise the time taken for thermal cycling in such devices.

Keywords: PCR, optimisation, microfluidics, COMSOL

Procedia PDF Downloads 161
2848 Supersonic Flow around a Dihedral Airfoil: Modeling and Experimentation Investigation

Authors: A. Naamane, M. Hasnaoui

Abstract:

Numerical modeling of fluid flows, whether compressible or incompressible, laminar or turbulent, makes a considerable contribution to the scientific and industrial fields. However, the development of an approximate model of a supersonic flow requires the introduction of specific, more precise techniques and methods. To this end, the objective of this paper is to model a supersonic flow of an inviscid fluid around a dihedral airfoil. Based on thin airfoil theory and the non-dimensional stationary Steichen equation for a two-dimensional supersonic flow in isentropic evolution, we obtained a solution for the downstream velocity potential of the oblique shock at second order in the relative thickness, which serves as the perturbation parameter. This result was obtained using asymptotic analysis and the method of characteristics. To validate our model, the results are discussed in comparison with theoretical and experimental results. The comparison showed that the results of our model are quantitatively acceptable relative to the existing theoretical results. Finally, an experimental study was conducted using the AF300 supersonic wind tunnel, in which we considered the incident upstream Mach number over a symmetrical dihedral airfoil wing. The comparison of the downstream Mach numbers from our model with the existing theoretical data (relative margin between 0.07% and 4%) and with the experimental results (concordance for deflection angles between 1° and 11°) supports the validation of our model.
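For the wedge flow considered above, the reference downstream Mach number behind the weak oblique shock can be computed from the classical theta-beta-Mach relation; the sketch below implements the exact inviscid relations (not the authors' asymptotic model) for a given upstream Mach number and deflection angle, assuming gamma = 1.4.

```python
import math

def theta_from_beta(M1, beta, gamma=1.4):
    """Flow deflection angle theta for wave angle beta (theta-beta-M relation)."""
    return math.atan(2.0 / math.tan(beta)
                     * (M1 ** 2 * math.sin(beta) ** 2 - 1.0)
                     / (M1 ** 2 * (gamma + math.cos(2.0 * beta)) + 2.0))

def weak_shock_beta(M1, theta, gamma=1.4):
    """Weak-solution wave angle beta for deflection theta (radians)."""
    lo = math.asin(1.0 / M1) + 1e-9          # Mach angle: theta -> 0
    hi = math.pi / 2.0 - 1e-9
    a, b = lo, hi                            # ternary search for beta at theta_max
    for _ in range(200):
        m1, m2 = a + (b - a) / 3.0, b - (b - a) / 3.0
        if theta_from_beta(M1, m1, gamma) < theta_from_beta(M1, m2, gamma):
            a = m1
        else:
            b = m2
    a, b = lo, 0.5 * (a + b)                 # bisect on the rising (weak) branch
    for _ in range(200):
        mid = 0.5 * (a + b)
        if theta_from_beta(M1, mid, gamma) < theta:
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)

def downstream_mach(M1, theta, gamma=1.4):
    """Downstream Mach number M2 behind the weak oblique shock."""
    beta = weak_shock_beta(M1, theta, gamma)
    Mn1 = M1 * math.sin(beta)                # normal component upstream
    Mn2 = math.sqrt((1.0 + 0.5 * (gamma - 1.0) * Mn1 ** 2)
                    / (gamma * Mn1 ** 2 - 0.5 * (gamma - 1.0)))
    return Mn2 / math.sin(beta - theta)
```

For example, an upstream Mach number of 2 with a 10° half-angle wedge gives a wave angle near 39° and a downstream Mach number around 1.64, the kind of reference value the model comparison above relies on.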

Keywords: asymptotic modelling, dihedral airfoil, supersonic flow, supersonic wind tunnel

Procedia PDF Downloads 134