Search results for: field data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30395

28685 Optimization of Sol-Gel Copper Oxide Layers for Field-Effect Transistors

Authors: Tomas Vincze, Michal Micjan, Milan Pavuk, Martin Weis

Abstract:

In recent years, alternative materials have been gaining attention as replacements for polycrystalline and amorphous silicon, which are the standard for low-requirement devices where silicon's performance is unnecessary and its cost too high. For that reason, metal oxides are envisioned as the new materials for low-requirement applications such as sensors, solar cells, energy storage devices, or field-effect transistors. The most common layer-growth method for these materials is sputtering; however, this is a high-cost fabrication method, and a more industry-suitable alternative is the sol-gel method. In this group of materials, many oxides exhibit semiconductor-like behavior with mobility sufficiently high for transistor applications, and the sol-gel method is a cost-effective deposition technique for semiconductor-based devices. Copper oxides, as p-type semiconductors with free charge mobility up to 1 cm2/Vs, are suitable replacements for poly-Si or a-Si:H devices. However, to reach the potential of silicon devices, fine-tuning of the material properties is needed. Here we focus on optimizing the electrical parameters of copper oxide-based field-effect transistors by modifying the precursor solvent (usually 2-methoxyethanol). To achieve good solubility and high-quality films, a better solvent is required; since almost no solvent has both a high dielectric constant and a high boiling point, an alternative approach based on blended solvents was proposed. By mixing isopropyl alcohol (IPA) and 2-methoxyethanol (2ME), the precursor reached better solubility. The quality of the layers fabricated from the mixed solutions was evaluated in terms of surface morphology and electrical properties. The IPA:2ME solvent mixture reached optimum results at a weight ratio of 1:3: the cupric oxide layers from the optimal mixture had the highest crystallinity and the highest effective charge mobility.
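
As a rough illustration of how the effective charge mobility reported above is typically extracted, the sketch below fits the saturation-regime transfer characteristic of a p-type FET; the device geometry, gate capacitance, and transfer-curve values are illustrative assumptions, not measurements from this work.

```python
import numpy as np

# Hypothetical transfer-curve data for a p-type CuO FET (saturation regime);
# device geometry and gate capacitance are illustrative assumptions.
W, L = 1e-3, 50e-6          # channel width and length (m)
C_i = 1.2e-4                # gate dielectric capacitance per area (F/m^2), assumed
V_GS = np.linspace(0, -20, 41)                              # gate sweep (V), p-type device
I_D = 1e-9 + 4e-8 * np.clip(-(V_GS + 2), 0, None) ** 2      # synthetic drain current (A)

# Saturation-regime extraction: sqrt(|I_D|) is linear in V_GS above threshold,
# and the slope gives the effective mobility via I_D = (W/2L) * mu * C_i * (V_GS - V_T)^2.
mask = V_GS < -5.0                                          # restrict the fit to the on-region
slope, intercept = np.polyfit(V_GS[mask], np.sqrt(np.abs(I_D[mask])), 1)
mu_eff = 2 * L / (W * C_i) * slope ** 2                     # m^2/(V s)
V_T = -intercept / slope                                    # threshold voltage (V)

print(f"effective mobility ~ {mu_eff * 1e4:.3f} cm^2/Vs, V_T ~ {V_T:.2f} V")
```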

Keywords: copper oxide, field-effect transistor, semiconductor, sol-gel method

Procedia PDF Downloads 124
28684 Run-Time Customisation of Soft-Core CPUs on Field Programmable Gate Array

Authors: Rehab Abdullah Shendi

Abstract:

The use of customised soft-core processors, in which instructions can be integrated into a system as application hardware, is increasing in the Field Programmable Gate Array (FPGA) field. Specifically, the partial run-time reconfiguration of FPGAs in specialised processors for a particular domain can be very beneficial. In this report, the design and implementation of a customised soft-core MIPS processor using an FPGA and partial reconfiguration (PR) technology are addressed to achieve efficient resource use. This can be achieved using a PR design flow that helps the design fit into a smaller device. Moreover, the impact of static power consumption can be reduced thanks to run-time reconfiguration. This is done by configurable custom instructions implemented in hardware as an extension of the MIPS CPU. The aim of this project is to investigate the PR of FPGAs for run-time adaptation of the instruction set of a soft-core CPU, including the integration of custom instructions and the exploration of the potential of the MultiBoot feature available in Xilinx FPGAs to carry out the PR process. The system is evaluated and tested on a Nexys 3 development board featuring a Xilinx Spartan-6 FPGA. The system can load reconfigurable custom instructions dynamically into user programs with the help of a trap handler that is invoked when a custom instruction is called by the MIPS CPU. The results of this experiment demonstrate that custom instructions in hardware can speed up a certain function, and many instructions can be saved compared to a software implementation of the same function. Implementing custom instructions in hardware is perfectly possible and worth exploring.
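
The trap-handler dispatch described above can be illustrated with a small conceptual sketch. The Python snippet below is not the MIPS/FPGA implementation; it only mimics the idea that an opcode missing from the base instruction set traps to a handler, which forwards it to whichever custom instruction is currently loaded (standing in for a partial bitstream).

```python
# Conceptual sketch (not the actual MIPS/FPGA design): a trap handler dispatches
# opcodes that the base instruction set does not implement to whichever "custom
# instruction" is currently loaded, mirroring how a partially reconfigured region
# can supply the operation in hardware or fall back to software.

BASE_ISA = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
}

# Registry standing in for the reconfigurable region; entries can be swapped at run time.
custom_instructions = {}

def load_custom_instruction(opcode, impl):
    """Emulates loading a new partial bitstream that implements `opcode`."""
    custom_instructions[opcode] = impl

def trap_handler(opcode, a, b):
    """Called when the base ISA cannot decode `opcode`."""
    if opcode in custom_instructions:
        return custom_instructions[opcode](a, b)
    raise RuntimeError(f"illegal instruction: {opcode}")

def execute(opcode, a, b):
    if opcode in BASE_ISA:
        return BASE_ISA[opcode](a, b)
    return trap_handler(opcode, a, b)     # unknown opcode -> trap

load_custom_instruction("mac", lambda a, b: a * b + b)  # hypothetical custom op
print(execute("add", 2, 3))   # 5, handled by the base ISA
print(execute("mac", 2, 3))   # 9, handled by the dynamically loaded instruction
```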

Keywords: customisation, FPGA, MIPS, partial reconfiguration, PR

Procedia PDF Downloads 259
28683 MWCNT/CuFe10Al2O19/Polyaniline Nanocomposite for Microwave Absorbing Applications

Authors: Pallab Bhattacharya, C. K. Das

Abstract:

The development of microwave absorbing materials is a growing field of research in both the commercial and defense sectors, and it also enriches the field of stealth technology. The present work focuses on the preparation of a nanocomposite based on acid-modified MWCNT, hexagonal M-type magnetic hexaferrite (CuFe10Al2O19), and polyaniline. CuFe10Al2O19 was prepared by a facile chemical co-precipitation method. An in-situ approach was employed for the coating of polyaniline on the MWCNT/CuFe10Al2O19 nanocomposite. The final fabrication of this nanocomposite for microwave measurements was done in a thermoplastic polyurethane matrix with 10% filler content. The nanocomposites showed a maximum reflection loss of -60.2 dB (in the X-band) at a thickness of 2.5 mm with a broad absorption range, in contrast to pristine MWCNT and CuFe10Al2O19. The addition of PANI improves the microwave absorption of the nanocomposites, and the thermal stability of the prepared nanocomposites is also very high.
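
For readers who want to relate the reported -60.2 dB figure to material parameters, the sketch below evaluates the standard single-layer, metal-backed transmission-line model of reflection loss; the complex permittivity and permeability values are illustrative assumptions, not the measured properties of this composite.

```python
import numpy as np

def reflection_loss_db(eps_r, mu_r, freq_hz, thickness_m):
    """Single-layer absorber backed by a metal plate (standard transmission-line model)."""
    c = 3e8
    z0 = 1.0  # normalised free-space impedance
    z_in = z0 * np.sqrt(mu_r / eps_r) * np.tanh(
        1j * 2 * np.pi * freq_hz * thickness_m / c * np.sqrt(mu_r * eps_r)
    )
    return 20 * np.log10(np.abs((z_in - z0) / (z_in + z0)))

# Illustrative (not measured) complex permittivity/permeability for the composite.
eps_r = 9.0 - 2.5j
mu_r = 1.3 - 0.6j
freqs = np.linspace(8.2e9, 12.4e9, 200)        # X-band
rl = reflection_loss_db(eps_r, mu_r, freqs, 2.5e-3)
print(f"minimum RL in X-band: {rl.min():.1f} dB at {freqs[rl.argmin()]/1e9:.2f} GHz")
```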

Keywords: magnetic materials, microwave absorption, MWCNT, nanocomposites

Procedia PDF Downloads 290
28682 WiFi Data Offloading: Bundling Method in a Canvas Business Model

Authors: Majid Mokhtarnia, Alireza Amini

Abstract:

Mobile operators regard the increase in data traffic as a critical issue. As a result, a vital responsibility of the operators is to deal with this trend in a way that creates added value. This paper addresses a bundling method within a Canvas business model for a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are sold to subscribers, some of which include a complimentary volume of WiFi-offloaded data. The paper analyses this method in terms of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision maker to make the best choice.

Keywords: bundling, canvas business model, telecommunication, WiFi data offloading

Procedia PDF Downloads 193
28681 Turbulent Channel Flow Synthesis using Generative Adversarial Networks

Authors: John M. Lyne, K. Andrea Scott

Abstract:

In fluid dynamics, direct numerical simulations (DNS) of turbulent flows require large numbers of grid nodes to appropriately resolve all scales of energy transfer. Due to the size of these databases, sharing such datasets amongst the academic community is a challenge. Recent work has investigated the use of super-resolution to enable database sharing, where a low-resolution flow field is super-resolved to high resolution using a neural network. Recently, Generative Adversarial Networks (GAN) have grown in popularity with impressive results in the generation of faces, landscapes, and more. This work investigates the generation of unique high-resolution channel flow velocity fields from a low-dimensional latent space using a GAN. The training objective of the GAN is to generate samples whose distribution is ideally indistinguishable from the distribution of the training data. In this study, the network is trained using samples drawn from a statistically stationary channel flow at a Reynolds number of 560. Results show that the turbulent statistics and energy spectra of the generated flow fields are in reasonable agreement with those of the DNS data, demonstrating that GANs can produce the intricate multi-scale phenomena of turbulence.
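
A minimal sketch of the adversarial training objective described above is given below; the network sizes, patch resolution, and random stand-in data are illustrative assumptions and do not reproduce the authors' architecture or the channel-flow database.

```python
import torch
import torch.nn as nn

# Minimal GAN sketch for 32x32 single-component velocity patches; the DNS loader,
# network sizes, and training schedule are illustrative assumptions.
latent_dim = 64

G = nn.Sequential(
    nn.Linear(latent_dim, 128 * 8 * 8), nn.ReLU(),
    nn.Unflatten(1, (128, 8, 8)),
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),   # 8x8 -> 16x16
    nn.ConvTranspose2d(64, 1, 4, 2, 1),                # 16x16 -> 32x32
)
D = nn.Sequential(
    nn.Conv2d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2),      # 32x32 -> 16x16
    nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),    # 16x16 -> 8x8
    nn.Flatten(), nn.Linear(128 * 8 * 8, 1),           # real/fake logit
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    b = real_batch.size(0)
    fake = G(torch.randn(b, latent_dim))
    # Discriminator: push real samples towards label 1 and generated samples towards 0.
    loss_d = bce(D(real_batch), torch.ones(b, 1)) + bce(D(fake.detach()), torch.zeros(b, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator into labelling fakes as real.
    loss_g = bce(D(fake), torch.ones(b, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

# Stand-in for normalised DNS velocity patches drawn from the channel-flow database.
real = torch.randn(16, 1, 32, 32)
print(train_step(real))
```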

Keywords: computational fluid dynamics, channel flow, turbulence, generative adversarial network

Procedia PDF Downloads 200
28680 Essential Elements and Trace Metals on a Continuously Cultivated and Fertilised Field

Authors: Pholosho M. Kgopa, Phatu W. Mashela

Abstract:

Due to the high incidence of marginal land in Limpopo Province, South Africa, and the increasing demand for arable land, small-holder farmers tend to continuously cultivate the same fields while applying fertilisers to improve yields and meet local food security. These practices might affect the distribution of trace and essential elements. Therefore, the objective of this investigation was to assess the distribution of essential elements and trace metals in a continuously cultivated and fertilised field at the University of Limpopo Experimental Farm. Three fields of 3 ha each were identified as continuously cultivated (CC), moderately cultivated (MC) and virgin fields (VF). Each field was divided into 12 equal grids of 50 m × 50 m for sampling. A soil profile was opened in each grid, and soil samples were collected from the 0-20, 20-40, 40-60, 60-80 and 80-100 cm depths for analysis. Samples were analysed for soil texture, pH, electrical conductivity, organic matter content, selected essential elements (Ca, P and Mg), Na and trace elements (Cu, Fe, Ni, and Zn). Results suggested that most of the variables differed vertically, with high concentrations of the test elements except for magnesium. Soil pH in the 0-20 cm depth was higher (6.44) in CC than in VF (5.29), but lower than in MC (7.84). There were no distinctive vertical trends in the variables, except for Mg, Na, and K, which displayed a declining trend at the 40-60 cm depth when compared to the 0-20 cm depth. Concentrations of Fe, Cu, Zn, and Ni were generally low, which might be due to their inverse relationship with soil pH. Continuous cultivation and fertilisation altered soil chemical properties, which could explain the low productivity of such fields.

Keywords: over-cultivation, soil chemical properties, vertical distribution, spatial distribution

Procedia PDF Downloads 183
28679 Researching Apache Hama: A Pure BSP Computing Framework

Authors: Kamran Siddique, Yangwoo Kim, Zahid Akhtar

Abstract:

In recent years, technological advancements have led to a deluge of data from distinct domains, and the development of solutions based on parallel and distributed computing still has a long way to go. That is why the research and development of massive computing frameworks is continuously growing. At this particular stage, highlighting a potential research area along with key insights can be an asset for researchers in the field. Therefore, this paper explores one of the emerging distributed computing frameworks, Apache Hama. It is a Top Level Project under the Apache Software Foundation, based on the Bulk Synchronous Parallel (BSP) model. We present an unbiased and critical examination of Apache Hama and conclude with research directions in order to assist interested researchers.
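
The BSP model underlying Hama can be summarised by the superstep pattern of local computation, message exchange, and barrier synchronisation. The sketch below mimics this pattern in plain Python for illustration; it is not the Hama Java BSP API.

```python
# Conceptual sketch of the BSP execution model (local compute, message exchange,
# barrier synchronisation); this mimics the pattern Apache Hama implements but is
# not the Hama Java BSP API.

def bsp_run(num_peers, supersteps, compute):
    # Outboxes are delivered only after all peers finish the current superstep,
    # which plays the role of the barrier.
    inboxes = [[] for _ in range(num_peers)]
    state = [{"peer": p, "value": p} for p in range(num_peers)]
    for step in range(supersteps):
        outboxes = [[] for _ in range(num_peers)]
        for p in range(num_peers):
            compute(state[p], inboxes[p], outboxes, step)   # local computation
        inboxes = outboxes                                   # barrier + message delivery
    return state

def sum_compute(state, inbox, outboxes, step):
    # Each peer accumulates what it received, then broadcasts its value to the others.
    state["value"] += sum(inbox)
    for dest in range(len(outboxes)):
        if dest != state["peer"]:
            outboxes[dest].append(state["value"])

print(bsp_run(num_peers=4, supersteps=2, compute=sum_compute))
```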

Keywords: apache hama, bulk synchronous parallel, BSP, distributed computing

Procedia PDF Downloads 243
28678 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNN) have demonstrated high performance in image analysis, but oftentimes only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to multimodal data, i.e., both structured and unstructured information, can yield significant advantages in terms of time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material offers a promising way of applying CNNs to multimodal datasets, as they often occur in a medical context. Using suitable preprocessing techniques, the structured data are transformed into image representations in which the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with the existing images to incorporate both types of information. The resulting combined image is then analyzed using a CNN.
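
A minimal sketch of the fusion step described above is shown below: a tabular record is rendered as an additional image channel and stacked with the existing image before a single CNN processes both. The image size, feature count, and rendering scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def tabular_to_channel(features, size=32):
    """Tile a normalised feature vector into a size x size single-channel image."""
    flat = features.repeat((size * size) // features.numel() + 1)[: size * size]
    return flat.reshape(1, size, size)

class FusionCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # grey image + tabular channel
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, image, tabular):
        channel = torch.stack([tabular_to_channel(t, image.shape[-1]) for t in tabular])
        fused = torch.cat([image, channel], dim=1)          # fuse along the channel axis
        return self.head(self.conv(fused).flatten(1))

model = FusionCNN()
image = torch.rand(4, 1, 32, 32)          # stand-in greyscale images
tabular = torch.rand(4, 10)               # stand-in structured records
print(F.softmax(model(image, tabular), dim=1).shape)   # (4, 2)
```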

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 111
28677 An Assessment of Housing Affordability and Safety Measures in the Varied Residential Areas of Lagos: A Case Study of the Amuwo-Odofin Local Government Area in Lagos State

Authors: Jubril Olatunbosun Akinde

Abstract:

Unplanned population growth is mostly attributed to a lack of infrastructural facilities and poor economic conditions in rural dwellings and to the resulting rural-urban migration, which has caused severe housing deficiency in urban centres and put pressure on housing delivery in the cities. Affordable housing encompasses not only cost but also the environmental factors that make living acceptable and comfortable, including good access routes, ventilation, sanitation and access to other basic human needs such as water and safety. The research assessed housing affordability and safety measures in the varied residential areas of Lagos by examining the demographic and socioeconomic attributes of residents, the existing residential safety measures, and the residential quality in terms of safety; the researcher then examined the relationship between housing affordability and safety in the varied residential areas. The research adopted the Bartlett, Kotrlik and Higgins (2001) t-test method to determine the sample size, which specifies different populations at different levels of significance (α). The researcher used primary data sourced from a field survey in which the sample was selected by simple random sampling, giving every member of the population an equal chance of being selected; the sample size for the field survey was two hundred (200) respondents, and the data were subjected to the necessary tests. The research concludes that housing safety and security are the responsibility of every resident, and that landlords and landladies possess a better sense of security in their neighbourhood than renters in the community; they therefore need to be aware of their responsibility for ensuring the safety of lives and property.

Keywords: housing, housing affordability, housing security, residential, residential quality

Procedia PDF Downloads 105
28676 Real Time Video Based Smoke Detection Using Double Optical Flow Estimation

Authors: Anton Stadler, Thorsten Ike

Abstract:

In this paper, we present a video-based smoke detection algorithm based on TVL1 optical flow estimation. The main part of the algorithm is an accumulation system for the motion angles and the upward motion speed of the flow field. We optimized the use of TVL1 flow estimation for the detection of smoke with very low smoke density. To this end, we use adapted flow parameters and estimate the flow field on difference images. We show in theory and in evaluation that this improves the performance of smoke detection significantly. We evaluate the smoke algorithm using videos with different smoke densities and different backgrounds, and show that smoke detection is very reliable in varying scenarios. Furthermore, we verify that our algorithm is very robust towards disturbance videos of crowded scenes.
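
The accumulation idea can be sketched as follows: estimate optical flow on difference images and accumulate the upward component of the flow as smoke evidence. In this simplified sketch, Farneback flow is used as a stand-in for the TVL1 estimator used in the paper, and the thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def smoke_score(video_path, up_threshold=0.5):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return 0.0
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    prev_diff = np.zeros_like(prev_gray)
    score = 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)              # flow is estimated on difference images
        flow = cv2.calcOpticalFlowFarneback(prev_diff, diff, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        fy = flow[..., 1]
        upward = fy < -up_threshold                      # negative y = upward in image coordinates
        # Accumulate upward motion speed over the region that moves upward.
        score += float(np.abs(fy[upward]).sum())
        prev_gray, prev_diff = gray, diff
    cap.release()
    return score

# print(smoke_score("chimney.avi"))   # hypothetical test clip
```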

Keywords: low density, optical flow, upward smoke motion, video based smoke detection

Procedia PDF Downloads 347
28675 Knowledge Discovery and Data Mining Techniques in Textile Industry

Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler

Abstract:

This paper addresses issues in the textile industry using data mining techniques. Data mining was applied to stitching data for garment products obtained from a textile company. The techniques applied to the data were the CHAID algorithm, the CART algorithm, regression analysis and artificial neural networks. Classification-based analyses were used in the data mining, and a decision model for production per person, together with the variables affecting production, was derived by this method. The results show that as the daily working time increases, the production per person decreases. In addition, the relationship between total daily working time and production per person is negative, and this is the strongest relationship observed.
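
A small sketch of a decision-tree model for production per person is given below; the synthetic records only reproduce the reported negative trend between daily working time and production per person and are not the company's data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for the stitching-line records; the feature set is assumed.
rng = np.random.default_rng(0)
n = 300
daily_hours = rng.uniform(6, 12, n)
operators = rng.integers(10, 60, n)
# Target reproducing the reported trend: output per person falls as daily working time rises.
prod_per_person = 50 - 2.5 * daily_hours + 0.1 * operators + rng.normal(0, 1.5, n)

X = np.column_stack([daily_hours, operators])
X_tr, X_te, y_tr, y_te = train_test_split(X, prod_per_person, random_state=0)

tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(tree.score(X_te, y_te), 3))
print("correlation(working time, production/person):",
      round(np.corrcoef(daily_hours, prod_per_person)[0, 1], 3))   # negative, as reported
```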

Keywords: data mining, textile production, decision trees, classification

Procedia PDF Downloads 344
28674 European Project Meter Matters in Sports: Fostering Criteria for Inclusion through Sport

Authors: Maria Campos, Alain Massart, Hugo Sarmento

Abstract:

The Meter Matters Erasmus Sport European Project (ID: 101050372) explores the field of social inclusion in and through sports with the aim of a) proposing appropriate criteria for co-funding sports programs involving people with intellectual and developmental disabilities and other more vulnerable people, primarily in mainstream sports organizations, and b) proposing a model for co-funding social inclusion in and through sports at the national level. This European project (2022-2024) involves 6 partners from 3 countries: Univerza V Ljubljani – coordinator and Drustvo Specialna Olimpiada Slovenije (Slovenia); Magyar Specialis Olimpia Szovetseg and Magyar Testnevelesi Es Sporttudomanyi Egyetem (Hungary); and APPDA Coimbra - Associação Portuguesa para as Perturbações do Desenvolvimento e Autismo and Universidade de Coimbra, Faculty of Sport Sciences and Physical Education (Portugal). Equal involvement of all people in sports activities is, in terms of national and international guidelines, enshrined in several conventions and strategies in the field of sports, as well as human rights, social security, physical and mental health, architecture, environment and public administration. However, there is a gap between practice and EU guidelines in terms of sustainable support for socially inclusive sports programs in the form of co-funding by state and local (municipal) resources. We observe considerable opacity in the regulation of the field. Given that both relevant programs and inclusive legislation and policies exist, we believe that the reason for this missing link lies in the undeveloped criteria for measuring social inclusion in sports. Major sports programs are usually co-funded based on crowds (number of involved athletes) and performance (sports scores). In the field of social inclusion in sports, the criteria cannot be the same, as it concerns a smaller population. Therefore, the goals of inclusion in sports should not be focused on competitive results but on opening equal opportunities for all, regardless of their psychophysical abilities. In the Meter Matters program, we are searching for criteria for co-funding social inclusion in sports through focus groups with coaches, social workers, psychologists and other professionals involved in inclusive sports programs in regular sports clubs, and with athletes and their parents or guardians. Moreover, experts in the field of social inclusion in sports were also interviewed. Based on the proposals for measuring social inclusion in sports, we developed a model for co-funding socially inclusive sports programs.

Keywords: European project, meter matters, inclusion, sport

Procedia PDF Downloads 106
28673 Mathematical Modeling to Reach Stability Condition within Rosetta River Mouth, Egypt

Authors: Ali Masria, Abdelazim Negm, Moheb Iskander, Oliver C. Saavedra

Abstract:

Estuaries play an important role in exchanging water and providing navigational pathways for ships. These zones are very sensitive and vulnerable to any interventions in coastal dynamics, and most of these inlets experience coastal problems such as severe erosion and accretion. The Rosetta promontory in Egypt is an example of this environment. It suffers from many coastal problems, such as erosion along the coastline and siltation inside the inlet, due to the lack of water and sediment resources that followed the construction of the Aswan High Dam. The shoaling of the inlet hinders the navigation of fishing boats, negatively impacts the estuarine and salt marsh habitat, and decreases the capacity of the cross section to convey flow to the sea during emergencies. This paper aims to reach a new condition of stability for the Rosetta promontory by using coastal measures to control the sediment that enters and causes shoaling inside the inlet. These coastal measures include modifying the inlet cross section using centered jetties and eliminating the coastal dynamics at the entrance using boundary jetties. This target is achieved by using the hydrodynamic Coastal Modeling System (CMS). Extensive field data (hydrographic surveys, wave data, tide data, and bed morphology) were collected and used to build and calibrate the model. About 20 scenarios were tested to reach a suitable solution that mitigates the coastal problems at the inlet. The results show that a 360 m jetty on the eastern bank, combined with a sand bypass system from the lee side of the jetty, can stabilize the estuary.

Keywords: Rosetta promontory, erosion, sedimentation, inlet stability

Procedia PDF Downloads 582
28672 IoT: State-of-the-Art and Future Directions

Authors: Bashir Abdu Muzakkari, Aisha Umar Sulaiman, Mohamed Afendee Muhamad, Sanah Abdullahi Muaz

Abstract:

The field of the Internet of Things (IoT) is rapidly expanding and has the potential to completely change how we work, live, and interact with the world. The Internet of Things (IoT) is the term used to describe a network of interconnected physical objects, including machinery, vehicles, and buildings, which are equipped with electronics, software, sensors, and network connectivity. This review paper aims to provide a comprehensive overview of the current state of IoT, including its definition, key components, development history, and current applications. The paper also discusses the challenges and opportunities presented by IoT, as well as its potential impact on various industries, such as healthcare, agriculture, and transportation. In addition, it highlights the ethical and security concerns associated with IoT and the need for effective solutions to address these challenges. The paper concludes by highlighting the prospects of IoT and directions for future research in this field.

Keywords: internet of things, IoT, sensors, network

Procedia PDF Downloads 166
28671 Conception of a Reliable Low Cost, Autonomous Explorative Hovercraft 1

Authors: A. Brand, S. Burgalat, E. Chastel, M. Jumeline, L. Teilhac

Abstract:

The paper presents the actual benefits and drawbacks of a multidirectional hovercraft conceived with limited resources and designed for indoor exploration. Recent developments in the field have led to the emergence of very powerful autonomous systems capable of heavy computation and of exploring complex unknown environments. They usually rely on very complex algorithms and high-precision, high-cost sensors, and sometimes carry heavy computational loads with complex data fusion. Those systems are usually powerful, but they come at a certain price, and the benefits may not be worth the cost, especially considering their hardware limitations and their power consumption. The present approach is to strike a compromise between cost, power consumption and precision of results.

Keywords: Hovercraft, indoor exploration, autonomous, multidirectional, wireless control

Procedia PDF Downloads 413
28670 Investigation of Delivery of Triple Play Data in GE-PON Fiber to the Home Network

Authors: Ashima Anurag Sharma

Abstract:

Optical fiber based networks can deliver performance that supports the increasing demand for high-speed connections. One of the technologies that have emerged in recent years is the Passive Optical Network. This research paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparison between various data rates is presented, and it is demonstrated that as the data rate increases, the number of users that can be supported decreases due to the increase in bit error rate.
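
The trade-off between data rate and the number of supportable users can be sketched with the standard BER relation for a Gaussian-noise receiver, BER = 0.5·erfc(Q/√2); the link-budget numbers below are illustrative assumptions, not measurements from the simulation.

```python
import numpy as np
from math import erfc, sqrt

def ber_from_q(q):
    # Standard Gaussian-noise approximation for an on-off keyed receiver.
    return 0.5 * erfc(q / sqrt(2))

# Hypothetical received Q factors versus number of users for two line rates:
# higher rates and more splitting lower Q and therefore raise the BER.
for rate, q0 in [("1.25 Gb/s", 9.0), ("2.5 Gb/s", 7.0)]:
    for users in (16, 32, 64):
        q = q0 - 1.5 * np.log2(users / 16)   # assumed penalty per splitting stage
        print(f"{rate}, {users:3d} users: Q ~ {q:.1f}, BER ~ {ber_from_q(q):.1e}")
```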

Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT

Procedia PDF Downloads 521
28669 Microarray Gene Expression Data Dimensionality Reduction Using PCA

Authors: Fuad M. Alkoot

Abstract:

Different experimental technologies, such as microarray sequencing, have been proposed to generate high-resolution genetic data in order to understand the complex dynamic interactions between complex diseases and the biological system components of genes and gene products. However, the generated samples have a very large dimension, reaching thousands, hindering all attempts to design a classifier system that can identify diseases based on such data. Additionally, the high overlap in the class distributions makes the task more difficult. The data we experiment with were generated for the identification of autism. They include 142 samples, which is small compared to the large dimension of the data. Classifier systems trained on these data yield very low classification rates that are almost equivalent to a guess. We aim at reducing the data dimension and improving it for classification. Here, we experiment with applying a multistage PCA to the genetic data to reduce its dimensionality. Results show a significant improvement in the classification rates, which increases the possibility of building an automated system for autism detection.
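
One plausible reading of the multistage PCA described above is sketched below: gene features are reduced block-wise in a first PCA stage, and a second stage compresses the concatenated block scores. The blocking scheme, classifier, and synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the 142-sample autism microarray set.
rng = np.random.default_rng(0)
X = rng.normal(size=(142, 5000))              # samples x genes
y = rng.integers(0, 2, 142)                   # autism / control labels (stand-in)

def multistage_pca(X, block_size=500, stage1_dims=10, stage2_dims=20):
    blocks = []
    for start in range(0, X.shape[1], block_size):
        block = X[:, start:start + block_size]
        blocks.append(PCA(n_components=stage1_dims).fit_transform(block))  # stage 1: per block
    stage1 = np.hstack(blocks)                                             # 142 x (10 * n_blocks)
    return PCA(n_components=stage2_dims).fit_transform(stage1)             # stage 2: global

X_red = multistage_pca(X)
print("reduced shape:", X_red.shape)
print("CV accuracy:", cross_val_score(SVC(), X_red, y, cv=5).mean())
```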

Keywords: PCA, gene expression, dimensionality reduction, classification, autism

Procedia PDF Downloads 556
28668 Topological Quantum Diffeomorphisms in Field Theory and the Spectrum of the Space-Time

Authors: Francisco Bulnes

Abstract:

Through the Fukaya conjecture and the wrapped Floer cohomology, we demonstrate the correspondences between paths in a loop space and states of a wrapping space of states in a Hamiltonian space (the ramification of the field in this case is the connection to the operator that goes from TM to T*M), where these last states correspond to bosonic extensions of a spectrum of the space-time, or direct image of the functor Spec, on space-time. This establishes a distinguished diffeomorphism defined by the mapping from the corresponding loop space to the wrapping category of the Floer cohomology complex, which furthermore relates, in a certain proportion, D-branes (certain D-modules) with strings. This also gives rise to a conjecture that establishes equivalences between moduli spaces, which can be consigned in a moduli identity, taking as space-time the Hitchin moduli space on G, whose dual can be expressed by a factor of a bosonic moduli space.

Keywords: Floer cohomology, Fukaya conjecture, Lagrangian submanifolds, quantum topological diffeomorphism

Procedia PDF Downloads 305
28667 Significance of Water Saving through Subsurface Drip Irrigation for Date Palm Trees

Authors: Ahmed I. Al-Amoud

Abstract:

A laboratory study and a field study were conducted on subsurface drip irrigation systems. In the first, laboratory, study, eight locally available subsurface drip irrigation lines were selected, and a number of experiments were carried out to evaluate their hydraulic characteristics, to ensure their suitability for drip irrigation design requirements and high performance, and to select the best line for the field experiments. The second study involved field trials on mature date palm trees to study the effect of a subsurface drip irrigation system on the yield and water consumption of date palms and to compare it with the traditional surface drip irrigation system. Experiments were conducted at the Alwatania Agricultural Project on 50 mature (17-year-old) date palm trees of the Helwa type, with 10 m spacing between rows and between trees. A high-efficiency subsurface line (Techline) was used, based on the results of the first study. Irrigation scheduling was carried out with a soil moisture sensing device to ensure adequate soil water levels. The experimental layouts were installed during the 2001 season, and measurements continued until the end of the 2008 season. The results indicate an increase in yield and a considerable saving in water compared to the conventional drip irrigation method, in addition to large increases in water use efficiency with the subsurface system. The subsurface system proved to be durable and highly efficient for irrigating date palm trees.

Keywords: drip irrigation, subsurface drip irrigation, date palm trees, date palm water use, date palm yield

Procedia PDF Downloads 424
28666 Semantic-Based Collaborative Filtering to Improve Visitor Cold Start in Recommender Systems

Authors: Baba Mbaye

Abstract:

In collaborative filtering recommendation systems, a user receives suggested items based on the opinions and evaluations of a community of users. This type of recommendation system uses only the information (ratings as numerical values) contained in a usage matrix as input data. This matrix can be constructed from users' behavior or by asking users to declare their opinions on the items they know. The cold start problem leads to very poor performance for new users. It is a phenomenon that occurs at the beginning of use, when the system lacks the data needed to make recommendations. There are three types of cold start problems: cold start for a new item, for a new system, and for a new user. In this article, we are interested in the cold start for a new user. When the system welcomes a new user, the profile exists but does not have enough data, and its communities with other user profiles are still unknown. This leads to recommendations that are not adapted to the profile of the new user. In this paper, we propose an approach that improves cold start by using the notions of similarity and semantic proximity between user profiles during cold start. We use the available cold metadata (metadata extracted from the new user's data), which is useful for positioning the new user within a community. The aim is to look for similarities and semantic proximities with the old and current user profiles of the system. Proximity is represented by close concepts considered to belong to the same group, while similarity groups together elements that appear alike; similarity and proximity are two close but distinct concepts. This leads us to a construction of similarity based on: a) concepts (properties, terms, instances) independent of the ontology structure and b) the simultaneous representation of the two concepts (relations, presence of terms in a document, simultaneous presence of the authorities). We propose an ontology, OIVCSRS (Ontology of Improvement Visitor Cold Start in Recommender Systems), in order to structure the terms and concepts representing the meaning of an information field, whether through the metadata of a namespace or the elements of a knowledge domain. This approach allows us to automatically attach the new user to a user community, partially compensate for the data that were not initially provided, and ultimately associate a better first profile with the cold start. Thus, the aim of this paper is to propose an approach to improving cold start using semantic technologies.
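
A minimal sketch of the cold-start idea is given below: with no ratings available, the new visitor is attached to a community through the similarity of profile metadata, and the community's ratings are borrowed. Jaccard overlap of metadata terms stands in here for the ontology-based proximity of OIVCSRS, and all users and items are hypothetical.

```python
ratings = {                      # existing usage matrix (user -> item ratings)
    "u1": {"i1": 5, "i2": 4, "i3": 1},
    "u2": {"i1": 4, "i3": 2, "i4": 5},
    "u3": {"i2": 1, "i4": 4, "i5": 5},
}
metadata = {                     # metadata terms declared in user profiles
    "u1": {"jazz", "guitar", "vinyl"},
    "u2": {"jazz", "piano"},
    "u3": {"metal", "guitar"},
    "new": {"jazz", "guitar"},   # cold-start visitor: metadata only, no ratings
}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(new_user, k=2):
    # Attach the visitor to the k semantically closest profiles, then
    # weight those neighbours' ratings by the similarity score.
    sims = sorted(((jaccard(metadata[new_user], metadata[u]), u) for u in ratings),
                  reverse=True)[:k]
    scores = {}
    for sim, u in sims:
        for item, r in ratings[u].items():
            scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(recommend("new"))
```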

Keywords: visitor cold start, recommender systems, collaborative filtering, semantic filtering

Procedia PDF Downloads 213
28665 Stable Time Reversed Integration of the Navier-Stokes Equation Using an Adjoint Gradient Method

Authors: Jurriaan Gillissen

Abstract:

This work is concerned with stabilizing the numerical integration of the Navier-Stokes equation (NSE) backwards in time. Applications involve the detection of sources of, e.g., sound, heat, and pollutants. Stable reverse numerical integration of parabolic differential equations is also relevant for image de-blurring. While the literature addresses the reverse integration problem of the advection-diffusion equation, the problem of numerical reverse integration of the NSE has, to our knowledge, not yet been addressed. Owing to the presence of viscosity, the NSE is irreversible, i.e., when going backwards in time, the fluid behaves as if it had a negative viscosity. As a result, perturbations from the perfect solution, due to round-off errors or discretization errors, grow exponentially in time, and reverse integration of the NSE is inherently unstable, regardless of using an implicit time integration scheme. Consequently, some sort of filtering is required in order to achieve a stable numerical reversed integration. The challenge is to find a filter with a minimal adverse effect on the accuracy of the reversed integration. In the present work, we explore an adjoint gradient method (AGM) to achieve this goal, and we apply this technique to two-dimensional (2D), decaying turbulence. The AGM solves for the initial velocity field u0 at t = 0 that, when integrated forward in time, produces a final velocity field u1 at t = 1 that is as close as feasibly possible to some specified target field v1. The initial field u0 defines a minimum of a cost functional J that measures the distance between u1 and v1. In the minimization procedure, u0 is updated iteratively along the gradient of J w.r.t. u0, where the gradient is obtained by transporting J backwards in time from t = 1 to t = 0 using the adjoint NSE. The AGM thus effectively replaces the backward integration by multiple forward and backward adjoint integrations. Since the viscosity is negative in the adjoint NSE, each step of the AGM is numerically stable. Nevertheless, when applied to turbulence, the AGM develops instabilities, which limit the backward integration to small times. This is due to the exponential divergence of phase space trajectories in turbulent flow, which produces a multitude of local minima in J when the integration time is large. As a result, the AGM may select unphysical, noisy initial conditions. In order to improve this situation, we propose two remedies. First, we replace the integration by a sequence of smaller integrations, i.e., we divide the integration time into segments, where in each segment the target field v1 is taken as the initial field u0 from the previous segment. Second, we add an additional term (regularizer) to J, which is proportional to a high-order Laplacian of u0 and which dampens the gradients of u0. We show that suitable values for the segment size and for the regularizer allow a stable reverse integration of 2D decaying turbulence, with accurate results for more than O(10) turbulent integral time scales.
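
A simplified analogue of the AGM loop is sketched below on the 1D diffusion equation, the classic irreversible model problem: the initial field is recovered by gradient descent on J, where the gradient is obtained by an adjoint run (here, the same diffusion applied to the residual, since the discrete operator is symmetric) plus the Laplacian regularizer. The grid size, viscosity, and regularizer weight are illustrative assumptions; this is not the 2D turbulence solver.

```python
import numpy as np

# Minimise J(u0) = 0.5*||F(u0) - v1||^2 + 0.5*alpha*||lap(u0)||^2,
# where F is the forward diffusion map, as a stand-in for the AGM on the NSE.
n, steps, nu, dt = 128, 200, 0.1, 0.1
alpha = 1e-2

def lap(u):
    return np.roll(u, 1) - 2 * u + np.roll(u, -1)

def forward(u):                      # explicit diffusion, `steps` time steps
    for _ in range(steps):
        u = u + nu * dt * lap(u)
    return u

x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u0_true = np.sin(x) + 0.5 * np.sin(4 * x)
v1 = forward(u0_true)                # "measured" final field

u0 = np.zeros(n)                     # initial guess for the reversed integration
lr = 1.0
for it in range(500):
    residual = forward(u0) - v1
    grad = forward(residual) + alpha * lap(lap(u0))   # adjoint run + regularizer gradient
    u0 -= lr * grad
print("relative error in recovered u0:",
      np.linalg.norm(u0 - u0_true) / np.linalg.norm(u0_true))
```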

Keywords: time reversed integration, parabolic differential equations, adjoint gradient method, two dimensional turbulence

Procedia PDF Downloads 218
28664 Singular Perturbed Vector Field Method Applied to the Problem of Thermal Explosion of Polydisperse Fuel Spray

Authors: Ophir Nave

Abstract:

In our research, we present the concept of the singularly perturbed vector field (SPVF) method and its application to the thermal explosion of diesel spray combustion. Given a system of governing equations that contains hidden multi-scale variables, the SPVF method transfers and decomposes such a system into fast and slow singularly perturbed subsystems (SPS). The SPVF method enables us to understand the complex system and to simplify the calculations. Powerful analytical, numerical and asymptotic methods (e.g., the method of integral (invariant) manifolds (MIM), the homotopy analysis method (HAM), etc.) can then be applied to each subsystem. We compare the results obtained by the method of integral invariant manifolds and by SPVF applied to a spray droplet combustion model. The research deals with the development of an innovative method for extracting fast and slow variables in physical mathematical models. The method that we developed, called the singularly perturbed vector field method, is based on a numerical algorithm for global quasi-linearization applied to a given physical model. The SPVF method has been applied successfully to combustion processes, and our results were compared to experimental results. The SPVF is a general numerical and asymptotic method that reveals the hierarchy (multi-scale structure) of a given system.
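
The timescale-separation idea behind SPVF can be illustrated with a toy system: linearise the vector field, inspect the Jacobian spectrum, and split directions into fast and slow at the largest spectral gap. The sketch below uses a two-variable stiff model as a stand-in for the polydisperse spray equations and is not the full SPVF algorithm.

```python
import numpy as np

def f(y):
    # Stiff toy system: y[0] relaxes quickly onto a slow manifold driven by y[1].
    return np.array([-100.0 * (y[0] - np.sin(y[1])), -y[1]])

def jacobian(f, y, eps=1e-6):
    n = len(y)
    J = np.zeros((n, n))
    for j in range(n):
        dy = np.zeros(n); dy[j] = eps
        J[:, j] = (f(y + dy) - f(y - dy)) / (2 * eps)   # central differences
    return J

y0 = np.array([0.3, 1.0])
J = jacobian(f, y0)
eigvals, eigvecs = np.linalg.eig(J)
rates = np.abs(eigvals.real)
order = np.argsort(rates)[::-1]                 # fastest decay first
gap = rates[order[0]] / rates[order[1]]
print("decay rates:", rates[order], "spectral gap:", round(gap, 1))
print("fast direction:", np.round(eigvecs[:, order[0]].real, 3))
print("slow direction:", np.round(eigvecs[:, order[1]].real, 3))
```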

Keywords: polydisperse spray, model reduction, asymptotic analysis, multi-scale systems

Procedia PDF Downloads 213
28663 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model

Authors: S. A. Sadegh Zadeh, C. Kambhampati

Abstract:

Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across a wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two types of modelling can play a significant role in guiding the direction the field takes. This paper combines mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience. It is intended to analyse the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing the computational model of the Hodgkin-Huxley equations and applying the concept of the all-or-none principle, an investigation of this mathematical model has been performed. The results clearly show that the Hodgkin-Huxley mathematical model does not obey this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
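
A minimal integration sketch of the Hodgkin-Huxley equations is given below; it applies a brief current pulse of increasing amplitude so that the reader can probe how graded, rather than strictly all-or-none, the responses near threshold are. Standard squid-axon parameters are used, while the pulse timing and forward-Euler step are illustrative choices.

```python
import numpy as np

# Standard HH parameters (squid axon, mV/ms/uA-cm^2 units).
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
ENa, EK, EL = 50.0, -77.0, -54.387

def rates(V):
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    return am, bm, ah, bh, an, bn

def simulate(I_amp, t_end=40.0, dt=0.01):
    V, m, h, n = -65.0, 0.053, 0.596, 0.317          # resting state
    peak = V
    for step in range(int(t_end / dt)):
        t = step * dt
        I = I_amp if 5.0 <= t < 6.0 else 0.0          # 1 ms current pulse
        am, bm, ah, bh, an, bn = rates(V)
        INa = gNa * m**3 * h * (V - ENa)
        IK = gK * n**4 * (V - EK)
        IL = gL * (V - EL)
        V += dt * (I - INa - IK - IL) / C
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        peak = max(peak, V)
    return peak

for amp in (2, 4, 6, 8, 10, 20):                      # pulse amplitudes, uA/cm^2
    print(f"I = {amp:2d}: peak membrane voltage = {simulate(amp):6.1f} mV")
```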

Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential

Procedia PDF Downloads 610
28662 Rainfall and Flood Forecast Models for Better Flood Relief Plan of the Mae Sot Municipality

Authors: S. Chuenchooklin, S. Taweepong, U. Pangnakorn

Abstract:

This research was conducted in the Mae Sot Watershed, which is located in the Moei River Basin within the Upper Salween River Basin in Tak Province, Thailand. The Mae Sot Municipality is the largest urbanized area in Tak Province and is situated in the midstream of the Mae Sot Watershed. It usually faces flash flooding after heavy rain; poor flood management has been reported as the economy has boomed in recent years. Its catchment can be classified as an ungauged basin, lacking rainfall data, and no stream gauging station has been reported. The municipality was hit by its most severe flood event in 2013, which is taken here as the worst case studied for all the communities in the municipality. Moreover, other problems are also faced in this watershed, such as a shortage of water supply for domestic consumption and agricultural use, deterioration of water quality, and landslides. The research aimed to build capacity and strengthen the participation of local community leaders and related agencies in better water management of the urban area, starting with data collection and a demonstration of the appropriate application of a short-period rainfall forecasting model, with the aim of better flood relief planning and management through the hydrologic modeling system and river analysis system programs. The authors applied global rainfall data via the Integrated Data Viewer (IDV) program from Unidata with the aim of forecasting rainfall over a short period of 7-10 days in advance during the rainy season, instead of relying on real-time records. The IDV product, which provides rainfall forecasts at a time step of 3-6 hours, was introduced to the communities. The result can be used as input to either the hydrologic modeling system (HEC-HMS) or the soil and water assessment tool (SWAT) for synthesizing flood hydrographs and for flood forecasting. The authors applied the river analysis system (HEC-RAS) to present flood flow behavior in the reach of the Mae Sot stream through downtown Mae Sot City, in terms of flood extents and water surface levels at every cross-sectional profile of the stream. Both the HMS and RAS models were tested against the 2013 event using observed rainfall and inflow-outflow data from the Mae Sot Dam. The HMS results fitted the observed data at the dam and were applied as the upstream boundary discharge to RAS in order to simulate flood extents; the simulation was tested in the field and the results were found satisfactory. The IDV rainfall forecast data were compared to observed data and found to be fair. Nevertheless, IDV is an appropriate tool to use in this ungauged catchment together with flood hydrograph and river analysis models for efficient future flood relief planning and management.

Keywords: global rainfall, flood forecast, hydrologic modeling system, river analysis system

Procedia PDF Downloads 347
28661 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetes

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and choose the attributes that might influence the examinee's survival probability. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as the foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
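
A sketch of the evaluation protocol described above is given below: several classifiers are cross-validated and scored with the five named metrics. Synthetic data stand in for the Kaggle Diabetes Health Indicators set, and only two of the eight algorithms are shown.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic, imbalanced stand-in for the health-indicator features.
X, y = make_classification(n_samples=2000, n_features=21, weights=[0.85, 0.15],
                           random_state=0)
scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    summary = {m: round(np.mean(cv[f"test_{m}"]), 3) for m in scoring}
    print(name, summary)
```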

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 66
28660 A Machine Learning Approach to Detecting Evasive PDF Malware

Authors: Vareesha Masood, Ammara Gul, Nabeeha Areej, Muhammad Asif Masood, Hamna Imran

Abstract:

The universal use of PDF files has prompted hackers to exploit them for malicious purposes by hiding malicious code in the PDF files delivered to their victims' machines. Machine learning has proven to be highly efficient at identifying benign files and detecting files carrying PDF malware. This paper proposes an approach based on a decision tree classifier with tuned parameters. A modern, inclusive dataset, CIC-Evasive-PDFMal2022, produced by Lockheed Martin's cyber security wing, is used; it is one of the most reliable datasets in this field. We designed a PDF malware detection system that achieved a score of 99.2%. Compared to other cutting-edge models in the same field of study, the suggested model performs very well in detecting PDF malware. Accordingly, this paper provides the fastest, most reliable, and most efficient PDF malware detection approach.
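
A sketch of the proposed pipeline is given below: a decision tree is trained on the static numeric features of the CIC-Evasive-PDFMal2022 CSV. The file name, label column, and hyper-parameter values are illustrative assumptions.

```python
import pandas as pd
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("PDFMalware2022.csv")                 # hypothetical local copy of the dataset
X = df.drop(columns=["Class"]).select_dtypes("number") # keep the numeric static features
y = (df["Class"] == "Malicious").astype(int)           # assumed label encoding

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                           stratify=y, random_state=0)
clf = DecisionTreeClassifier(criterion="gini", max_depth=12,
                             min_samples_leaf=5, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```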

Keywords: PDF, PDF malware, decision tree classifier, random forest classifier

Procedia PDF Downloads 83
28659 Evaluation of Commercial Back-analysis Package in Condition Assessment of Railways

Authors: Shadi Fathi, Moura Mehravar, Mujib Rahman

Abstract:

Over the years, increased demands on railways, the emergence of high-speed trains and heavy axle loads, and the ageing and deterioration of existing tracks have been imposing costly maintenance actions on the railway sector. The need to develop a fast and cost-efficient non-destructive assessment method for the structural evaluation of railway tracks is therefore critically important. The layer modulus is the main parameter used in the structural design and evaluation of the railway track substructure (foundation). Among the many recently developed NDTs, the Falling Weight Deflectometer (FWD) test, widely used in pavement evaluation, has shown promising results for railway track substructure monitoring. The surface deflection data collected by the FWD are used to estimate the moduli of the substructure layers through the back-analysis technique. Although different commercially available back-analysis programs are used for pavement applications, only a limited number of research-based techniques have so far been developed for railway track evaluation. In this paper, the suitability, accuracy, and reliability of the BAKFAA software are investigated. The main rationale for selecting BAKFAA is that it has a relatively straightforward user interface, is freely available, and is widely used in highway and airport pavement evaluation. As part of the study, a finite element (FE) model of a railway track section near Leominster station, Herefordshire, UK, subjected to the FWD test, was developed and validated against available field data. Then, a virtual experimental database (including 218 sets of FWD testing data) was generated using the FE model and employed as the measured database for the BAKFAA software. This database was generated by varying the moduli of each track substructure layer over a predefined range. The BAKFAA predictions were compared against cone penetration test (CPT) data (available from the literature; conducted near Leominster station, at the same section where the FWD test was performed). The results reveal that BAKFAA overestimates the moduli of each substructure layer. To adjust the BAKFAA predictions to the CPT data, this study introduces a correlation model that makes BAKFAA applicable to railway applications.
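
The back-analysis principle being evaluated can be sketched generically: adjust the layer moduli until a forward model reproduces the measured FWD deflection basin. The sketch below is not BAKFAA's internal algorithm; its toy forward model merely stands in for the layered-elastic or FE response of the track substructure.

```python
import numpy as np
from scipy.optimize import least_squares

offsets = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])     # geophone offsets (m)

def forward_model(moduli, offsets, load=100e3):
    # Toy response: deflection decays with offset and with a weighted stiffness;
    # shallower layers are weighted more heavily near the load plate.
    weights = np.exp(-np.outer(offsets, np.array([1.0, 2.0, 4.0])))
    stiffness = weights @ moduli
    return load / (np.pi * (stiffness + 1e3) * (1.0 + offsets) ** 2)

true_moduli = np.array([120e6, 80e6, 45e6])            # ballast, sub-ballast, subgrade (Pa), assumed
noise = 1 + 0.02 * np.random.default_rng(0).normal(size=offsets.size)
measured = forward_model(true_moduli, offsets) * noise  # stand-in FWD deflection basin

def residuals(log_moduli):
    # Optimise in log space so the moduli stay positive.
    return forward_model(np.exp(log_moduli), offsets) - measured

fit = least_squares(residuals, x0=np.log([200e6, 100e6, 60e6]))
print("back-calculated moduli (MPa):", np.round(np.exp(fit.x) / 1e6, 1))
```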

Keywords: back-analysis, bakfaa, railway track substructure, falling weight deflectometer (FWD), cone penetration test (CPT)

Procedia PDF Downloads 126
28658 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which creates challenges around data management and governance, and it is also challenging to integrate data from multiple systems and technologies. Despite these pains, companies are still pursuing digitalization, because by embracing advanced technologies they can improve efficiency, quality, decision-making, and customer experience, while also creating new business models and revenue streams. This paper focuses on the issue that data are stored in data silos with different schemas and structures. Conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company's data and utilizes a knowledge graph for data storage and exploration.
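
A minimal sketch of the semantic lifting idea is shown below: an asset property is described in a knowledge graph and linked to a standardised dictionary concept, so that data from different silos can be matched by meaning rather than by column name. The namespaces, IRIs, and ECLASS code are illustrative assumptions, not the project's actual model.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/plant/")
AAS = Namespace("http://example.org/aas/")           # stand-in for the AAS vocabulary
ECLASS = Namespace("http://example.org/eclass/")

g = Graph()
g.bind("ex", EX); g.bind("aas", AAS); g.bind("eclass", ECLASS)

# An asset (a motor) with a submodel property carrying a value and a semantic ID.
g.add((EX.motor_17, RDF.type, AAS.Asset))
g.add((EX.motor_17, AAS.hasProperty, EX.motor_17_power))
g.add((EX.motor_17_power, RDF.type, AAS.Property))
g.add((EX.motor_17_power, AAS.value, Literal(7.5)))
g.add((EX.motor_17_power, AAS.unit, Literal("kW")))
g.add((EX.motor_17_power, AAS.semanticId, ECLASS["0173-1#02-AAA123"]))  # hypothetical dictionary code

# Query by meaning: find every property mapped to the same dictionary concept.
q = """
SELECT ?prop ?value WHERE {
  ?prop <http://example.org/aas/semanticId> <http://example.org/eclass/0173-1#02-AAA123> ;
        <http://example.org/aas/value> ?value .
}
"""
for row in g.query(q):
    print(row.prop, row.value)
```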

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 88
28657 Effect of Self-Questioning Strategy on the Improvement of Reading Comprehension of ESL Learners

Authors: Muhammad Hamza

Abstract:

This research examines the effect of the self-questioning strategy on the reading comprehension of second language learners at the intermediate level. It was conducted to find out how the self-questioning strategy helps English learners improve their reading comprehension. In this study, the researcher analyzed how effective self-questioning is in second language learning and how much it helps second language learners improve their reading comprehension. For this purpose, the researcher studied different reading strategies, collected data from a certificate-level class at NUML, Peshawar campus, and then determined the effects of the self-questioning strategy on the reading comprehension of ESL learners. The participants were randomly selected from the certificate class. The data were analyzed through a pre-test and a post-test, and in the final stage the results of both tests were compared. The results indicated that if learners use the self-questioning strategy before, while, and after reading a particular text, there will be an improvement in the comprehension level of ESL learners. The present research addressed the benefits of the self-questioning strategy by administering two tests (pre-test and post-test). The post-test results reveal that the use of the self-questioning strategy has a significant effect on readers' comprehension; thus, learners can improve their reading comprehension by using the self-questioning strategy.

Keywords: strategy, self-questioning, comprehension, intermediate level ESL learner

Procedia PDF Downloads 55
28656 The Effect of Varying Cone Beam Computed Tomography Image Resolution and Field-of-View Centralization on the Effective Radiation Dose

Authors: Fatima M. Jadu, Asmaa A. Alzahrani, Maha A. Almutairi, Salma O. Al-Amoudi, Mawya A. Khafaji

Abstract:

Introduction: Estimating the potential radiation risk of a widely used imaging technique such as cone beam CT (CBCT) is crucial. The aim of this study was to examine the effect of varying two CBCT technical factors, the voxel size (VOX) and the Field-of-View (FOV) centralization, on the radiation dose. Methodology: The head and neck slices of a RANDO® man phantom (Alderson Research Laboratories) were used with nanoDot™ OSLD dosimeters to measure the absorbed radiation dose at 25 predetermined sites. Imaging was done using the i-CAT® (Imaging Science International, Hatfield, PA, USA) CBCT unit. The VOX was changed every three exposure cycles from 0.2 mm to 0.3 mm and then 0.4 mm. The FOV was then centered alternately on the maxilla and the mandible while holding all other factors constant. Finally, the effective radiation dose was calculated for each view and voxel setting. Results: The effective radiation dose was greatest when the smallest VOX was chosen. When the FOV was centered on the maxilla, the highest radiation doses were recorded in the eyes and parotid glands, while with the FOV centered on the mandible, the highest radiation doses were recorded in the sublingual and submandibular glands. Conclusion: Minor variations in the CBCT exposure factors significantly affect the effective radiation dose and thus the radiation risk to the patient. Therefore, extreme care must be taken when choosing these parameters, especially for vulnerable patients such as children.
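
For context, the sketch below shows how an effective dose is assembled from organ doses, E = Σ_T w_T·H_T, using ICRP 103 tissue weighting factors; the organ doses are hypothetical placeholders, and a full calculation would also weight each tissue by the fraction actually irradiated.

```python
# Effective dose as the tissue-weighted sum of equivalent doses, E = sum_T w_T * H_T.
tissue_weights = {            # ICRP 103 weighting factors for tissues relevant to dental CBCT
    "thyroid": 0.04, "oesophagus": 0.04, "red bone marrow": 0.12,
    "bone surface": 0.01, "brain": 0.01, "salivary glands": 0.01,
    "skin": 0.01, "remainder": 0.12,
}
organ_dose_uSv = {            # hypothetical equivalent doses standing in for the nanoDot readings
    "thyroid": 45.0, "oesophagus": 6.0, "red bone marrow": 10.0,
    "bone surface": 12.0, "brain": 9.0, "salivary glands": 180.0,
    "skin": 15.0, "remainder": 20.0,
}
effective_dose = sum(w * organ_dose_uSv[t] for t, w in tissue_weights.items())
print(f"effective dose ~ {effective_dose:.1f} uSv")
```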

Keywords: CBCT, cone beam CT, effective dose, field of view, mandible, maxilla, resolution, voxel

Procedia PDF Downloads 259