Search results for: predictive models
3745 Computational Quantum Mechanics Study of Oxygen as Substitutional Atom in Diamond
Authors: K. M. Etmimi, A. A. Sghayer, A. M. Gsiea, A. M. Abutruma
Abstract:
Relatively few chemical species can be incorporated into diamond during CVD growth, and until recently the uptake of oxygen was thought to be low, perhaps as a consequence of a short surface residence time. Within the literature there is speculation regarding spectroscopic evidence for O in diamond, but no direct evidence. For example, the N3 and OK1 EPR centres have been tentatively assigned to models made up from complexes of substitutional N and substitutional O. In this study, we report density-functional calculations of the stability, electronic structure, geometry and hyperfine interactions of substitutional oxygen in diamond and show that the C2v, S=1 configuration is very slightly lower in energy than the other configurations (C3v, Td, and C2v with S=0). The electronic structure of O in diamond generally gives rise to two defect-related energy states in the band gap: a non-degenerate a1 state lying near the middle of the energy gap and a threefold-degenerate t2 state located close to the conduction band edge. The anti-bonding a1 and t2 states will be occupied by one to three electrons for O+, O and O−, respectively.
Keywords: DFT, oxygen, diamond, hyperfine
Procedia PDF Downloads 379
3744 Working Effectively with Muslim Communities in the West
Authors: Lisa Tribuzio
Abstract:
This paper explores the complexity of working with Muslim communities in Australia. It draws upon the notions of belonging, social inclusion and effective community programming to engage Muslim communities in Western environments, given the current global political climate. Factors taken into consideration for effective engagement include: family engagement; key practices such as Ramadan, fasting, prayer and food requirements; gender relations; core values around faith and spirituality; attitudes towards self-disclosure in a counseling setting; and the notion of 'Us and Them' in the media and systems and its effect on minority communities. The paper explores recent research in the field from Australian researchers as well as recommendations from the United Nations on working with Muslim communities. It also examines current practice models applied in Australia for engaging effectively with diverse communities and addressing racism and discrimination in innovative ways.
Keywords: Muslim, cultural diversity, social inclusion, racism
Procedia PDF Downloads 423
3743 Digital Transformation as the Subject of the Knowledge Model of the Discursive Space
Authors: Rafal Maciag
Abstract:
The development of the current civilization requires suitable models of its pervasive, massive phenomena. One such phenomenon is the digital transformation, which has a substantial number of disciplined, methodical interpretations forming a diversified reflection. This reflection can be understood pragmatically as the current, temporally and locally differentiated state of knowledge. The model of the discursive space is proposed as a model for the analysis and description of this knowledge. Discursive space is understood as an autonomous multidimensional space in which separate discourses traverse specific trajectories that can be presented in a multidimensional parallel coordinate system. Built on the world of facts, discursive space preserves the complex character of that world. Digital transformation as a discursive space has a relativistic character, which means that it is simultaneously created by the dynamic discourses and that these discourses are molded by the shape of this space.
Keywords: complexity, digital transformation, discourse, discursive space, knowledge
Procedia PDF Downloads 194
3742 Using Machine Learning to Enhance Win Ratio for College Ice Hockey Teams
Authors: Sadixa Sanjel, Ahmed Sadek, Naseef Mansoor, Zelalem Denekew
Abstract:
Collegiate ice hockey (NCAA) sports analytics differs from analytics at the national level (NHL). We apply and compare multiple machine learning models, such as Linear Regression, Random Forest, and Neural Networks, to predict a team's win ratio from its statistics. Data exploration helps determine which statistics are most useful in increasing the win ratio, which would be beneficial to coaches and team managers. We ran experiments to select the best model and chose Random Forest as the best performing. We conclude with how to bridge the gap between the college and national levels of sports analytics and how machine learning can enhance team performance despite the lack of extensive metrics or budget for automatic tracking.
Keywords: NCAA, NHL, sports analytics, random forest, regression, neural networks, game predictions
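A minimal sketch of the kind of model comparison described above, assuming scikit-learn; the team statistics below are synthetic and the column names are illustrative placeholders, not the study's actual NCAA metrics:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

# Synthetic per-team season statistics standing in for real NCAA data.
rng = np.random.default_rng(0)
n_teams = 60
teams = pd.DataFrame({
    "goals_for": rng.normal(3.0, 0.5, n_teams),
    "goals_against": rng.normal(3.0, 0.5, n_teams),
    "save_pct": rng.normal(0.91, 0.01, n_teams),
    "power_play_pct": rng.normal(0.20, 0.04, n_teams),
})
teams["win_ratio"] = np.clip(
    0.5 + 0.25 * (teams.goals_for - teams.goals_against) + rng.normal(0, 0.05, n_teams), 0, 1)

X, y = teams.drop(columns="win_ratio"), teams["win_ratio"]
models = {
    "linear": LinearRegression(),
    "random_forest": RandomForestRegressor(n_estimators=300, random_state=0),
    "neural_net": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name:14s} mean CV R^2 = {r2:.3f}")

# Feature importances indicate which statistics drive the predicted win ratio.
rf = models["random_forest"].fit(X, y)
print(dict(zip(X.columns, rf.feature_importances_.round(3))))
```

Cross-validated R² gives the model-selection comparison, and the random forest's feature importances give the kind of "which statistics matter" insight mentioned in the abstract.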
Procedia PDF Downloads 119
3741 Performance Analysis of Solar Air Heater with Fins and Perforated Twisted Tape Insert
Authors: Rajesh Kumar, Prabha Chand
Abstract:
The present paper deals with an analytical investigation of the thermal and thermo-hydraulic performance of a solar air collector fitted with fins and perforated twisted tapes (PTT) of twist ratio 2 with different axial pitch ratios. The mathematical models are presented, and the effect of mass flow rate and axial pitch ratio on the thermal and effective efficiency is discussed. The results obtained are compared with those of a solar air heater without fins and twisted tapes. The results show that collectors with fins and perforated twisted tape perform better, but at the expense of increased pressure drop. Also, the twisted tape with the minimum axial pitch ratio is found to be more efficient than the others.
Keywords: solar air heater, thermal efficiency, twisted tape, twist ratio
Procedia PDF Downloads 269
3740 Finite Element Analysis of the Anaconda Device: Efficiently Predicting the Location and Shape of a Deployed Stent
Authors: Faidon Kyriakou, William Dempster, David Nash
Abstract:
Abdominal Aortic Aneurysm (AAA) is a major life-threatening pathology for which modern approaches reduce the need for open surgery through the use of stenting. The success of stenting, though, is sometimes jeopardized by the final position of the stent graft inside the human artery, which may result in migration, endoleaks or blood flow occlusion. Herein, a finite element (FE) model of the commercial medical device AnacondaTM (Vascutek, Terumo) has been developed and validated in order to create a numerical tool able to provide useful clinical insight before the surgical procedure takes place. The AnacondaTM device consists of a series of NiTi rings sewn onto woven polyester fabric, a structure that despite its column stiffness is flexible enough to be used in very tortuous geometries. For the purposes of this study, an FE model of the device was built in Abaqus® (version 6.13-2) with a combination of beam, shell and surface elements; this choice of building blocks was made to keep the computational cost to a minimum. The numerical model was validated by comparing the deployed position of a full stent graft device inside a constructed AAA with a duplicate set-up in Abaqus®. Specifically, an AAA geometry was built in CAD software and included regions of both high and low tortuosity. The CAD model was then 3D printed into a transparent aneurysm, and a stent was deployed in the lab following the steps of the clinical procedure. Images on the frontal and sagittal planes of the experiment allowed comparison with the results of the numerical model. By overlapping the experimental and computational images, the mean and maximum distances between the rings of the two models were measured in the longitudinal and transverse directions, and a 5 mm upper bound was set as a limit commonly used by clinicians when working with simulations. The two models showed very good agreement in their spatial positioning, especially in the less tortuous regions. As a result, and despite the inherent uncertainties of a surgical procedure, the FE model gives confidence that the final position of the stent graft, when deployed in vivo, can be predicted with significant accuracy. Moreover, the numerical model runs in just a few hours, an encouraging result for applications in the clinical routine. In conclusion, the efficient modelling of a complicated structure which combines thin scaffolding and fabric has been demonstrated to be feasible. Furthermore, the ability to predict the location of each stent ring, as well as the global shape of the graft, has been shown. This can allow surgeons to better plan their procedures and medical device manufacturers to optimize their designs. The current model can further be used as a starting point for patient-specific CFD analysis.
Keywords: AAA, efficiency, finite element analysis, stent deployment
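A short sketch of the validation step described above, comparing corresponding ring positions from the experiment and the FE prediction against the 5 mm clinical bound; the coordinate arrays below are placeholders, not the study's digitized data:

```python
import numpy as np

# Placeholder in-plane coordinates (mm) of corresponding ring landmarks:
# experiment vs. finite element prediction.
rings_experiment = np.array([[0.0, 0.0], [1.2, 14.8], [2.9, 30.1], [5.1, 45.6]])
rings_simulation = np.array([[0.3, 0.2], [1.0, 15.5], [3.4, 29.4], [4.6, 46.2]])

# Per-ring deviation split into transverse/longitudinal components, plus distance.
deviation = rings_simulation - rings_experiment
distance = np.linalg.norm(deviation, axis=1)

print("mean distance:", round(distance.mean(), 2), "mm")
print("max distance :", round(distance.max(), 2), "mm")
print("within 5 mm clinical bound:", bool((distance <= 5.0).all()))
```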
Procedia PDF Downloads 195
3739 Drying Modeling of Banana Using Cellular Automata
Authors: M. Fathi, Z. Farhaninejad, M. Shahedi, M. Sadeghi
Abstract:
Drying is one of the oldest preservation methods for food and agricultural products. Appropriate control of the operation can be achieved through modeling. The limitations of continuous models for complex boundary conditions and irregular geometries have led to the appearance of novel discrete methods such as cellular automata, which provide a platform for obtaining fast predictions through rule-based mathematics. In this research, a one-dimensional (1D) CA was used to simulate thin-layer drying of banana. Banana slices were dried with a convective air dryer, and experimental data were recorded for validation of the final model. The model was programmed in MATLAB and run for 70,000 iterations with a von Neumann neighborhood. The validation results showed good agreement between experimental and predicted data (R=0.99). Cellular automata are capable of reproducing the expected drying pattern and have powerful potential for solving physical problems with reasonable accuracy and low computational resources.
Keywords: banana, cellular automata, drying, modeling
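A minimal sketch of a 1D cellular automaton for thin-layer drying of the kind described above. The update rule exchanges moisture with the von Neumann neighbours (left/right in 1D) and removes moisture at the two exposed surfaces; the parameter values are illustrative, not the fitted values from the banana experiments:

```python
import numpy as np

def ca_thin_layer_drying(n_cells=50, iterations=70_000, k_diff=0.2, k_evap=0.002,
                         m_init=3.5, m_equilibrium=0.05):
    """1-D cellular automaton drying sketch (illustrative parameters)."""
    m = np.full(n_cells, m_init, dtype=float)       # local moisture, dry basis
    history = []
    for step in range(iterations):
        left, right = np.roll(m, 1), np.roll(m, -1)
        left[0], right[-1] = m[0], m[-1]            # no wrap-around at the edges
        m = m + k_diff * (left - 2 * m + right)     # exchange with von Neumann neighbours
        m[0] -= k_evap * (m[0] - m_equilibrium)     # surface cells lose moisture to air
        m[-1] -= k_evap * (m[-1] - m_equilibrium)
        if step % 1000 == 0:
            history.append(m.mean())                # mean moisture vs. time
    return np.array(history)

moisture_curve = ca_thin_layer_drying()
print(moisture_curve[:3], "...", round(moisture_curve[-1], 3))
```

In a real calibration the rule constants would be fitted so that the predicted mean-moisture curve matches the recorded drying data.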
Procedia PDF Downloads 442
3738 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) and handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify and thus restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was instead considered, but it is rather computationally intensive and may not even estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL
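For orientation, the general estimating-equation form on which the GQL approaches above are built (a standard textbook form, not the paper's specific GQL-III construction): for longitudinal count vectors y_i with mean mu_i(beta) and working covariance Sigma_i, the regression estimate solves

```latex
\sum_{i=1}^{K} \frac{\partial \mu_i^{\top}}{\partial \beta}\; \Sigma_i^{-1}\,\bigl(y_i - \mu_i(\beta)\bigr) = 0 ,
```

solved iteratively by Newton-Raphson/Fisher scoring; GQL-I, GQL-II and GQL-III differ essentially in how the score vector and the working covariance of the correlated CMP counts are specified.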
Procedia PDF Downloads 357
3737 Clustering of Panels and Shade Diffusion Techniques for Partially Shaded PV Array-Review
Authors: Shahida Khatoon, Mohd. Faisal Jalil, Vaishali Gautam
Abstract:
The power generated by photovoltaic (PV) systems is mainly dependent on environmental factors. The PV array's lifetime and overall system effectiveness are reduced under partial shading conditions. Clustering the electrical connections between solar modules is a viable strategy for minimizing these power losses through shade diffusion. This article comprehensively evaluates various PV array clustering/reconfiguration models for PV systems, covering static and dynamic reconfiguration techniques for extracting maximum power under mismatch conditions. The paper explores and analyzes current breakthroughs in solar PV performance improvement strategies that merit further investigation. Altogether, researchers and academicians working in the field of dedicated solar power generation will benefit from this research.
Keywords: static reconfiguration, dynamic reconfiguration, photovoltaic array, partial shading, CTC configuration
Procedia PDF Downloads 121
3736 An Overview of New Era in Food Science and Technology
Authors: Raana Babadi Fathipour
Abstract:
The strict requirements of scientific journals now oblige authors to demonstrate whether experimental data are (in)significant from the statistical point of view, which has driven a steep increase in the use and development of statistical software. The use of mathematical and statistical methods, including chemometrics and many other statistical methods and algorithms, in food science and technology has risen steeply in the last 20 years. The computational tools available can be used not only to run statistical analyses such as univariate and bivariate tests, as well as multivariate calibration and the development of complex models, but also to run simulations of different scenarios given a set of inputs, or simply to make predictions for particular data sets or conditions. A quick search in the most reputable scientific databases (PubMed, ScienceDirect, Scopus) shows that statistical methods have gained enormous ground in many areas.
Keywords: food science, food technology, food safety, computational tools
Procedia PDF Downloads 71
3735 Simulation Model of Induction Heating in COMSOL Multiphysics
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
The induction heating phenomenon depends on various factors, making the problem highly nonlinear. The mathematical analysis of this problem is in most cases very difficult and is reduced to simple cases. Additional knowledge of induction heating systems is generated in production environments, but such trial-and-error procedures are long and expensive. Numerical models of the induction heating problem are another approach to reducing the above-mentioned drawbacks. This paper deals with a simulation model of the induction heating problem, created in COMSOL Multiphysics. We present results of numerical simulations of the induction heating process in workpieces of cylindrical shape, in an inductor with four coils. The modeling of the induction heating process was carried out with COMSOL Multiphysics version 4.2a, and temperature charts are presented for the study.
Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element
Procedia PDF Downloads 318
3734 Hydrological-Economic Modeling of Two Hydrographic Basins of the Coast of Peru
Authors: Julio Jesus Salazar, Manuel Andres Jesus De Lama
Abstract:
There are very few models that serve to analyze the use of water in the socio-economic process. On the supply side, the joint use of groundwater has been considered in addition to the simple limits on the availability of surface water. In addition, we have worked on waterlogging and its effects on water quality (mainly salinity). In this paper, a 'complex' water economy is examined: one in which demands grow differentially not only within but also between sectors, and one in which there are limited opportunities to increase consumptive use. In particular, the growth of high-value irrigated crop production within the basins of the case study, together with rapidly growing urban areas, provides a rich context in which to examine the general problem of water management at the basin level. At the same time, long-term aridity has made the eco-environment in the basins located on the coast of Peru very vulnerable, and the exploitation and immediate use of water resources have further deteriorated the situation. The methodology presented is optimization with embedded simulation: basin-wide simulation of flows, water balances and crop growth is embedded within the optimization of water allocation, reservoir operation, and irrigation scheduling. The modeling framework is developed from a network of river basins that includes multiple source nodes (reservoirs, aquifers, water courses, etc.) and multiple demand sites along the river, including places of consumptive use for agricultural, municipal and industrial purposes, and uses of running water on the coast of Peru. The economic benefits associated with water use are evaluated for different demand management instruments, including water rights, based on the production and benefit functions of water use in the urban, agricultural and industrial sectors. This work represents a new effort to analyze the use of water at the regional level and to evaluate the modernization of integrated water resources management and socio-economic territorial development in Peru. It will also allow the establishment of policies to improve the implementation of integrated water resources management and development. Input-output analysis is essential to present a theory of the production process, which is based on a particular type of production function. This work also presents a Computable General Equilibrium (CGE) version of the economic model for water resource policy analysis, specifically designed for analyzing large-scale water management. As the platform for CGE simulation, GEMPACK, a flexible system for solving CGE models, is used to formulate and solve the CGE model through the percentage-change approach; GEMPACK automates the process of translating the model specification into a model solution program.
Keywords: water economy, simulation, modeling, integration
Procedia PDF Downloads 156
3733 Continuous and Discontinuous Modeling of Wellbore Instability in Anisotropic Rocks
Authors: C. Deangeli, P. Obentaku Obenebot, O. Omwanghe
Abstract:
The study focuses on the analysis of wellbore instability in rock masses affected by weakness planes. Failure in this type of rock can occur in the rock matrix and/or along the weakness planes, in relation to the mud weight gradient. In such cases, the simple Kirsch solution coupled with a failure criterion cannot supply a suitable scenario for borehole instabilities. Two different numerical approaches have been used to investigate the onset of local failure at the wall of a borehole. For each approach, the influence of the inclination of the weakness planes has been investigated by considering joint sets at 0°, 35° and 90° to the horizontal. The first set of models was carried out with FLAC 2D (Fast Lagrangian Analysis of Continua), treating the rock material as a continuous medium, with a Mohr-Coulomb criterion for the rock matrix and the ubiquitous joint model to account for the presence of the weakness planes. In this model, yield may occur in the solid, along the weak plane, or both, depending on the stress state, the orientation of the weak plane and the material properties of the solid and weak plane. The second set of models was performed with PFC2D (Particle Flow Code). This code is based on the Discrete Element Method and considers the rock material as an assembly of grains bonded by cement-like materials, with pore spaces. The presence of weakness planes is simulated by degrading the bonds between grains along given directions. In general, the results of the two approaches are in agreement. However, the discrete approach seems to capture more complex phenomena related to local failure, in the form of grain detachment at the wall of the borehole. In fact, the presence of weakness planes in the discontinuous medium leads to local instability along the weak planes even in conditions not predicted by the continuous solution. In general, slip failure locations and directions do not follow the conventional wellbore breakout direction but depend upon the internal friction angle and the orientation of the bedding planes. When the weakness planes are at 0° or 90°, the behaviour is similar to that of a continuous rock material, but borehole instability is more severe when the weakness planes are inclined at an angle between 0° and 90° to the horizontal. In conclusion, the results of the numerical simulations show that the prediction of local failure at the wall of the wellbore cannot disregard the presence of weakness planes and, consequently, the higher mud weight required for stability at any specific inclination of the joints. Although the discrete approach can simulate only smaller areas, because of the large number of particles required to generate the rock material, it seems to investigate more correctly the occurrence of failure at the microscale and, eventually, the propagation of the failed zone to a larger portion of rock around the wellbore.
Keywords: continuous-discontinuous, numerical modelling, weakness planes, wellbore, FLAC 2D
Procedia PDF Downloads 502
3732 Naphtha Catalytic Reform: Modeling and Simulation of Unity
Authors: Leal Leonardo, Pires Carlos Augusto de Moraes, Casiraghi Magela
Abstract:
In this work, the modeling and simulation of the catalytic reforming process were carried out in a comprehensive way, considering all the equipment that influences the operating performance. A semi-regenerative reformer was considered, with four reactors in series interspersed with four furnaces, two heat exchangers, one product separator and one recycle compressor. A simplified reaction system was considered, involving only ten chemical compounds related through five reactions. The process considered was applied to aromatics production (benzene, toluene, and xylene). The models developed for the various pieces of equipment were interconnected in a simulator consisting of a computer program written in FORTRAN 77. The simulation of the global model representing the reforming unit achieved results that are compatible with those in the literature. It was then possible to study the effects of operational variables on product concentrations and on the performance of the unit's equipment.
Keywords: catalytic reforming, modeling, simulation, petrochemical engineering
Procedia PDF Downloads 519
3731 Estimating Big Five Personality Expressions with a Tiered Information Framework
Authors: Laura Kahn, Paul Rodrigues, Onur Savas, Shannon Hahn
Abstract:
An empirical understanding of an individual's personality expression can have a profound impact on organizations seeking to strengthen team performance and improve employee retention, since a team's personality composition can affect overall performance. Creating a tiered information framework that leverages proxies for a user's social context together with lexical and linguistic content provides insight into location-specific personality expression. We leverage the tiered framework to examine domain-specific, psychological, and lexical cues within social media posts. We apply DistilBERT natural language transfer learning models to real-world data to examine the relationship between Big Five personality expressions of people in Science, Technology, Engineering and Math (STEM) fields.
Keywords: big five, personality expression, social media analysis, workforce development
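A rough sketch of the transfer-learning setup described above, assuming the Hugging Face transformers library: DistilBERT is used as a frozen sentence encoder, and a simple regressor maps the pooled embeddings to a Big Five trait score. The example posts and trait labels are hypothetical, not the study's data:

```python
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import Ridge

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
encoder = AutoModel.from_pretrained("distilbert-base-uncased")
encoder.eval()

def embed(texts, batch_size=16):
    """Masked mean-pooled DistilBERT embeddings for a list of social media posts."""
    vecs = []
    with torch.no_grad():
        for i in range(0, len(texts), batch_size):
            enc = tokenizer(texts[i:i + batch_size], padding=True,
                            truncation=True, max_length=128, return_tensors="pt")
            hidden = encoder(**enc).last_hidden_state          # (batch, tokens, 768)
            mask = enc["attention_mask"].unsqueeze(-1)
            pooled = (hidden * mask).sum(1) / mask.sum(1)      # ignore padding tokens
            vecs.append(pooled.numpy())
    return np.vstack(vecs)

# Hypothetical labelled data: posts with an openness score in [0, 1];
# in practice each Big Five trait gets its own regressor.
posts = ["Excited to present our robotics project today!",
         "Quiet weekend debugging the telescope control code."]
openness_scores = np.array([0.8, 0.6])

X = embed(posts)
model = Ridge(alpha=1.0).fit(X, openness_scores)
print(model.predict(X))
```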
Procedia PDF Downloads 142
3730 Cubic Trigonometric B-Spline Approach to Numerical Solution of Wave Equation
Authors: Shazalina Mat Zin, Ahmad Abd. Majid, Ahmad Izani Md. Ismail, Muhammad Abbas
Abstract:
The generalized wave equation models various problems in science and engineering. In this paper, a new three-time-level implicit approach based on cubic trigonometric B-splines for the approximate solution of the wave equation is developed. The usual finite difference approach is used to discretize the time derivative, while the cubic trigonometric B-spline is applied as an interpolating function in the space dimension. Von Neumann stability analysis is used to analyze the proposed method. Two problems are discussed to exhibit the feasibility and capability of the method. The absolute errors and maximum error are computed to assess the performance of the proposed method. The results were found to be in good agreement with known solutions and with existing schemes in the literature.
Keywords: collocation method, cubic trigonometric B-spline, finite difference, wave equation
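For context, a generic sketch of the type of discretization the abstract refers to (the notation is illustrative, not the paper's exact scheme): the wave equation, the three-time-level finite difference in time, and the cubic trigonometric B-spline expansion in space,

```latex
u_{tt} = c^{2} u_{xx}, \qquad
\frac{u_j^{\,n+1} - 2u_j^{\,n} + u_j^{\,n-1}}{\Delta t^{2}} \approx c^{2}\left(u_{xx}\right)_j, \qquad
u(x, t_n) \approx \sum_{i} c_i^{\,n}\, TB_i^{4}(x),
```

where TB_i^4 denotes the cubic trigonometric B-spline basis; in an implicit variant the spatial term is taken as a weighted average over the three time levels, collocation at the knots yields a tridiagonal system for the coefficients at each new time level, and the von Neumann analysis is applied to the resulting recurrence.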
Procedia PDF Downloads 544
3729 Comparison of Equivalent Linear and Non-Linear Site Response Model Performance in Kathmandu Valley
Authors: Sajana Suwal, Ganesh R. Nhemafuki
Abstract:
Evaluation of ground response under earthquake shaking is crucial in geotechnical earthquake engineering. Damage due to seismic excitation is mainly correlated with local geological and geotechnical conditions. It is evident from past earthquakes (e.g. 1906 San Francisco, USA; 1923 Kanto, Japan) that local geology has a strong influence on the amplitude and duration of ground motions. Since then, significant studies have been conducted on ground motion amplification, revealing the important influence of local geology on ground response. Observations from damaging earthquakes (e.g. Niigata and San Francisco, 1964; Irpinia, 1980; Mexico, 1985; Kobe, 1995; L'Aquila, 2009) showed that non-uniform damage patterns, particularly in soft fluvio-lacustrine deposits, are due to local amplification of seismic ground motion. Non-uniform damage patterns were also observed in the Kathmandu Valley during the 1934 Bihar-Nepal earthquake and the recent 2015 Gorkha earthquake, seemingly due to the modification of earthquake ground motion parameters. In this study, site effects resulting from the amplification of soft soil in Kathmandu are presented. A large amount of subsoil data was collected and used to define an appropriate subsoil model for the Kathmandu valley. A comparative study of one-dimensional total-stress equivalent linear and non-linear site response is performed using four strong ground motions for six sites in the Kathmandu valley. In general, one-dimensional (1D) site-response analysis involves the excitation of a soil profile using the horizontal component and calculating the response at individual soil layers. In the present study, both equivalent linear and non-linear site response analyses were conducted using the computer program DEEPSOIL. The results show that there is no significant deviation between the equivalent linear and non-linear site response models until the maximum strain reaches 0.06-0.1%. Overall, it is clearly observed from the results that the non-linear site response model performs better than the equivalent linear model. However, the significant deviation between the two models also results from other influencing factors, such as the assumptions made in 1D site response, the lack of accurate shear wave velocity values and the nonlinear properties of the soil deposit. The results are also presented in terms of amplification factors, which are predicted to be around four times higher in the non-linear analysis than in the equivalent linear analysis. Hence, the nonlinear behavior of the soil underlines the urgent need to study the dynamic characteristics of the soft soil deposit so that site-specific design spectra can be defined for the Kathmandu valley, enabling structures resilient to future damaging earthquakes.
Keywords: deep soil, equivalent linear analysis, non-linear analysis, site response
Procedia PDF Downloads 295
3728 Effect of Chemical Fertilizer on Plant Growth-Promoting Rhizobacteria in Wheat
Authors: Tessa E. Reid, Vanessa N. Kavamura, Maider Abadie, Adriana Torres-Ballesteros, Mark Pawlett, Ian M. Clark, Jim Harris, Tim Mauchline
Abstract:
The deleterious effect of chemical fertilizer on rhizobacterial diversity has been well documented using 16S rRNA gene amplicon sequencing and predictive metagenomics. Biofertilization is a cost-effective and sustainable alternative, and improving biofertilization strategies depends on isolating beneficial soil microorganisms. Although culturing is widespread in biofertilization, it is unknown whether the composition of cultured isolates closely mirrors native beneficial rhizobacterial populations. This study aimed to determine the relative abundance of culturable plant growth-promoting rhizobacteria (PGPR) isolates within total soil DNA and how potential PGPR populations respond to chemical fertilization in a commercial wheat variety. It was hypothesized that PGPR would be reduced in fertilized relative to unfertilized wheat. Triticum aestivum cv. Cadenza seeds were sown in a nutrient-depleted agricultural soil in pots treated with and without nitrogen-phosphorus-potassium (NPK) fertilizer. Rhizosphere and rhizoplane samples were collected at the flowering stage (10 weeks) and analyzed by culture-independent (amplicon sequence variant (ASV) analysis of total rhizobacterial DNA) and culture-dependent (isolation using growth media) techniques. Rhizosphere- and rhizoplane-derived microbiota culture collections were tested for plant growth-promoting traits using functional bioassays. In general, fertilizer addition decreased the proportion of nutrient-solubilizing bacteria (nitrate, phosphate, potassium, iron and zinc) isolated from the rhizocompartments of wheat, whereas salt-tolerant bacteria were not affected. A PGPR database was created from isolate 16S rRNA gene sequences and searched against total soil DNA, revealing that 1.52% of total community ASVs were identified as culturable PGPR isolates. Bioassays identified a higher proportion of PGPR in non-fertilized samples (rhizosphere (49%) and rhizoplane (91%)) compared to fertilized samples (rhizosphere (21%) and rhizoplane (19%)), which constituted approximately 1.95% and 1.25% of the non-fertilized and fertilized total community DNA, respectively. The analyses of 16S rRNA genes and deduced functional profiles provide an in-depth understanding of the responses of bacterial communities to fertilizer; this study suggests that rhizobacteria that potentially benefit plants by mobilizing insoluble nutrients in soil are reduced by chemical fertilizer addition. This knowledge will benefit the development of more targeted biofertilization strategies.
Keywords: bacteria, fertilizer, microbiome, rhizoplane, rhizosphere
Procedia PDF Downloads 311
3727 Distorted Document Images Dataset for Text Detection and Recognition
Authors: Ilia Zharikov, Philipp Nikitin, Ilia Vasiliev, Vladimir Dokholyan
Abstract:
With the increasing popularity of document analysis and recognition systems, text detection (TD) and optical character recognition (OCR) in document images have become challenging tasks. However, to the best of our knowledge, no publicly available datasets for these particular problems exist. In this paper, we introduce the Distorted Document Images dataset (DDI-100) and provide a detailed analysis of DDI-100 in its current state. To create the dataset we collected 7,000 unique document pages and extended the collection by applying different types of distortions and geometric transformations. In total, DDI-100 contains more than 100,000 document images together with binary text masks and text and character locations in terms of bounding boxes. We also present an analysis of several state-of-the-art TD and OCR approaches on the presented dataset. Lastly, we demonstrate the usefulness of DDI-100 for improving the accuracy and stability of the considered TD and OCR models.
Keywords: document analysis, open dataset, optical character recognition, text detection
Procedia PDF Downloads 179
3726 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest, and any higher resolution is lost in this resampling. When the topographic features are computed through regression performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point; the number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance: any doubling of window size in each direction takes only a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm: the resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 images in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the length scales that are characteristic locally; the result is of higher resolution and less affected by noise than existing techniques.
Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
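A compact sketch of the aggregation idea described above (NumPy only): the sufficient statistics of a least-squares plane fit are additive, so each pass merges 2x2 blocks from the previous level, and the slope reported at each point comes from the aggregation level with minimum residual variance. For brevity this sketch uses non-overlapping windows rather than the paper's per-raster-point overlapping windows, and it assumes the DEM dimensions are divisible by 2^levels:

```python
import numpy as np

def block_sum(a):
    """Merge non-overlapping 2x2 blocks by summation (additivity of the moments)."""
    return a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]

def scale_adaptive_slope(dem, cell=1.0, levels=5):
    rows, cols = dem.shape
    y, x = np.mgrid[0:rows, 0:cols] * cell
    # Sufficient statistics of the plane fit z ~ a*x + b*y + c; all additive under aggregation.
    stats = {k: v.astype(float) for k, v in
             dict(n=np.ones_like(dem), x=x, y=y, z=dem, xx=x*x, yy=y*y,
                  xy=x*y, xz=x*dem, yz=y*dem, zz=dem*dem).items()}
    best_slope = best_var = best_scale = None
    for level in range(1, levels + 1):
        stats = {k: block_sum(v) for k, v in stats.items()}          # one pass per doubling
        n = stats['n']
        # Normal equations of the plane fit, solved per aggregated window.
        A = np.stack([np.stack([stats['xx'], stats['xy'], stats['x']], -1),
                      np.stack([stats['xy'], stats['yy'], stats['y']], -1),
                      np.stack([stats['x'],  stats['y'],  n         ], -1)], -2)
        b = np.stack([stats['xz'], stats['yz'], stats['z']], -1)
        coef = np.linalg.solve(A, b[..., None])[..., 0]
        a_, b_, c_ = coef[..., 0], coef[..., 1], coef[..., 2]
        var = (stats['zz'] - a_*stats['xz'] - b_*stats['yz'] - c_*stats['z']) / n  # residual variance
        slope = np.degrees(np.arctan(np.hypot(a_, b_)))
        rep = 2 ** level
        var_f = np.kron(var, np.ones((rep, rep)))[:rows, :cols]      # back to full resolution
        slope_f = np.kron(slope, np.ones((rep, rep)))[:rows, :cols]
        if best_var is None:
            best_var, best_slope, best_scale = var_f, slope_f, np.full_like(var_f, rep)
        else:
            better = var_f < best_var                                # keep the minimum-variance scale
            best_var = np.where(better, var_f, best_var)
            best_slope = np.where(better, slope_f, best_slope)
            best_scale = np.where(better, rep, best_scale)
    return best_slope, best_scale, best_var

# Tilted plane plus noise: the selected scale grows where noise dominates the signal.
dem = np.random.default_rng(0).normal(0, 0.1, (256, 256)) + np.linspace(0, 25, 256)[None, :]
slope, scale, var = scale_adaptive_slope(dem, cell=1.0, levels=5)
print(round(slope.mean(), 2), np.unique(scale))
```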
Procedia PDF Downloads 130
3725 Using Combination of Sets of Features of Molecules for Aqueous Solubility Prediction: A Random Forest Model
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Generally, absorption and bioavailability increase if solubility increases; therefore, it is crucial to predict aqueous solubility in drug discovery applications. Molecular descriptors and molecular properties are traditionally used for the prediction of water solubility. Various key descriptor sets are used for this purpose, namely Dragon descriptors, Morgan descriptors, MACCS keys, etc., and each has different prediction capabilities, with success varying between data sets. Structural features are another source of information commonly used for the prediction of solubility. However, there are few studies that combine three or more properties or descriptor sets to produce a more powerful prediction model. Unlike available models, we used a combination of these features in a random forest machine learning model to improve solubility prediction and thereby contribute to drug discovery systems.
Keywords: solubility, random forest, molecular descriptors, MACCS keys
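A minimal sketch of the feature-combination idea, assuming RDKit and scikit-learn are available; the (SMILES, logS) pairs are illustrative placeholders to be replaced with a curated aqueous solubility data set:

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem, Descriptors, MACCSkeys
from sklearn.ensemble import RandomForestRegressor

def featurize(smiles):
    """Concatenate a Morgan fingerprint, MACCS keys and a few physicochemical descriptors."""
    mol = Chem.MolFromSmiles(smiles)
    morgan = np.array(AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=1024))
    maccs = np.array(MACCSkeys.GenMACCSKeys(mol))
    props = np.array([Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                      Descriptors.TPSA(mol), Descriptors.NumHDonors(mol),
                      Descriptors.NumHAcceptors(mol), Descriptors.NumRotatableBonds(mol)])
    return np.concatenate([morgan, maccs, props])

# Placeholder training data: (SMILES, logS) pairs with illustrative target values.
data = [("CCO", 1.10), ("c1ccccc1", -1.64), ("CC(=O)Oc1ccccc1C(=O)O", -1.72),
        ("CCCCCCCC", -5.24)]
X = np.vstack([featurize(s) for s, _ in data])
y = np.array([log_s for _, log_s in data])

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
print("predicted logS for toluene:",
      round(model.predict(featurize("Cc1ccccc1").reshape(1, -1))[0], 2))
```

With a real data set, the combined feature vector is what distinguishes this approach from models built on a single descriptor family.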
Procedia PDF Downloads 50
3724 Managers’ Mobile Information Behavior in an Openness Paradigm Era
Authors: Abd Latif Abdul Rahman, Zuraidah Arif, Muhammad Faizal Iylia, Mohd Ghazali, Asmadi Mohammed Ghazali
Abstract:
Mobile information is a significant access point for human information activities. Theories and models of human information behavior have developed over several decades but have not yet considered the role of the user's computing device in digital information interactions. This paper reviews the literature that leads to the development of a conceptual framework for a study on managers' mobile information behavior. Based on the literature review, dimensions of mobile information behavior are identified, namely information needs, information access, information retrieval and information use. The study is significant for understanding the nature of librarians' behavior in searching, retrieving and using information via mobile devices. Secondly, the study provides suggestions about the various kinds of mobile applications that organizations can offer their staff to improve their services.
Keywords: mobile information behavior, information behavior, mobile information, mobile devices
Procedia PDF Downloads 352
3723 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel
Authors: Tarek Litim, Ouahiba Taamallah
Abstract:
The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. This work presents a statistical study based on regression, and Taguchi's design has allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and allowed the optimal processing parameters to be identified. ANOVA of the results validated the prediction models with determination coefficients of R=90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P=10 kgf, i=3 passes, and f=0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirability values of D=0.99 and 0.95 for roughness and hardness, respectively.
Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA
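A small sketch of the multi-objective step described above: Derringer-type desirability functions combine "minimize roughness" and "maximize hardness" into a single score evaluated over fitted response-surface models. The quadratic models and desirability ranges below are placeholders, not the paper's fitted coefficients:

```python
import numpy as np
from itertools import product

# Placeholder response-surface models standing in for the fitted regression equations.
def roughness(P, i, f):   # smaller is better
    return 1.2 - 0.04 * P - 0.10 * i + 6.0 * f + 0.002 * P ** 2

def hardness(P, i, f):    # larger is better
    return 250 + 8.0 * P + 12.0 * i - 150.0 * f - 0.3 * P ** 2

def d_smaller_better(y, y_min, y_max):
    return np.clip((y_max - y) / (y_max - y_min), 0.0, 1.0)

def d_larger_better(y, y_min, y_max):
    return np.clip((y - y_min) / (y_max - y_min), 0.0, 1.0)

best = None
for P, i, f in product(np.linspace(5, 20, 16), [1, 2, 3, 4, 5], np.linspace(0.05, 0.2, 16)):
    Ra, HV = roughness(P, i, f), hardness(P, i, f)
    # Composite desirability: geometric mean of the individual desirabilities.
    D = np.sqrt(d_smaller_better(Ra, 0.4, 1.5) * d_larger_better(HV, 250, 330))
    if best is None or D > best[0]:
        best = (D, P, i, f, Ra, HV)

D, P, i, f, Ra, HV = best
print(f"D={D:.2f} at P={P:.1f} kgf, i={i} passes, f={f:.3f} mm/rev (Ra={Ra:.2f}, HV={HV:.0f})")
```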
Procedia PDF Downloads 196
3722 Forecasting Issues in Energy Markets within a Reg-ARIMA Framework
Authors: Ilaria Lucrezia Amerise
Abstract:
Electricity markets throughout the world have undergone substantial changes. Accurate, reliable, clear and comprehensible modeling and forecasting of different variables (loads and prices in the first instance) have achieved increasing importance. In this paper, we describe the current state of the art, focusing on reg-SARMA methods, which have proven to be flexible enough to accommodate electricity price/load behavior satisfactorily. More specifically, we discuss: 1) the dichotomy between point and interval forecasts; 2) the difficult choice between stochastic predictors (e.g. climatic variation) and deterministic predictors (e.g. calendar variables); 3) the confrontation between modelling a single aggregate time series and creating separate, potentially different models for the sub-series. The noteworthy point we would like to bring out is that prices and loads require different approaches that appear irreconcilable, even though they must be reconciled in the interests and activities of energy companies.
Keywords: interval forecasts, time series, electricity prices, reg-SARIMA methods
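A minimal sketch of a reg-SARIMA(X) fit of the kind discussed, using statsmodels with a deterministic calendar regressor (weekend indicator) alongside the stochastic seasonal component, and producing both point and interval forecasts; the hourly load series here is synthetic:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic hourly load with a daily cycle, standing in for a real series.
idx = pd.date_range("2023-01-01", periods=24 * 60, freq="h")
load = 1000 + 150 * np.sin(2 * np.pi * idx.hour / 24) + np.random.normal(0, 20, len(idx))
y = pd.Series(load, index=idx)

# Deterministic calendar regressor: weekend indicator.
exog = pd.DataFrame({"weekend": (idx.dayofweek >= 5).astype(float)}, index=idx)

# Regression with SARMA errors: exogenous calendar effect + stochastic daily seasonality.
model = SARIMAX(y[:-24], exog=exog[:-24], order=(1, 0, 1), seasonal_order=(1, 0, 1, 24))
res = model.fit(disp=False)

fc = res.get_forecast(steps=24, exog=exog[-24:])
print(fc.predicted_mean.head())        # point forecasts
print(fc.conf_int(alpha=0.05).head())  # 95% interval forecasts
```

The same skeleton can be refitted per sub-series (e.g. per hour or per zone) to compare aggregate versus disaggregated modelling, which is the third issue raised in the abstract.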
Procedia PDF Downloads 134
3721 Secure Optical Communication System Using Quantum Cryptography
Authors: Ehab AbdulRazzaq Hussein
Abstract:
Quantum cryptography (QC) is an emerging technology for secure key distribution with single-photon transmissions. In contrast to classical cryptographic schemes, the security of QC schemes is guaranteed by the fundamental laws of nature; their security stems from the impossibility of distinguishing non-orthogonal quantum states with certainty. A potential eavesdropper introduces errors into the transmissions, which can later be discovered by the legitimate participants of the communication. In this paper, a modeling approach is proposed for the QC protocol BB84 using polarization coding. A single-photon source is assumed in the designed models, so Eve cannot use a beam-splitting strategy to eavesdrop on the quantum channel; the only eavesdropping strategy available to Eve is the intercept/resend strategy. After the quantum transmission of the QC protocol, the quantum bit error rate (QBER) is estimated and compared with a threshold value; if it is above this value, the procedure must be stopped and repeated later.
Keywords: security, key distribution, cryptography, quantum protocols, Quantum Cryptography (QC), Quantum Key Distribution (QKD)
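A self-contained sketch of the QBER check for BB84 under a full intercept/resend attack (NumPy only). With single photons and this attack the expected sifted-key error rate is 25%; the 11% abort threshold below is illustrative, not the paper's chosen value:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000                                   # number of transmitted single photons

alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)           # 0 = rectilinear, 1 = diagonal

def measure(bits, prep_bases, meas_bases):
    """Outcome is deterministic if preparation and measurement bases match, random otherwise."""
    random_bits = rng.integers(0, 2, len(bits))
    return np.where(prep_bases == meas_bases, bits, random_bits)

# Eve intercepts every photon in a random basis and resends what she measured.
eve_bases = rng.integers(0, 2, n)
eve_bits = measure(alice_bits, alice_bases, eve_bases)

# Bob measures the resent photons in his own random bases.
bob_bases = rng.integers(0, 2, n)
bob_bits = measure(eve_bits, eve_bases, bob_bases)

# Sifting: keep only positions where Alice's and Bob's bases coincide.
sift = alice_bases == bob_bases
qber = np.mean(alice_bits[sift] != bob_bits[sift])
print(f"QBER with intercept/resend: {qber:.3f}")   # close to the theoretical 0.25

threshold = 0.11                                   # illustrative abort threshold
print("abort and retry later" if qber > threshold else "proceed with key distillation")
```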
Procedia PDF Downloads 409
3720 Probing Mechanical Mechanism of Three-Hinge Formation on a Growing Brain: A Numerical and Experimental Study
Authors: Mir Jalil Razavi, Tianming Liu, Xianqiao Wang
Abstract:
Cortical folding, characterized by convex gyri and concave sulci, has an intrinsic relationship to the brain's functional organization. Understanding the mechanism behind the brain's convoluted patterns can provide useful clues into normal and pathological brain function. During development, the cerebral cortex experiences a noticeable expansion in volume and surface area accompanied by tremendous tissue folding, which may be attributed to many possible factors. Despite decades of endeavors, the fundamental mechanism and key regulators of this crucial process remain incompletely understood. Therefore, to take even a small step toward unraveling the mystery of brain folding, we present a mechanical model of the mechanism of 3-hinge formation in a growing brain, which has not been addressed before. A 3-hinge is defined as a gyral region where three gyral crests (hinge-lines) join. How and why the brain prefers to develop 3-hinges has not been answered very well. Therefore, we offer a theoretical and computational explanation of the mechanism of 3-hinge formation in a growing brain and validate it with experimental observations. In the theoretical approach, the dynamic behavior of brain tissue is examined and described with the aid of a large-strain, nonlinear constitutive model. The derived constitutive model is used in the computational model to define material behavior. Since the theoretical approach cannot predict the evolution of complex cortical convolution after instability, non-linear finite element models are employed to study 3-hinge formation and the secondary morphological folds of the developing brain. Three-dimensional (3D) finite element analyses of a multi-layer soft tissue model that mimics a small piece of the brain are performed to investigate the fundamental mechanism of consistent hinge formation in cortical folding. Results show that after a certain amount of cortical growth, the mechanical model becomes unstable and then, through the formation of creases, enters a new configuration with lower strain energy. With further growth of the model, the shallow creases develop into convoluted patterns and then into 3-hinge patterns. Simulation results related to 3-hinges show good agreement with experimental observations from macaque, chimpanzee and human brain images. These results have great potential to reveal fundamental principles of brain architecture and to produce a unified theoretical framework that convincingly explains the intrinsic relationship between cortical folding and 3-hinge formation. This fundamental understanding of the intrinsic relationship between cortical folding and 3-hinge formation could potentially shed new light on the diagnosis of many brain disorders such as schizophrenia, autism, lissencephaly and polymicrogyria.
Keywords: brain, cortical folding, finite element, three hinge
Procedia PDF Downloads 238
3719 A Comparison of Neural Network and DOE-Regression Analysis for Predicting Resource Consumption of Manufacturing Processes
Authors: Frank Kuebler, Rolf Steinhilper
Abstract:
Artificial neural networks (ANN) as well as Design of Experiments (DOE) based regression analysis (RA) are mainly used for the modeling of complex systems. Both methodologies are commonly applied in process and quality control of manufacturing processes. Since resource efficiency has become a critical concern for manufacturing companies, these models need to be extended to predict the resource consumption of manufacturing processes. This paper describes an approach that uses neural networks as well as DOE-based regression analysis to predict the resource consumption of manufacturing processes and compares the achievable results based on an industrial case study of a turning process.
Keywords: artificial neural network, design of experiments, regression analysis, resource efficiency, manufacturing process
Procedia PDF Downloads 527
3718 Solutions of Fractional Reaction-Diffusion Equations Used to Model the Growth and Spreading of Biological Species
Authors: Kamel Al-Khaled
Abstract:
Reaction-diffusion equations are commonly used in population biology to model the spread of biological species. In this paper, we propose a fractional reaction-diffusion equation in which the classical second-derivative diffusion term is replaced by a fractional derivative of order less than two. Based on the symbolic computation system Mathematica, the Adomian decomposition method, developed for fractional differential equations, is directly extended to derive explicit and numerical solutions of space-fractional reaction-diffusion equations. The fractional derivative is described in the Caputo sense. Finally, the recent appearance of fractional reaction-diffusion equations as models in fields such as cell biology, chemistry, physics, and finance makes it necessary to apply the results reported here to some numerical examples.
Keywords: fractional partial differential equations, reaction-diffusion equations, Adomian decomposition, biological species
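For reference, a generic one-dimensional form of the model class described above (the notation is illustrative, not the paper's exact equations), with the space-fractional term understood in the Caputo sense:

```latex
\frac{\partial u(x,t)}{\partial t}
  = d\,\frac{\partial^{\alpha} u(x,t)}{\partial x^{\alpha}} + f\bigl(u(x,t)\bigr),
  \quad 1 < \alpha \le 2,
\qquad
\frac{\partial^{\alpha} u}{\partial x^{\alpha}}
  = \frac{1}{\Gamma(2-\alpha)} \int_{0}^{x} (x-s)^{\,1-\alpha}\,
    \frac{\partial^{2} u(s,t)}{\partial s^{2}}\, ds .
```

As α → 2 the Caputo derivative reduces to the classical diffusion term; the Adomian decomposition method writes u as a series Σ uₙ, expands the nonlinear reaction term f(u) in Adomian polynomials, and the resulting recursion is evaluated symbolically, for example in Mathematica.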
Procedia PDF Downloads 377
3717 A Systematic Review of Situational Awareness and Cognitive Load Measurement in Driving
Authors: Aly Elshafei, Daniela Romano
Abstract:
With the development of autonomous vehicles, a human-machine interaction (HMI) system is needed for a safe transition of control when a takeover request (TOR) is required. An important part of the HMI system is the ability to monitor the level of situational awareness (SA) of any driver in real time, in different scenarios, and without any pre-calibration. The purpose of this systematic review is to present state-of-the-art machine learning models used to measure SA, to investigate the limitations of each type of sensor and the remaining gaps, and to identify the sensor and computational model best suited to driving applications. To the authors' best knowledge, this is the first literature review identifying online and offline classification methods used to measure SA, explaining which measurements are subject- or session-specific, and how many classifications can be made with each classification model. This information can be very useful for researchers measuring SA to identify the model best suited to measuring SA for different applications.
Keywords: situational awareness, autonomous driving, gaze metrics, EEG, ECG
Procedia PDF Downloads 121
3716 Automated 3D Segmentation System for Detecting Tumor and Its Heterogeneity in Patients with High Grade Ovarian Epithelial Cancer
Authors: Dimitrios Binas, Marianna Konidari, Charis Bourgioti, Lia Angela Moulopoulou, Theodore Economopoulos, George Matsopoulos
Abstract:
High grade ovarian epithelial cancer (OEC) is a fatal gynecological cancer, and the poor prognosis of this entity is closely related to considerable intratumoral genetic heterogeneity. By examining imaging data, it is possible to assess the heterogeneity of tumorous tissue. This study proposes a methodology for aligning, segmenting and finally visualizing information from various magnetic resonance imaging series in order to construct 3D models of heterogeneity maps from the same tumor in OEC patients. The proposed system may be used as an adjunct digital tool by health professionals for personalized medicine, as it allows for an easy visual assessment of the heterogeneity of the examined tumor.
Keywords: image segmentation, ovarian epithelial cancer, quantitative characteristics, image registration, tumor visualization
Procedia PDF Downloads 216