Search results for: target error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4488

2118 Schiff Bases of Isatin and Adamantane-1-Carbohydrazide: Synthesis, Characterization, and Anticonvulsant Activity

Authors: Hind O. Osman, Tilal Elsaman, Bashir A. Yousef, Esraa Elhadi, Aimun A. E. Ahmed, Eyman Mohamed Eltayib, Malik Suliman Mohamed, Magdi Awadalla Mohamed

Abstract:

Epilepsy is among the most common neurological conditions and a cause of substantial morbidity and mortality. In the present study, the molecular hybridization tool was adopted to obtain six Schiff bases of isatin and adamantane-1-carbohydrazide (18–23). Their anticonvulsant activity was then evaluated in a pentylenetetrazole (PTZ)-induced seizure model, with phenobarbitone as a positive control. Our findings showed that compounds 18–23 provided significant protection against PTZ-induced seizures, with maximum activity associated with compound 23. Moreover, all investigated compounds increased the latency of induced convulsions and reduced seizure duration, with compound 23 performing best. Interestingly, most of the synthesized molecules reduced the neurological symptoms and severity of the seizures. Molecular docking studies suggest the GABA-A receptor as a potential target, and in silico ADME screening revealed that the pharmaceutical properties of compound 23 are within the specified limits. Thus, compound 23 was identified as a promising candidate that warrants further drug discovery work.

Keywords: isatin and adamantane, anticonvulsant activity, PTZ-induced seizure, molecular docking

Procedia PDF Downloads 198
2117 Detecting Earnings Management via Statistical and Neural Network Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts, and managers. This research aims to answer the following question: Is there a significant difference between regression models and neural network models in predicting earnings management, and which leads to a superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural network models: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study comprises 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean squared errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior in earnings management. It is therefore more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.
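
As a minimal sketch of such a comparison (synthetic data and illustrative hyperparameters, not the paper's dataset; scikit-learn has no built-in GRNN, so a simple kernel-regression stand-in is included):

```python
# Comparing LR, MLP and a minimal GRNN on synthetic nonlinear data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))          # stand-ins for accrual-based predictors
y = np.tanh(X @ [0.5, -1.0, 0.8, 0.2]) + 0.1 * rng.normal(size=500)  # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def grnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Generalized Regression NN = kernel-weighted average of training targets."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

models = {
    "LR": LinearRegression().fit(X_tr, y_tr).predict(X_te),
    "MLP": MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr).predict(X_te),
    "GRNN": grnn_predict(X_tr, y_tr, X_te),
}
for name, pred in models.items():
    print(name, mean_squared_error(y_te, pred))
```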

Keywords: earnings management, generalized regression neural network, multi-layer perceptron, Tehran stock exchange

Procedia PDF Downloads 416
2116 Primary School Teachers' Conceptual and Procedural Knowledge of Rational Numbers and Its Effects on Pupils' Achievement in Rational Numbers

Authors: Raliatu Mohammed Kashim

Abstract:

The study investigated primary school teachers' conceptual and procedural knowledge of rational numbers to determine how it affects pupils' achievement in rational numbers. Specifically, it explored primary school teachers' levels of conceptual and procedural knowledge of rational numbers and the effects of that knowledge on their pupils' understanding of rational numbers. The study was carried out in Bauchi State, Nigeria, using a multistage design: the first stage was a descriptive design, and the second stage involved a pre-test post-test only quasi-experimental design. The population of the study comprised six mathematics teachers holding the Nigerian Certificate in Education (NCE) teaching primary six, together with their two hundred and ten pupils in intact classes. Two instruments, namely the Conceptual and Procedural Knowledge Test (CPKT) and the Rational Number Achievement Test (RAT), were used for data collection. The data collected were analyzed using ANCOVA and Scheffe's test. The results revealed a significant difference between pupils taught by teachers with high conceptual and procedural knowledge and those taught by teachers with low conceptual and procedural knowledge.

Keywords: conceptual knowledge, procedural knowledge, rational numbers, multistage design

Procedia PDF Downloads 380
2115 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles

Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang

Abstract:

With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many respects. Automatic lane line extraction and modeling are the most essential steps in generating a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, a multi-region Otsu thresholding method is applied, which finds the laser intensity threshold that maximizes the between-class variance separating background from road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results on these datasets show that our method achieves excellent extraction and clustering performance, and the fitted lines reach high positional accuracy, with an error of less than 10 cm.
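
A minimal sketch of two of the named building blocks, on synthetic data (the multi-region split and the two-stage clustering are omitted, and an ordinary least-squares cubic fit stands in for the paper's Bayesian estimation):

```python
# Otsu's threshold on laser intensity values, then a cubic polynomial fit
# to one cluster of marking points. All data here is synthetic.
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the intensity threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                       # class-0 probability
    mu = np.cumsum(p * centers)             # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(between)]

rng = np.random.default_rng(1)
intensity = np.concatenate([rng.normal(40, 8, 5000),    # asphalt background
                            rng.normal(180, 15, 500)])  # retro-reflective paint
t = otsu_threshold(intensity)

# Cubic fit y = c3*x^3 + c2*x^2 + c1*x + c0 to points of one lane-line cluster
x = np.linspace(0, 50, 200)
y = 0.001 * x**3 - 0.05 * x**2 + 0.2 * x + rng.normal(0, 0.05, x.size)
coeffs = np.polyfit(x, y, deg=3)
print(f"threshold={t:.1f}, cubic coefficients={coeffs}")
```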

Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering

Procedia PDF Downloads 125
2114 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data

Authors: M. A. Meslem

Abstract:

For precise geoid determination, a reference field is used to subtract the long and medium wavelengths of the gravity field from the observation data when applying the remove-compute-restore technique. A comparison study between candidate models should therefore be made in order to select the optimal reference gravity field. In this context, two recent global geopotential models were selected for a comparison study over Northern Algeria: the Earth Gravitational Model EGM2008 and the Global Gravity Model GECO, the latter conceived as a combination of EGM2008 with the anomalous potential derived from a GOCE satellite-only global model. Free-air gravity anomalies in the study area were used to compute residual data with both gravity field models, and a Digital Terrain Model (DTM) was used to subtract the residual terrain effect from the gravity observations. The residual data were used to generate local empirical covariance functions, which were fitted to a closed form in order to compare their statistical behavior in both cases. Finally, height anomalies were computed from both geopotential models and compared to a set of GPS/levelling points on benchmarks using least-squares adjustment. The results described in detail in this paper point to a slight overall advantage of the GECO model, through error degree variance comparison and ground-truth evaluation.
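
The GPS/levelling comparison is commonly done with a low-order parametric least-squares adjustment; the sketch below assumes a standard 4-parameter corrector surface and synthetic data, not the author's exact adjustment model:

```python
# Fit a 4-parameter corrector surface to differences between GPS/levelling
# and model height anomalies. Coordinates and anomaly differences are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n = 40
lat = np.radians(rng.uniform(34, 37, n))       # Northern Algeria latitudes
lon = np.radians(rng.uniform(-1, 8, n))
d = 0.30 + 0.05 * np.sin(lat) + rng.normal(0, 0.04, n)  # zeta_GPS - zeta_model [m]

# d = a0 + a1*cos(lat)cos(lon) + a2*cos(lat)sin(lon) + a3*sin(lat)
A = np.column_stack([np.ones(n),
                     np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])
coeffs, *_ = np.linalg.lstsq(A, d, rcond=None)
residuals = d - A @ coeffs
print("RMS of fit residuals [m]:", np.sqrt(np.mean(residuals**2)))
```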

Keywords: quasigeoid, gravity anomalies, covariance, GGM

Procedia PDF Downloads 131
2113 Proposing an Architecture for Drug Response Prediction by Integrating Multiomics Data and Utilizing Graph Transformers

Authors: Nishank Raisinghani

Abstract:

Efficiently predicting drug response remains a challenge in drug discovery. To address this issue, we propose four model architectures that combine graphical representation with multiheaded self-attention mechanisms in varying positions. By leveraging two types of multi-omics data, transcriptomics and genomics, we create a comprehensive representation of target cells and enable drug response prediction for precision medicine. Most of our architectures utilize two transformer models, one with a graph attention mechanism and the other with a multiheaded self-attention mechanism, to generate latent representations of the drug and omics data, respectively. By applying an attention mechanism to both the drug and multi-omics data, the architectures aim to procure more comprehensive latent representations. The latent representations are then concatenated and fed into a fully connected network to predict the IC50 score, a measure of cellular drug response. We experiment with all four architectures and report results for each. Our study contributes to the future of drug discovery and precision medicine by seeking to optimize both the speed and accuracy of drug response prediction.
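
A compact sketch of the shared skeleton (dimensions and encoder internals are placeholders; the actual graph-attention and self-attention transformers are reduced to simple stand-in encoders):

```python
# Two encoders produce latent vectors for drug and omics inputs; the vectors
# are concatenated and passed through a fully connected head to predict IC50.
import torch
import torch.nn as nn

class DrugResponseModel(nn.Module):
    def __init__(self, drug_dim=128, omics_dim=2048, latent=64):
        super().__init__()
        # Stand-ins for the graph-attention and self-attention transformers
        self.drug_encoder = nn.Sequential(nn.Linear(drug_dim, latent), nn.ReLU())
        self.omics_encoder = nn.Sequential(nn.Linear(omics_dim, latent), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(2 * latent, 32), nn.ReLU(),
                                  nn.Linear(32, 1))   # scalar IC50 prediction

    def forward(self, drug, omics):
        z = torch.cat([self.drug_encoder(drug), self.omics_encoder(omics)], dim=-1)
        return self.head(z)

model = DrugResponseModel()
pred = model(torch.randn(8, 128), torch.randn(8, 2048))
print(pred.shape)  # torch.Size([8, 1])
```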

Keywords: drug discovery, transformers, graph neural networks, multiomics

Procedia PDF Downloads 143
2112 A Framework of Product Information Service System Using Mobile Image Retrieval and Text Mining Techniques

Authors: Mei-Yi Wu, Shang-Ming Huang

Abstract:

Online shoppers nowadays often search for product information on the Internet using product keywords. To use this kind of search model, shoppers need a preliminary understanding of the products they are interested in and must choose the correct keywords. However, if a product is encountered for the first time (for example, clothes or a backpack worn by a passer-by, whose brand is unknown), it cannot be retrieved due to insufficient information. In this paper, we discuss and study E-commerce applications that use image retrieval and text mining techniques. We design an E-commerce application system with a three-layer architecture to provide users with product information. The system automatically searches for and retrieves similar images and corresponding web pages on the Internet based on target pictures taken by users. Text mining techniques are then applied to extract important keywords from the retrieved web pages, and a web crawler uses these keywords to search prices across different online shopping stores. Finally, users obtain product information, including photos and prices, for their products of interest. The experiments show the efficiency of the proposed system.
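
As one plausible realization of the keyword-extraction layer (the abstract does not name the algorithm, so TF-IDF is an assumption here):

```python
# Extract the highest-weighted terms from retrieved product pages via TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer

pages = [
    "lightweight hiking backpack 35L waterproof nylon daypack",
    "waterproof backpack for travel, laptop compartment, 35L",
    "nylon daypack with laptop sleeve, hiking and travel use",
]
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(pages)
terms = vec.get_feature_names_out()
weights = X.sum(axis=0).A1                 # summed TF-IDF weight per term
top = sorted(zip(terms, weights), key=lambda t: -t[1])[:5]
print(top)  # candidate keywords to feed the price-search crawler
```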

Keywords: mobile image retrieval, text mining, product information service system, online marketing

Procedia PDF Downloads 352
2111 Improved Image Retrieval for Efficient Localization in Urban Areas Using Location Uncertainty Data

Authors: Mahdi Salarian, Xi Xu, Rashid Ansari

Abstract:

Accurate localization of mobile devices based on camera-acquired visual media usually requires a search over a very large GPS-referenced image database. This paper proposes an efficient method for limiting the search space of an image retrieval engine by extracting and leveraging additional media information about the Estimated Positional Error (EPE), addressing complexity and accuracy issues in the search, especially for compensating GPS inaccuracy in dense urban areas. The improved performance is achieved with up to a hundred-fold reduction in the search area used by available reference methods, while also providing improved accuracy. To test our procedure, we created a database of Google Street View (GSV) images of downtown Chicago. Other available databases are not suitable for our approach because they lack EPE for the query images. We tested the procedure using more than 200 query images with EPE, acquired mostly in the densest areas of Chicago with different phones and under varied conditions, such as low illumination and from under rail tracks. The effectiveness of our approach and the effect of the size and sector angle of the search area are discussed, and experimental results demonstrate how the proposed method improves performance simply by utilizing data that are readily available on mobile systems such as smartphones.
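
The core search-space filter can be sketched as a radius query against the EPE (standard haversine distance; coordinates and the database below are illustrative):

```python
# Keep only database images within the GPS fix's Estimated Positional Error.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

query = {"lat": 41.8781, "lon": -87.6298, "epe_m": 50.0}  # downtown Chicago fix
database = [("gsv_001", 41.8785, -87.6290), ("gsv_002", 41.8900, -87.6200)]

candidates = [img_id for img_id, lat, lon in database
              if haversine_m(query["lat"], query["lon"], lat, lon) <= query["epe_m"]]
print(candidates)  # only these images go to the retrieval engine
```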

Keywords: localization, retrieval, GPS uncertainty, bag of words

Procedia PDF Downloads 282
2110 On the Fourth-Order Hybrid Beta Polynomial Kernels in Kernel Density Estimation

Authors: Benson Ade Eniola Afere

Abstract:

This paper introduces a family of fourth-order hybrid beta polynomial kernels developed for statistical analysis. The assessment of these kernels' performance centers on two critical metrics: asymptotic mean integrated squared error (AMISE) and kernel efficiency. Through the utilization of both simulated and real-world datasets, a comprehensive evaluation was conducted, facilitating a thorough comparison with conventional fourth-order polynomial kernels. The evaluation procedure encompassed the computation of AMISE and efficiency values for both the proposed hybrid kernels and the established classical kernels. The consistently observed trend was the superior performance of the hybrid kernels when compared to their classical counterparts. This trend persisted across diverse datasets, underscoring the resilience and efficacy of the hybrid approach. By leveraging these performance metrics and conducting evaluations on both simulated and real-world data, this study furnishes compelling evidence in favour of the superiority of the proposed hybrid beta polynomial kernels. The discernible enhancement in performance, as indicated by lower AMISE values and higher efficiency scores, strongly suggests that the proposed kernels offer heightened suitability for statistical analysis tasks when compared to traditional kernels.
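
For reference, the AMISE of an estimator built on a fourth-order kernel K has the standard form AMISE(h) = R(K)/(nh) + (mu4(K)/4!)^2 h^8 R(f''''), where R(g) is the integral of g squared and mu4 is the kernel's fourth moment. The sketch below evaluates this numerically for the classical fourth-order Gaussian kernel, a textbook example rather than one of the paper's hybrid beta kernels:

```python
# Numerical AMISE for a fourth-order kernel, with a Gaussian reference density.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def K4(x):
    """Fourth-order Gaussian kernel: (3 - x^2)/2 * phi(x)."""
    phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    return 0.5 * (3 - x**2) * phi

R_K = quad(lambda x: K4(x) ** 2, -10, 10)[0]
mu4 = quad(lambda x: x**4 * K4(x), -10, 10)[0]

# R(f'''') for the standard normal density (closed form: 105 / (32*sqrt(pi)))
R_f4 = 105 / (32 * np.sqrt(np.pi))

n = 1000
amise = lambda h: R_K / (n * h) + (mu4 / 24) ** 2 * h**8 * R_f4
opt = minimize_scalar(amise, bounds=(0.05, 2.0), method="bounded")
print(f"h* = {opt.x:.3f}, AMISE = {opt.fun:.5f}")
```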

Keywords: AMISE, efficiency, fourth-order kernels, hybrid kernels, kernel density estimation

Procedia PDF Downloads 65
2109 Radial Distortion Correction Based on the Concept of Verifying the Planarity of a Specimen

Authors: Shih-Heng Tung, Ming-Hsiang Shih, Wen-Pei Sung

Abstract:

Because of the rapid development of digital cameras and computers, the digital image correlation method has drawn much attention recently and has been applied in a variety of fields. However, image distortion is inevitable when an image is captured through a lens, and this distortion can introduce a non-negligible error into digital image correlation measurements. There are already many ways to correct image distortion, and most of them require specific image patterns or precise control points. A new distortion correction method is proposed in this study. The proposed method is based on the fact that a flat surface should remain flat when measured using a three-dimensional (3D) digital image measurement technique. Lens distortion can be divided into radial distortion, decentering distortion, and thin prism distortion. Because radial distortion has a more noticeable influence than the other types, this method deals only with radial distortion. The simplified 3D digital image measurement technique is adopted to measure the surface coordinates of a flat specimen, and the gradient method is then applied to find the best correction parameters. Several experiments were carried out to verify the correctness of this method. The results show that the method achieves good accuracy and is suitable for both large and small distortion conditions. Its most important advantage is that it requires neither marks with specific patterns nor precise control points.
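
A toy sketch of the underlying idea, under assumed conditions (two-parameter radial model, synthetic bowl-shaped measurements, and a generic least-squares solver in place of the authors' gradient method):

```python
# Fit radial distortion parameters k1, k2 so that measured 3D points of a
# flat specimen become maximally planar.
import numpy as np
from scipy.optimize import least_squares

def undistort(xy, k1, k2):
    """Apply the radial model x_u = x_d * (1 + k1 r^2 + k2 r^4)."""
    r2 = (xy ** 2).sum(axis=1, keepdims=True)
    return xy * (1 + k1 * r2 + k2 * r2 ** 2)

def planarity_residual(params, xy, z):
    k1, k2, a, b, c = params
    u = undistort(xy, k1, k2)
    # Residual = deviation of the reconstructed surface from the plane z = a*x + b*y + c
    return z - (a * u[:, 0] + b * u[:, 1] + c)

rng = np.random.default_rng(3)
xy = rng.uniform(-1, 1, (200, 2))                 # normalized image coordinates
r2 = (xy ** 2).sum(axis=1)
z = 0.02 * r2 + rng.normal(0, 1e-3, 200)          # bowl shape caused by distortion

fit = least_squares(planarity_residual, x0=[0, 0, 0, 0, 0], args=(xy, z))
print("estimated k1, k2:", fit.x[:2])
```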

Keywords: 3D DIC, radial distortion, distortion correction, planarity

Procedia PDF Downloads 547
2108 Perusing the Influence of a Visual Editor in Enabling PostgreSQL Query Learn-Ability

Authors: Manuela Nayantara Jeyaraj

Abstract:

PostgreSQL is an Object-Relational Database Management System (ORDBMS) with an architecture that ensures optimal-quality data management. Due to the overshadowing growth of similar ORDBMSs, however, PostgreSQL has not gained wide renown in the database user community. Although its features and built-in functionalities are often overlooked, PostgreSQL offers a vast range of utilities for data manipulation and hence deserves to be promoted more widely among users. Introducing PostgreSQL in a way that highlights its advantages requires endorsing learn-ability as an add-on, since the target groups considered consist of both amateur and professional PostgreSQL users. The scope of this paper covers providing easy contemplation of query formulations and flows through a visual editor, designed according to user-interface principles, that supports every aspect of making PostgreSQL learnable through self-directed operation and creation of queries within the visual editor. This paper scrutinizes the importance of choosing PostgreSQL as the working database environment, the visual perspectives that influence human behaviour and ultimately learning, the modes in which learn-ability can be provided via visualization, and the advantages reaped from the implementation of the proposed system features.

Keywords: database, learn-ability, PostgreSQL, query, visual-editor

Procedia PDF Downloads 168
2107 An Improved Method on Static Binary Analysis to Enhance the Context-Sensitive CFI

Authors: Qintao Shen, Lei Luo, Jun Ma, Jie Yu, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Control Flow Integrity (CFI) is one of the most promising techniques for defending against Code-Reuse Attacks (CRAs). Traditional CFI systems and recent context-sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control flow hijack occurs, leaving vast space for attackers at indirect call sites. Coarse CFGs make it difficult to decide which targets are legitimate at indirect control-flow transfers, and in practice they weaken existing CFI systems. Extracting CFGs precisely and completely from binaries remains an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call sites and at function entries. By comparing the number of parameters prepared before call sites with the number consumed by functions, the set of targets for indirect calls is reduced, so the control flow is more constrained at indirect call sites at runtime. We implement our policy in combination with CCFI. Experimental results on several popular programs show that our approach is efficient, and further analysis shows that it can mitigate COOP and other advanced attacks.
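
The parameter-count matching step can be illustrated with a toy sketch (the data model is invented; the paper works on binaries, not dictionaries):

```python
# An indirect call that prepares N argument registers may only target
# functions consuming at most N parameters.
callsite_args = {"callsite_0x4006f0": 2, "callsite_0x400812": 3}
function_params = {"handler_a": 2, "handler_b": 3, "handler_c": 5}

def allowed_targets(n_args_prepared, functions):
    # A function consuming more parameters than were prepared would read
    # garbage, so it is excluded from the indirect-call target set.
    return [f for f, n in functions.items() if n <= n_args_prepared]

for site, n in callsite_args.items():
    print(site, "->", allowed_targets(n, function_params))
```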

Keywords: context-sensitive, CFI, binary analysis, code reuse attack

Procedia PDF Downloads 315
2106 Effects of Using Clinical Guidelines for Feeding through a Gastrostomy Tube in Critically Ill Surgical Patients, Songkla Hospital, Thailand

Authors: Siriporn Sikkaphun

Abstract:

Food is essential for living, and receiving correct, suitable, and adequate food benefits the body, especially for patients, because it enables good recovery. Feeding through a gastrostomy tube is one useful and widely used method because it is easy, convenient, and economical. This study aimed to compare the effectiveness of care before and after using clinical guidelines for feeding through a gastrostomy tube in critically ill surgical patients. It is a pre-post quasi-experimental study of 15 critically ill surgical or accident patients who required intubation and a gastrostomy tube, conducted from August 2011 to November 2012. The data were collected using the guidelines and an evaluation form for the effectiveness of the guidelines. After the guidelines were introduced, the average number of days from admission to the day patients first received food through the G-tube was significantly reduced at the .05 level. The number of personnel who correctly and suitably performed nursing activities for patients with complications during feeding significantly increased at the .05 level, and the number of patients receiving energy at the target level significantly increased at the .05 level. These results indicate that the guidelines for feeding through a gastrostomy tube in critically ill surgical patients are feasible in practice, and the outcomes are beneficial to patients.

Keywords: clinical guidelines, feeding, gastrostomy tube, critically ill, surgical patients

Procedia PDF Downloads 317
2105 Evaluation of Spatial Distribution Prediction for Site-Scale Soil Contaminants Based on Partition Interpolation

Authors: Pengwei Qiao, Sucai Yang, Wenxia Wei

Abstract:

Soil pollution has become an important issue in China. Accurate spatial distribution prediction of pollutants with interpolation methods is the basis for soil remediation at a site, but relatively strong variability of pollutants decreases prediction accuracy. Theoretically, partition interpolation can yield accurate prediction results. In order to verify the applicability of partition interpolation at a site, benzo(b)fluoranthene (BbF) in four soil layers was adopted as the research object in this paper. The accuracies of IDW (inverse distance weighting)-, RBF (radial basis function)-, and OK (ordinary kriging)-based partition interpolation were evaluated and their influencing factors analyzed; the uncertainty and applicability of partition interpolation were then determined. Three conclusions were drawn. (1) The prediction error of partitioned interpolation decreased by 70% compared to unpartitioned interpolation. (2) Partition interpolation reduced the impact of a high CV (coefficient of variation) and high concentration values on prediction accuracy. (3) The prediction accuracy of IDW-based partition interpolation was higher than that of RBF- and OK-based partition interpolation, making it suitable for identifying highly polluted areas at a contaminated site. These results provide a useful method for obtaining relatively accurate spatial distribution information of pollutants and identifying highly polluted areas, which is important for soil pollution remediation.
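
For reference, a minimal IDW implementation, the baseline interpolator being partitioned (sample values and the power parameter are illustrative):

```python
# Inverse distance weighting: predict z at query points as distance-weighted
# averages of known samples.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / (d + eps) ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([12.0, 3.5, 8.0, 40.0])          # e.g., BbF concentrations
grid = np.array([[0.5, 0.5], [0.9, 0.9]])
print(idw(xy, z, grid))
# Partition interpolation would first split the site into zones (e.g., by
# concentration regime) and run this separately inside each zone.
```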

Keywords: accuracy, applicability, partition interpolation, site, soil pollution, uncertainty

Procedia PDF Downloads 139
2104 Examining Renewable Energy Policy Implementation for Sustainable Development in Kenya

Authors: Eliud Kiprop, Kenichi Matsui, Joseph Karanja, Hesborn Ondiba

Abstract:

To double the share of renewable energy in the global energy mix by 2030 as part of actions under the Paris Agreement, policymakers in each ratifying country must accelerate their efforts over the next few years by implementing their own renewable energy strategies. Kenya has increased its funding for research and development in renewable energy sources, largely because it intends to reduce greenhouse gas (GHG) emissions by 30% from business-as-usual (BAU) levels (143 MtCO₂eq) by 2030. In 2013, the Kenyan government launched an ambitious plan to increase installed power generation capacity from 1,768 MW to more than 5,000 MW by the end of 2017. This paper examines the formulation and implementation process of this plan and shows how it affects Kenya's renewable energy industry and national policy implementation in general. Results demonstrate that, despite having a well-documented policy in place, the Kenyan government cannot meet its target of 5,000 MW by the end of 2017. Among other factors, we find that the main reason is the failure to adhere to the main principles of the policy plan. We also find that the government failed to consider future energy demand; had the policy been implemented on time, we argue, there would have been excess power.

Keywords: policy implementation, policy plan, renewable energy, sustainable development

Procedia PDF Downloads 211
2103 An Approach for Detection Efficiency Determination of High Purity Germanium Detector Using Cesium-137

Authors: Abdulsalam M. Alhawsawi

Abstract:

Estimating a radiation detector's efficiency plays a significant role in calculating the activity of radioactive samples. Detector efficiency is measured using sources that emit a variety of photon energies, from low to high, along the energy spectrum. Some photon energies are hard to produce in lab settings, either because check sources are hard to obtain or because the sources have short half-lives. This work develops a method to determine the efficiency of a High Purity Germanium (HPGe) detector based on the 662 keV gamma-ray photon emitted by Cs-137. Cesium-137 is readily available in most labs with radiation detection and health physics applications and has a long half-life of about 30 years. Several photon efficiencies were calculated using the MCNP5 simulation code, and the simulated efficiency of the 662 keV photon was used as a baseline to calculate other photon efficiencies for point-source and Marinelli beaker geometries. For a water-filled Marinelli beaker, the efficiency of the 59 keV low-energy photons from Am-241 was estimated with a 9% error relative to the MCNP5-simulated efficiency. The 1.17 and 1.33 MeV high-energy photons emitted by Co-60 had errors of 4% and 5%, respectively. These errors are considered acceptable for calculating the activity of unknown samples, as they fall within the 95% confidence level.
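
The scaling idea can be sketched as follows (all numbers are invented placeholders for the MCNP5 outputs and measurements, not the paper's results):

```python
# Measure absolute efficiency at 662 keV with a Cs-137 check source, then
# scale simulated relative efficiencies to other energies.
measured_eff_662 = 2.1e-2          # counts per emitted photon, from Cs-137 calibration

# MCNP5-simulated efficiency ratio eff(E)/eff(662 keV) for the same geometry
simulated_ratio = {59: 1.90, 662: 1.00, 1173: 0.68, 1332: 0.62}

def efficiency_at(energy_kev):
    return measured_eff_662 * simulated_ratio[energy_kev]

# Activity of an unknown sample from the net peak count rate at energy E:
# A = net_rate / (efficiency * emission_probability)
net_rate, branching = 5.4, 0.359    # e.g., a 59 keV Am-241 peak (~35.9% emission)
activity_bq = net_rate / (efficiency_at(59) * branching)
print(f"estimated activity: {activity_bq:.1f} Bq")
```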

Keywords: MCNP5, Monte Carlo simulations, efficiency calculation, absolute efficiency, activity estimation, Cs-137

Procedia PDF Downloads 113
2102 Visual and Chemical Servoing of a Hexapod Robot in a Confined Environment Using Jacobian Estimator

Authors: Guillaume Morin-Duponchelle, Ahmed Nait Chabane, Benoit Zerr, Pierre Schoesetters

Abstract:

Industrial inspection can be achieved with robotic systems that allow visual and chemical servoing. A popular scheme for visual servo control of robots is the image-based servoing system. In this paper, an approach to visual and chemical servoing of a hexapod robot using visual and chemical Jacobian matrices is proposed. The basic idea behind the visual Jacobian matrix is to model the differential relationship between the camera system and the robot control system in order to detect and track points of interest accurately in confined environments. This approach allows the robot to detect and navigate to a QR code, or to seek a gas source using the surge-cast algorithm. To track the QR code target, visual servoing based on the Jacobian matrix is used. For chemical servoing, three gas sensors are embedded on the hexapod; a Jacobian matrix applied to the gas concentration measurements allows estimating the direction of the main gas source. The effectiveness of the proposed scheme is first demonstrated in simulation. Finally, a hexapod prototype is designed and built, and the experimental validation of the approach is presented and discussed.
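
Jacobian-based visual servoing typically follows the classic image-based control law v = -lambda * J⁺ * e; the sketch below uses an invented Jacobian and feature error purely for illustration:

```python
# Velocity command from the pseudo-inverse of an (estimated) image Jacobian
# and a feature error, e.g., QR-code corner offsets from desired positions.
import numpy as np

lam = 0.5                                   # control gain
J = np.array([[1.0, 0.0, 0.2],              # assumed image Jacobian: maps robot
              [0.0, 1.0, -0.1],             # velocity (vx, vy, wz) to feature
              [0.3, -0.2, 1.0],             # coordinate velocities
              [0.1, 0.4, 0.8]])
e = np.array([12.0, -5.0, 8.0, -2.0])       # current - desired feature values [px]

v = -lam * np.linalg.pinv(J) @ e            # Moore-Penrose pseudo-inverse
print("commanded (vx, vy, wz):", v)
# For chemical servoing, e could be built from the three gas-sensor readings,
# with J relating robot motion to concentration changes.
```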

Keywords: chemical servoing, hexapod robot, Jacobian matrix, visual servoing, navigation

Procedia PDF Downloads 122
2101 Design and Thermal Simulation Analysis of the Chinese Accelerator Driven Sub-Critical System Injector-I Cryomodule

Authors: Rui-Xiong Han, Rui Ge, Shao-Peng Li, Lin Bian, Liang-Rui Sun, Min-Jing Sang, Rui Ye, Ya-Ping Liu, Xiang-Zhen Zhang, Jie-Hao Zhang, Zhuo Zhang, Jian-Qing Zhang, Miao-Fu Xu

Abstract:

The Chinese Accelerator Driven Sub-critical system (C-ADS) uses a high-energy proton beam to bombard a metal target and generate neutrons for treating nuclear waste. The Chinese ADS proton linac comprises two 0–10 MeV injectors and one 10–1500 MeV superconducting linac. Injector-I, studied by the Institute of High Energy Physics (IHEP), is under construction in Beijing, China. This linear accelerator consists of two accelerating cryomodules operating at a temperature of 2 K. This paper describes the structure and thermal performance analysis of the cryomodule. The analysis takes into account all the main contributors (support posts, multilayer insulation, current leads, power couplers, and cavities) to the static and dynamic heat loads at the various cryogenic temperature levels. The thermal simulation analysis of the cryomodule provides an important theoretical foundation for its optimization and commissioning.

Keywords: C-ADS, cryomodule, structure, thermal simulation, static heat load, dynamic heat load

Procedia PDF Downloads 392
2100 Improving the Run Times of Existing and Historical Demand Models Using Simple Python Scripting

Authors: Abhijeet Ostawal, Parmjit Lall

Abstract:

The run times for a large strategic model that we were managing had become too long, leading to delays in project delivery, increased costs, and lost productivity. Software developers continuously work towards more efficient tools by changing their algorithms and processes. The issue our team faced was how to apply the latest technologies to validated existing models built on much older software versions that lack the latest capabilities. The multi-modal transport model we had could only run its assignments in sequential order. Recent upgrades to the software allow the assignment to be run in parallel, a concept called parallelization, implemented as a Python script that works only within the latest version of the software. A full model transfer to the latest version was not possible due to time, budget, and the potential changes in trip assignment. This article shows how to adapt and update the Python script so that it can be used with older software versions, by calling the latest version for the assignment and then returning to the old version, without affecting the results. Through a process of trial and error, run-time savings of up to 30-40% have been achieved. Assignment results were maintained within the older version, and through this learning process we have applied the methodology to other, even older versions of the software, resulting in large time savings and greater productivity and efficiency for both client and consultant.
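
A hypothetical orchestration sketch of this pattern (the modelling package, executable paths, and CLI flags are not named in the abstract, so every identifier below is a placeholder):

```python
# Run the assignment step under the newer software version to get parallel
# execution, then hand control back to the validated older version.
import subprocess

NEW_EXE = r"C:\Software\v22\model_runner.exe"   # placeholder paths
OLD_EXE = r"C:\Software\v18\model_runner.exe"
MODEL = r"C:\Projects\strategic_model\model_file.ver"

# 1. Parallelized assignment in the new version (assumed flags)
subprocess.run([NEW_EXE, MODEL, "--step", "assignment", "--threads", "8"], check=True)

# 2. Remaining steps (demand, skims, reporting) back in the old version
subprocess.run([OLD_EXE, MODEL, "--step", "post_assignment"], check=True)
```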

Keywords: model run time, demand model, parallelisation, python scripting

Procedia PDF Downloads 111
2099 A Comparison of the First Language Vocabulary Used by Indonesian Year 4 Students and the Vocabulary Taught to Them in English Language Textbooks

Authors: Fitria Ningsih

Abstract:

This study concerns the process of building a corpus from Indonesian year 4 students' free writing and comparing it to the vocabulary taught in English language textbooks. Sample writings from 369 students at 19 public elementary schools in Malang, East Java, Indonesia, and 5 selected English textbooks were analyzed with corpus linguistics methods using AdTAT, the Adelaide Text Analysis Tool. The analysis produced wordlists of the 100 words most frequently used by students and the 100 words most frequently presented in the English textbooks; there was a 45% match between the two lists. Furthermore, classifying the top 100 most frequent words from the two corpora by part of speech showed that the Indonesian and English lists employed a similar distribution of nouns, verbs, adjectives, and prepositions. Moreover, to assess how well the learning materials' vocabulary is contextualized to students' needs, an in-depth analysis of the content and cultural views of the vocabulary taught in the textbooks was conducted using criteria developed from a checklist. Finally, suggestions are addressed to language teachers to understand students' backgrounds, such as recognizing the basic words students have already acquired before teaching them new vocabulary, in order to achieve successful learning of the target language.
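
The wordlist step reduces to frequency counting; a minimal sketch with invented mini-corpora:

```python
# Build frequency lists from two corpora and measure their overlap, as in the
# 45% match reported between student writing and textbook vocabulary.
from collections import Counter
import re

def top_words(text, n=100):
    tokens = re.findall(r"[a-z']+", text.lower())
    return [w for w, _ in Counter(tokens).most_common(n)]

student_corpus = "i like my cat my cat is big i play with my cat"
textbook_corpus = "the cat is big the dog is small i like the dog"

students = top_words(student_corpus)
textbook = top_words(textbook_corpus)
overlap = set(students) & set(textbook)
print(f"match: {100 * len(overlap) / max(len(students), 1):.0f}%", sorted(overlap))
```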

Keywords: corpus, frequency, English, Indonesian, linguistics, textbooks, vocabulary, wordlists, writing

Procedia PDF Downloads 180
2098 An Optimal Control Model to Determine Body Forces of Stokes Flow

Authors: Yuanhao Gao, Pin Lin, Kees Weijer

Abstract:

In this paper, we determine an external body force distribution by analyzing Stokes fluid motion using mathematical modelling and numerical approximation. The body force distribution is regarded as the unknown variable and is determined using ideas from optimal control theory. The Stokes flow and its velocity field are generated by given forces in a unit square domain. A regularized objective functional is built to match the numerically computed flow velocity with the generated velocity data, so that the force distribution can be determined by minimizing the objective functional, which measures the difference between the numerical and observed velocities. After applying the Lagrange multiplier method, a system of partial differential equations constituting the optimality system is formulated and solved. The finite element method and the conjugate gradient method are used to discretize the equations and derive the iterative update of the target body force, computing the velocity and body force distribution numerically. The programming environment FreeFEM++ supports the implementation of this model.
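
A standard form of such a regularized optimal-control problem, written out here as an assumption since the abstract does not print the functional explicitly:

```latex
\begin{align*}
  \min_{\mathbf{f}} \; J(\mathbf{f})
    &= \tfrac{1}{2}\int_{\Omega} |\mathbf{u}(\mathbf{f}) - \mathbf{u}_d|^2 \,dx
     + \tfrac{\alpha}{2}\int_{\Omega} |\mathbf{f}|^2 \,dx \\
  \text{subject to}\quad
    -\nu \Delta \mathbf{u} + \nabla p &= \mathbf{f}, \qquad
    \nabla \cdot \mathbf{u} = 0 \quad \text{in } \Omega,
\end{align*}
```

where u_d is the observed velocity data and alpha is the regularization weight; the Lagrange multiplier method then yields the adjoint and gradient equations that the conjugate gradient iteration solves.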

Keywords: optimal control model, Stokes equation, finite element method, conjugate gradient method

Procedia PDF Downloads 397
2097 Performance Evaluation of Wideband Code Division Multiple Access Network

Authors: Osama Abdallah Mohammed Enan, Amin Babiker A/Nabi Mustafa

Abstract:

The aim of this study is to evaluate and analyze different parameters of WCDMA (Wideband Code Division Multiple Access). The study also incorporates a brief yet thorough analysis of WCDMA's components and internal architecture. Different power controls are examined, including open-loop power control, closed (inner) loop power control, and outer-loop power control. Different handover techniques of WCDMA are also illustrated, including hard handover, inter-system handover, and soft and softer handover, and the duplexing techniques are described. The study also presents the WCDMA parameters that drive the system's QoS, which may help operators design and develop adequate network configurations. In addition, several parameters were investigated, including the bit energy to noise spectral density ratio (Eb/No), noise rise, and Bit Error Rate (BER). After simulating these parameters in the MATLAB environment, it was found that, for a given Eb/No value, the system capacity increases with the reuse factor. It was also observed that noise rise decreases for lower data rates and lower interference levels. Finally, it was found that BER is higher for some modulation techniques than for others.
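
The capacity/noise-rise relationship mentioned here follows the textbook WCDMA uplink load-factor formulas; the sketch below uses illustrative parameter values, not the paper's simulation settings:

```python
# Uplink load factor and noise rise for a single-service WCDMA cell.
import math

W = 3.84e6                 # WCDMA chip rate [chips/s]
R = 12.2e3                 # user bit rate [bit/s]
ebno = 10 ** (5.0 / 10)    # required Eb/No of 5 dB, in linear scale
i = 0.55                   # other-cell to own-cell interference ratio
v = 0.67                   # voice activity factor

# Per-user load: (1 + i) / (1 + W / (Eb/No * R * v))
load_per_user = (1 + i) / (1 + W / (ebno * R * v))

n_users = 40
eta = n_users * load_per_user
noise_rise_db = -10 * math.log10(1 - eta)   # valid while eta < 1 (pole capacity)
print(f"load factor = {eta:.2f}, noise rise = {noise_rise_db:.1f} dB")
```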

Keywords: duplexing, handover, loop power control, WCDMA

Procedia PDF Downloads 209
2096 Establishing Econometric Modeling Equations for Lumpy Skin Disease Outbreaks in the Nile Delta of Egypt under Current Climate Conditions

Authors: Abdelgawad, Salah El-Tahawy

Abstract:

This paper aimed to establish econometric equation models for the Nile Delta region in Egypt, to serve as a basis for future predictions of lumpy skin disease (LSD) outbreaks and their pathways in relation to climate change. Data on LSD outbreaks were collected from cattle farms located in the provinces representing the Nile Delta region from 1 January 2015 to December 2015. The results indicated a significant association between the degree of LSD outbreaks and the investigated climate factors (temperature, wind speed, and humidity): outbreaks peaked during June, July, and August and gradually decreased to the lowest rates in January, February, and December. The model obtained showed that increases in these climate factors were associated with an evident increase in LSD outbreaks in the Nile Delta of Egypt. Model validation was performed with the root mean square error (RMSE) and mean bias (MB), which compared the number of expected LSD outbreaks with the number of observed outbreaks and estimated the confidence level of the model. The RMSE was 1.38% and the MB was 99.50%, confirming that the established model describes the current association between LSD outbreaks and changes in climate factors and can be used as a basis for predicting LSD outbreaks under future climate change.
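
The two validation metrics can be sketched as follows (example numbers are invented; the MB formulation as a percentage of the observed total is an assumption):

```python
# RMSE and mean bias between observed and model-expected monthly outbreaks.
import numpy as np

observed = np.array([3, 5, 9, 14, 18, 16, 11, 6, 4, 2, 1, 1])   # monthly outbreaks
expected = np.array([3, 6, 9, 13, 17, 17, 12, 6, 3, 2, 1, 1])   # model predictions

rmse = np.sqrt(np.mean((observed - expected) ** 2))
mb = 100 * expected.sum() / observed.sum()   # mean bias as % of observed total
print(f"RMSE = {rmse:.2f}, MB = {mb:.2f}%")
```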

Keywords: LSD, climate factors, Nile delta, modeling

Procedia PDF Downloads 281
2095 Substitution of Formaldehyde in Phenolic Resins with Innovative and Bio-Based Vanillin Derived Compounds

Authors: Sylvain Caillol, Ghislain David

Abstract:

Phenolic resins are used industrially in a wide range of applications, from commodity and construction materials to the high-technology aerospace industry. They are mainly produced from the reaction between phenolic compounds and formaldehyde. Nevertheless, formaldehyde is a highly volatile and hazardous compound, classified as Carcinogenic, Mutagenic, and Reprotoxic (CMR). Vanillin is a bio-based, non-toxic aromatic aldehyde obtained from abundant lignin resources, and its aromaticity is very attractive for the synthesis of phenolic resins with high thermal stability. However, because of the relatively low reactivity of its aldehyde function toward phenolic compounds, it has never been used to synthesize phenolic resins. We developed innovative functionalization reactions and designed new bio-based aromatic aldehyde compounds from vanillin. These compounds show improved reactivity toward phenolic compounds compared to vanillin, and their structures are targeted at producing highly cross-linked phenolic resins with high aromatic density. We have thus obtained phenolic resins from substituted vanillin without the use of any aldehyde compound classified as CMR. Analytical tests of the cured resins confirmed that these bio-based resins exhibit high levels of performance, with high thermal stability and high rigidity.

Keywords: phenolic resins, formaldehyde-free, vanillin, bio-based, non-toxic

Procedia PDF Downloads 268
2094 Evolution of Design through Documentation of Architecture Design Processes

Authors: Maniyarasan Rajendran

Abstract:

Every design has a process, and every architect works in the ways best known to them. How a design is translated from concept to completion changes with the architect's design philosophy, tools, availability of resources, and at times the clients and the context of the design as well. Understanding the design process requires formalising the design intents. The design process is characterised by change, with time and with technology, and any description of the design flow is indicative rather than exhaustive. The knowledge and experience of stakeholders remain limited to the part they played in the project and to what they can remember, often through photographs. These artefacts, when circulated, can hardly convey what the project is; they never tell the narrative behind it. In due course, the design processes are lost, and the design junctions are lost along the journey. Photographs have acted as major source materials since their central role in architectural revivalism in the 19th century, and history shows that photographs have been the dominant source of evidence. The idea of recording is accompanied by the idea of drawing inspiration from records and documents. The design concept, the architectural firm's philosophy, the materials used, the special needs, the numerous trial-and-error methods, the design methodology, the experience of failures and successes, and the knowledge acquired in every project ought to be recorded. This knowledge can be preserved and passed on through generations by documenting the design processes involved. This paper explores the idea of process documentation as a tool for self-reflection and for creating an architectural firm's repository, and follows these implications through the design evolution of the team.

Keywords: architecture, design, documentation, records

Procedia PDF Downloads 361
2093 The X-Ray Response Team: Building a National Health Pre-Hospital Service

Authors: Julian Donovan, Jessica Brealey, Matthew Bowker, Marianne Feghali, Gregory Smith, Lee Thompson, Deborah Henderson

Abstract:

This article details the development of the X-ray response team (XRT), a service that utilises innovative technology to safely deliver acute and elective imaging and medical assessment services in pre-hospital and community settings. It involves a partnership between Northumbria Healthcare NHS Foundation Trust's Radiology and Emergency Medicine departments and the North East Ambulance Service to create a multidisciplinary pre-hospital team. The team committed to delivering a two-day acute service every week, alongside elective referrals, starting in November 2020. The service was originally made available within a 15-mile radius of the Northumbria Hospital; due to demand, this was expanded to include the North Tyneside and Northumberland regions. The target population was specified as frail and vulnerable patients, as well as those deemed to benefit from staying in their own environment. Within the first two months, thirty-six percent of patients assessed were able to stay at home due to the provision of off-site imaging. In the future, this service aims to allow patient transfer directly to an appropriate ward or clinic, bypassing the emergency department to improve the patient journey and reduce emergency care pressures.

Keywords: frailty, imaging, pre-hospital, X-ray

Procedia PDF Downloads 196
2092 The Perspective on Data Collection Instruments for Younger Learners

Authors: Hatice Kübra Koç

Abstract:

For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; for young learners, and especially very young ones, such reliable and valid tools cannot be so easily designed or applied. In this study, common data collection tools are first examined for the 'very young' and 'young learner' participant groups, since the quality and efficiency of an academic study rest mainly on valid and correct data collection and analysis procedures. Secondly, two different data collection instruments for very young and young learners are presented, and their efficacy is discussed. Finally, a suggested data collection tool, a performance-based questionnaire developed specifically for 'very young' and 'young learner' participant groups in the field of teaching English to young learners as a foreign language, is presented. The design procedure and the suggested items/factors for this tool are set out at the end of the study to help researchers who work with young and very young learners.

Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners

Procedia PDF Downloads 84
2091 Explainable Graph Attention Networks

Authors: David Pham, Yongfeng Zhang

Abstract:

Graphs are an important structure for data storage and computation. Recent years have seen the success of deep learning on graphs such as Graph Neural Networks (GNN) on various data mining and machine learning tasks. However, most of the deep learning models on graphs cannot easily explain their predictions and are thus often labelled as “black boxes.” For example, Graph Attention Network (GAT) is a frequently used GNN architecture, which adopts an attention mechanism to carefully select the neighborhood nodes for message passing and aggregation. However, it is difficult to explain why certain neighbors are selected while others are not and how the selected neighbors contribute to the final classification result. In this paper, we present a graph learning model called Explainable Graph Attention Network (XGAT), which integrates graph attention modeling and explainability. We use a single model to target both the accuracy and explainability of problem spaces and show that in the context of graph attention modeling, we can design a unified neighborhood selection strategy that selects appropriate neighbor nodes for both better accuracy and enhanced explainability. To justify this, we conduct extensive experiments to better understand the behavior of our model under different conditions and show an increase in both accuracy and explainability.
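
A single-head GAT attention computation, in the standard formulation of Velickovic et al. rather than the paper's XGAT code, shows the attention coefficients that double as explanation scores:

```python
# Attention coefficients alpha_ij over a node's neighbors, then aggregation.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(4)
F_in, F_out = 8, 4
W = rng.normal(size=(F_in, F_out))         # shared linear transform
a = rng.normal(size=2 * F_out)             # attention vector

h = rng.normal(size=(5, F_in))             # features of node 0 and 4 neighbors
z = h @ W
# e_ij = LeakyReLU(a^T [z_i || z_j]) for target node i = 0 and each neighbor j
e = np.array([np.maximum(0.2 * s, s)       # LeakyReLU, negative slope 0.2
              for s in (np.concatenate([np.tile(z[0], (5, 1)), z], axis=1) @ a)])
alpha = softmax(e)                          # attention over the neighborhood
h_new = alpha @ z                           # aggregated representation of node 0
print("attention weights (explanation scores):", np.round(alpha, 3))
```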

Keywords: explainable AI, graph attention network, graph neural network, node classification

Procedia PDF Downloads 181
2090 Quantification of Extent of Pollution from Total Lead in the Shooting Ranges Found in Southern and Central Botswana: A Pioneering Study

Authors: Nicholas Sehube, Rosemary Kelebemang, Pogisego Dinake

Abstract:

The extent of Pb contamination of shooting range soils had never been ascertained in Botswana; this study was the first attempt at evaluating the deposition of Pb into soils from munitions. A total of 8 military shooting ranges were studied. Soil samples were collected at each range at the berm (stop butt), the target line, and at 50 and 100 m from the berm. In all of the shooting ranges investigated, the highest concentrations were found in the berm soils. The highest Pb concentration, 38,406.87 mg/kg, was found in the berm soils of the Thebephatshwa shooting range, which is enclosed within a military camp with staff residential dwellings only a kilometre away. Most of the shooting range soils contained elevated Pb levels, above 2,000 mg/kg, far exceeding the United States Environmental Protection Agency (USEPA) critical value of 400 mg/kg. Mobilization of lead at high pH is attributed to low organic matter, as was the case at the Thebephatshwa shooting range, with a percent organic matter of 0.35 ± 0.08. The predominant weathering products in these shooting ranges were cerussite (PbCO3), hydrocerussite (Pb3(CO3)2(OH)2), and massicot (PbO). Detailed examination and characterization of the extent of pollution will help in developing and implementing scientifically sound remediation and restoration of shooting range soils.

Keywords: ammunition, Botswana, Pb, pollution, soil

Procedia PDF Downloads 226
2089 Computational Prediction of the Effect of S477N Mutation on the RBD Binding Affinity and Structural Characteristic, A Molecular Dynamics Study

Authors: Mohammad Hossein Modarressi, Mozhgan Mondeali, Khabat Barkhordari, Ali Etemadi

Abstract:

The COVID-19 pandemic, caused by SARS-CoV-2, has raised significant concerns worldwide due to its catastrophic effects on public health. SARS-CoV-2 infection is initiated by the binding of the receptor-binding domain (RBD) of its spike protein to the ACE2 receptor on the host cell membrane. Due to the error-prone nature of the viral RNA-dependent RNA polymerase complex, the virus genome, including the coding region for the RBD, acquires new mutations, leading to the appearance of multiple variants. These variants can potentially impact transmission, virulence, antigenicity, and immune-evasion properties. The S477N mutation, located in the RBD, has been observed in the SARS-CoV-2 Omicron (B.1.1.529) variant. In this study, we investigated the consequences of the S477N mutation at the molecular level using computational approaches such as molecular dynamics simulation, protein-protein interaction analysis, immunoinformatics, and free energy computation. We showed that replacement of Ser with Asn increases the stability of the spike protein and its affinity for ACE2, and thus increases the transmission potential of the virus. This mutation changes the folding and secondary structure of the spike protein. It also reduces antibody neutralization, raising concerns about re-infection, vaccine breakthrough, and therapeutic value.

Keywords: S477N, COVID-19, molecular dynamics, SARS-CoV-2 mutations

Procedia PDF Downloads 166