Search results for: information value method
23561 Deasphalting of Crude Oil by Extraction Method
Authors: A. N. Kurbanova, G. K. Sugurbekova, N. K. Akhmetov
Abstract:
Asphaltenes are the heavy fraction of crude oil. In oilfields, asphaltenes are known for their ability to plug wells, surface equipment and the pores of geologic formations. The present research is devoted to the deasphalting of crude oil as the initial stage of oil refining. Solvent deasphalting was conducted by extraction with organic solvents (cyclohexane, carbon tetrachloride, chloroform). The metal content was analyzed by ICP-MS, and the spectral features of deasphalting were characterized by FTIR. A high content of asphaltenes in crude oil reduces the efficiency of refining processes. Moreover, the high content of heteroatoms (e.g., S, N) in asphaltenes causes further problems: environmental pollution, corrosion and poisoning of the catalyst. The main objective of this work is to study the effect of the deasphalting process on crude oil, improving its properties and the efficiency of refining processes. Solvent extraction experiments using organic solvents were carried out on crude oil from JSC “Pavlodar Oil Chemistry Refinery”. Experimental results show that the deasphalting process also decreases the Ni and V content of the oil. One solution to the problem of cleaning oils of metals, hydrogen sulfide and mercaptans is absorption with chemical reagents directly in the oil residue, because asphalt and resinous substances degrade the operational properties of oils and reduce the effectiveness of selective refining. Deasphalting of crude oil is necessary to separate the light fraction from the heavy, metal-rich asphaltene part of crude oil. For this purpose, the oil is pretreated by deasphalting, because asphaltenes tend to form coke or consume large quantities of hydrogen. Removing asphaltenes leads to partial demetallization, i.e., removal of asphaltenes also removes V/Ni and organic compounds with heteroatoms. Intramolecular complexes are relatively well researched on the example of the porphyrin complexes of vanadium (VO2) and nickel (Ni). Studies of V/Ni by the ICP-MS method determined the effect of different deasphalting solvents on metal extraction at the deasphalting stage and identified the best organic solvent. Cyclohexane (C6H12) proved to give the best deasphalted oil (DAO), removing, according to ICP-MS, 51.2% of V and 66.4% of Ni. This paper also presents the results of a study of the physical and chemical properties and spectral characteristics of the oil by FTIR, with a view to establishing its hydrocarbon composition. The information about the whole oil obtained by IR spectroscopy gives provisional physical and chemical characteristics, which can be useful in considering the origin and geochemical conditions of oil accumulation, as well as some technological challenges. The systematic analysis carried out in this study improves our understanding of the stability mechanism of asphaltenes. The role of deasphalted crude oil fractions in asphaltene stability is described.
Keywords: asphaltenes, deasphalting, extraction, vanadium, nickel, metalloporphyrins, ICP-MS, IR spectroscopy
Procedia PDF Downloads 242
23560 Finding Out the Best Place for Resettling of Victims after the Earthquake: A Case Study for Tehran, Iran
Authors: Reyhaneh Saeedi, Nima Ghasemloo
Abstract:
Iran is an earthquake-prone zone, and earthquakes there cause loss of life and financial damage. Shelter for earthquake victims is one of the basic requirements, yet it is hard to select suitable places for temporary resettlement after an earthquake happens. Before such disasters occur, the best places for resettling the victims must be designated. This is an important issue in disaster management and planning. Geospatial Information Systems (GIS) have a determining role in disaster management; they can determine the best places for temporary resettlement after such a disaster. In this paper, the criteria for locating the best places, together with their weights and buffers, have been determined through research and questionnaires. The AHP method is used as the decision model, and the best places for temporary resettlement are located based on the selected criteria. Buffer layers of the criteria are created and converted to raster layers; the raster layers are then multiplied by the desired weights, and the results are added together. Finally, suitable places for resettlement of the victims according to the desired criteria are displayed in different colors, with their suitability rating, in the QGIS software.
Keywords: disaster management, temporary resettlement, earthquake, criteria
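A minimal sketch of the weighted raster-overlay step described above (multiply each criterion raster by its AHP weight and sum the results). The layer names, weights and grid size are illustrative placeholders, not values from the study.

```python
import numpy as np

# Hypothetical rasterized criterion layers, already normalized to 0-1
# (e.g., proximity to roads, gentle slope, distance from hazards).
criteria = {
    "dist_to_roads":  np.random.rand(100, 100),
    "slope":          np.random.rand(100, 100),
    "dist_to_hazard": np.random.rand(100, 100),
}

# AHP-derived weights; they must sum to 1 (placeholder values).
weights = {"dist_to_roads": 0.5, "slope": 0.3, "dist_to_hazard": 0.2}

# Weighted overlay: multiply each raster by its weight and add the results.
suitability = sum(weights[name] * layer for name, layer in criteria.items())

# Report the ten most suitable cells (row, col) for temporary resettlement.
top = np.column_stack(np.unravel_index(
    np.argsort(suitability, axis=None)[::-1][:10], suitability.shape))
print(top)
```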
Procedia PDF Downloads 464
23559 Clinicians' and Nurses' Documentation Practices in Palliative and Hospice Care: A Mixed Methods Study Providing Evidence for Quality Improvement at Mobile Hospice Mbarara, Uganda
Authors: G. Natuhwera, M. Rabwoni, P. Ellis, A. Merriman
Abstract:
Aims: Health workers are likely to document patients’ care inaccurately, especially when using new and revised case tools, and this could negatively impact patient care. This study set out to (1) assess nurses’ and clinicians’ documentation practices when using a new patients’ continuation case sheet (PCCS) and (2) explore nurses’ and clinicians’ experiences regarding documentation of patients’ information in the new PCCS. The purpose of introducing the PCCS was to improve continuity of care for patients attending clinics at which they were unlikely to see the same clinician or nurse consistently. Methods: This was a mixed methods study. The cross-sectional inquiry retrospectively reviewed 100 case notes of active patients on the hospice and palliative care program. Data were collected using a structured questionnaire with constructs formulated from the new PCCS under study. The qualitative element was face-to-face, audio-recorded, open-ended interviews with a purposive sample of one palliative care clinician and four palliative care nurse specialists. Thematic analysis was used. Results: Missing patients’ biographic information was prevalent at 5-10%. Spiritual and psychosocial issues were not documented in 42.6% of cases, and vital signs in 49.2%. The poorest documentation practices were observed in the past medical history part of the PCCS, at 40-63%. Four themes emerged from the interviews with clinicians and nurses: (1) what remains unclear and challenges, (2) comparing the past with the present, (3) experiential thoughts, and (4) transition and adapting to change. Conclusions: The PCCS seems to be a comprehensive and simple tool for documenting patients’ information at subsequent visits. The comprehensiveness and utility of the PCCS do appear to be limited by the failure to train staff in its use prior to its introduction. The authors find the PCCS comprehensive and suitable for capturing patients’ information and recommend that it be adopted and used in other palliative and hospice care settings, provided suitable introductory training accompanies its introduction. Otherwise, the reliability and validity of the patients’ information collected by the PCCS can be significantly reduced if some sections are unclear to the clinicians/nurses. The study identified clinician- and nurse-related pitfalls in the documentation of patients’ care. Clinicians and nurses need to prioritize accurate and complete documentation of patient care in the PCCS for quality care provision. This study should be extended to other sites using similar tools to ensure representative and generalizable findings.
Keywords: documentation, information case sheet, palliative care, quality improvement
Procedia PDF Downloads 151
23558 Micro- and Nanoparticle Transport and Deposition in Elliptic Obstructed Channels by Lattice Boltzmann Method
Authors: Salman Piri
Abstract:
In this study, a two-dimensional lattice Boltzmann method (LBM) was used for the numerical simulation of fluid flow in a channel, and a Lagrangian method was used for particle tracking with one-way coupling. Three hundred spherical particles with specified diameters were released at the channel entry, and an elliptical object was placed in the channel to obstruct the flow. The effects of gravity, drag, Saffman lift and Brownian forces on the particle trajectories were evaluated. The influence of the geometrical parameter (the ellipse aspect ratio) and of the flow characteristic (the Reynolds number) on the transport and deposition of particles was also surveyed, for particle diameters between 0.01 and 10 µm. Results indicated that at small Reynolds numbers, more inertial and gravitational trapping occurred on the obstacle surface for particles with larger diameters, whereas for nano-particles, influenced by Brownian diffusion and the vortices behind the obstacle, the inertial and gravitational mechanisms were insignificant and diffusion was the dominant deposition mechanism. In addition, at Reynolds numbers larger than 400, there was no significant difference between the deposition of finer and larger particles. Also, at higher aspect ratios of the ellipse, more inertial trapping occurred for particles of larger diameter (10 µm), while at lower ratios, interception and gravitational mechanisms were dominant.
Keywords: ellipse aspect ratio, particle tracking, diffusion, lattice Boltzmann method, Lagrangian particle tracking
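As a rough illustration of the one-way-coupled Lagrangian step described above, the sketch below advances a single spherical particle under Stokes drag, gravity and a Brownian force modeled as white noise (after Li and Ahmadi, with unit slip correction); the Saffman lift is omitted for brevity, and all parameter values are placeholders rather than the study's settings.

```python
import numpy as np

rho_p, d_p = 2000.0, 1e-6        # particle density [kg/m^3] and diameter [m]
mu, rho_f = 1.8e-5, 1.2          # fluid dynamic viscosity [Pa s] and density
kB, T = 1.38e-23, 293.0          # Boltzmann constant, temperature [K]
tau_p = rho_p * d_p**2 / (18.0 * mu)    # Stokes relaxation time
dt = 1e-7                        # time step, kept well below tau_p for stability
nu = mu / rho_f
# Brownian white-noise spectral intensity (Li & Ahmadi form, Cc = 1)
S0 = 216.0 * nu * kB * T / (np.pi**2 * rho_f * d_p**5 * (rho_p / rho_f)**2)

def step(x, v, u_fluid, rng):
    """One explicit-Euler step: drag + gravity + Brownian accelerations."""
    a_drag = (u_fluid - v) / tau_p
    a_grav = np.array([0.0, -9.81])
    a_brown = rng.standard_normal(2) * np.sqrt(np.pi * S0 / dt)
    v_new = v + dt * (a_drag + a_grav + a_brown)
    return x + dt * v_new, v_new

rng = np.random.default_rng(0)
x, v = np.zeros(2), np.zeros(2)
for _ in range(1000):            # in practice u_fluid comes from the LBM field
    x, v = step(x, v, u_fluid=np.array([0.1, 0.0]), rng=rng)
print(x)
```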
Procedia PDF Downloads 79
23557 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint
Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar
Abstract:
Because landslides can seriously harm both the environment and society, methods such as the frequency ratio (FR) and the analytical hierarchy process (AHP) have been developed from past landslide failure points for landslide susceptibility mapping. However, it is still difficult to select the most efficient method and to correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, namely Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial scale. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region consisted of places with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western part of Amhara Saint Town, the northern part, and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should take different safeguards to reduce or prevent serious damage from landslide events.
Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine
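The model comparison reported above can be reproduced in outline with scikit-learn; the sketch below uses random placeholder data in place of the real conditioning-factor rasters and inventory points, and hyperparameters that the study may not have used.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import f1_score, roc_auc_score

# Placeholder data: rows = mapping-unit samples, columns = the 14 landslide
# conditioning factors; the label marks inventory landslide points (1)
# versus non-landslide points (0).
X = np.random.rand(1000, 14)
y = np.random.randint(0, 2, 1000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "LR":  LogisticRegression(max_iter=1000),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "NB":  GaussianNB(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]   # susceptibility score per unit
    print(name,
          "F1: %.2f" % f1_score(y_te, model.predict(X_te)),
          "AUC: %.2f" % roc_auc_score(y_te, proba))
```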
Procedia PDF Downloads 82
23556 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals
Authors: Bahareh Ansari
Abstract:
Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual’s cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users’ performance and experience in working with a visualization tool. This list can be used for evaluating OGD visualization practices and informing future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications at top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented with a search in the references of the identified articles and a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles were included if they provided empirical evidence through controlled user experiments or reviewed such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods and the conditions under which they positively affect the visualization outcomes. Findings: The keyword search yielded 760 studies, of which 30 were included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: This review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage and work with government data and ultimately participate in government policy-making procedures.
Keywords: best practices, data visualization, literature review, open government data
Procedia PDF Downloads 106
23555 Determination of Potential Agricultural Lands Using Landsat 8 OLI Images and GIS: Case Study of Gokceada (Imroz) Turkey
Authors: Rahmi Kafadar, Levent Genc
Abstract:
In the present study, the aim was to determine potential agricultural lands (PALs) on Gokceada (Imroz) Island in Canakkale province, Turkey. Seven-band Landsat 8 OLI images acquired on July 12 and August 13, 2013, and their 14-band combined image were used to identify the current Land Use Land Cover (LULC) status. Principal Component Analysis (PCA) was applied to the three Landsat datasets in order to reduce the correlation between the bands. A total of six original and PCA images were classified using a supervised classification method to obtain LULC maps with 6 main classes (“Forest”, “Agriculture”, “Water Surface”, “Residential Area-Bare Soil”, “Reforestation” and “Other”). Accuracy assessment was performed by checking the accuracy of 120 randomized points for each LULC map. The best overall accuracy and Kappa statistic values (90.83% and 0.8791, respectively) were found for the PCA image generated from the 14-band combined image, called 3-B/JA. A Digital Elevation Model (DEM) with 15 m spatial resolution (ASTER) was used to account for topographical characteristics. Soil properties were obtained by digitizing the 1:25000-scale soil maps of the Rural Services Directorate General. Potential agricultural lands (PALs) were determined using Geographic Information Systems (GIS). The procedure was applied considering that the “Other” class of the LULC map may be used for agricultural purposes in the future. Overlay analysis was conducted using Slope (S), Land Use Capability Class (LUCC), Other Soil Properties (OSP) and Land Use Capability Sub-Class (SUBC) properties. A total of 901.62 ha within the “Other” class (15798.2 ha) of the LULC map was determined to be PALs. These lands were ranked as “Very Suitable”, “Suitable”, “Moderate Suitable” and “Low Suitable”. It was determined that 8.03 ha were classified as “Very Suitable”, 18.59 ha as “Suitable” and 11.44 ha as “Moderate Suitable” for PALs. In addition, 756.56 ha were found to be “Low Suitable”. The results obtained from this preliminary study can serve as a basis for further studies.
Keywords: digital elevation model (DEM), geographic information systems (GIS), Gokceada (Imroz), Landsat 8 OLI-TIRS, land use land cover (LULC)
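The band-decorrelation step can be sketched as follows: reshape the band stack to a pixels-by-bands matrix, fit PCA, and reshape the leading components back into image layout. The random array stands in for the real 14-band stack, and the number of retained components is an assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder stack standing in for the 14-band combined Landsat 8 OLI image
# (bands, rows, cols); real data would come from the two acquisition dates.
bands, rows, cols = 14, 512, 512
stack = np.random.rand(bands, rows, cols)

# Reshape to (pixels, bands), decorrelate with PCA, keep leading components.
pixels = stack.reshape(bands, -1).T
pca = PCA(n_components=6)
components = pca.fit_transform(pixels)            # (pixels, 6)
pc_image = components.T.reshape(6, rows, cols)    # back to image layout

print(pca.explained_variance_ratio_)  # variance captured per component
# pc_image would then feed a supervised classifier for the 6 LULC classes.
```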
Procedia PDF Downloads 353
23554 Design and Analysis of a Piezoelectric-Based AC Current Measuring Sensor
Authors: Easa Ali Abbasi, Akbar Allahverdizadeh, Reza Jahangiri, Behnam Dadashzadeh
Abstract:
Electrical current measurement is a suitable method for determining the performance of electrical devices. There are two kinds of measuring methods: contact and noncontact. The contact method has some disadvantages, such as requiring a direct connection with the wire, which may damage the system. Thus, in this paper, a bimorph piezoelectric cantilever beam with a permanent magnet on its free end is used to measure electrical current in a noncontact way. In the mathematical modeling, the governing equation of the cantilever beam is solved based on the Galerkin method, and the equation relating the applied force to the beam’s output voltage is presented. The magnetic force resulting from the current-carrying wire is considered the external excitation force of the system. The results are compared with other references in order to demonstrate the accuracy of the mathematical model. Finally, the effects of the geometric parameters on the output voltage and natural frequency are presented.
Keywords: cantilever beam, electrical current measurement, forced excitation, piezoelectric
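For intuition about the excitation force, a back-of-the-envelope estimate (not the paper's model) treats the wire field as B = mu0*I/(2*pi*d) and the tip magnet as a point dipole of moment m, so the force scales as m*|dB/dd| and is linear in the current. All numbers below are assumed.

```python
import numpy as np

mu0 = 4e-7 * np.pi       # vacuum permeability [T m/A]
I   = 10.0               # wire current [A] (placeholder)
d   = 5e-3               # wire-to-magnet distance [m] (placeholder)
m   = 0.05               # magnet dipole moment [A m^2] (placeholder)

B = mu0 * I / (2 * np.pi * d)            # field of the wire at the magnet
F = m * mu0 * I / (2 * np.pi * d**2)     # dipole moment times |dB/dd|
print(f"B = {B:.2e} T, F = {F:.2e} N")   # force is proportional to I,
# which is what lets the beam's output voltage track the wire current.
```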
Procedia PDF Downloads 232
23553 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR
Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.
Abstract:
We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. The variables in each Xr are assumed to be numerous and redundant; thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, the variables in T are assumed to be selected so as to require no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modeling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We also show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.
Keywords: component model, Fisher scoring algorithm, GLM, PLS regression, SCGLR, SEER, THEME
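For readers unfamiliar with the estimation backbone, the sketch below implements plain Fisher scoring (IRLS) for a Poisson GLM with log link; THEME-SCGLR inserts its component search into a loop of this kind, which this sketch omits entirely.

```python
import numpy as np

def fisher_scoring_poisson(X, y, n_iter=25):
    """Fisher scoring / IRLS for a Poisson GLM with log link."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)                 # inverse log link
        W = mu                           # GLM weights (Poisson: W = mu)
        z = eta + (y - mu) / mu          # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ z)  # weighted least squares
    return beta

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
true_beta = np.array([0.5, 0.8, -0.3])
y = rng.poisson(np.exp(X @ true_beta))
print(fisher_scoring_poisson(X, y))      # should approach true_beta
```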
Procedia PDF Downloads 396
23552 Collapse Analysis of Planar Composite Frame under Impact Loads
Authors: Lian Song, Shao-Bo Kang, Bo Yang
Abstract:
Concrete-filled steel tubular (CFST) structures have been widely used in construction practice due to their superior performance under various loading conditions. However, limited studies are available on this type of structure subjected to impact or explosive loads, and current methods in the relevant design codes are not specific to preventing progressive collapse of CFST structures. Therefore, it is necessary to carry out numerical simulations of CFST structures under impact loads. In this study, finite element analyses are conducted on the mechanical behaviour of composite frames composed of CFST columns and steel beams subjected to impact loading. The CFST columns are simulated using the finite element software ABAQUS. The model is verified against test results of solid and hollow CFST columns under lateral impacts, and reasonably good agreement is obtained through comparisons. Thereafter, a multi-scale finite element modelling technique is developed to evaluate the behaviour of a five-storey, three-span planar composite frame. The alternate path method and the direct simulation method are adopted to determine the dynamic response of the frame when a supporting column is removed suddenly. In the former method, the reason for the column removal is not considered and only the remaining frame is simulated, whereas in the latter, a specific impact load is applied to the frame to account for column failure induced by vehicle impact. Comparisons are made between these two methods in terms of displacement history and internal force redistribution, and recommendations are provided for the design of CFST structures under impact loads.
Keywords: planar composite frame, collapse analysis, impact loading, direct simulation method, alternate path method
Procedia PDF Downloads 519
23551 The Use of a Geographical Information System in the Field of Irrigation (Moyen-Chéliff)
Authors: Benhenni Abdellaziz
Abstract:
Irrigation is a limiting factor for agricultural production and the socioeconomic development of many countries in the arid and semi-arid world. However, the sustainability of irrigation systems requires rational management of the water resource, which is becoming increasingly scarce in these regions. The objective of this work is to apply a geographic information system (GIS) coupled with a model for calculating crop water requirements (CROPWAT) to the management of irrigation water in irrigated areas, and to offer managers an effective tool to better manage water resources in these areas. The GIS application area is the irrigated perimeter of the Western Middle Cheliff, which is located in a semi-arid region (Middle Cheliff). The perimeter in question exhibits considerable agrarian dynamics and an increased need for irrigation of most crops.
Keywords: GIS, CROPWAT, irrigation, water management, Middle Cheliff
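CROPWAT's core calculation follows the FAO-56 relation ETc = Kc × ET0, with the net irrigation requirement as ETc minus effective rainfall; the sketch below applies it with placeholder numbers that do not describe the Middle Cheliff.

```python
# Illustrative crop water requirement in the spirit of CROPWAT (FAO-56).
ET0 = 6.2        # reference evapotranspiration [mm/day] (assumed)
Kc = 1.15        # crop coefficient, e.g., a cereal at mid-season (assumed)
eff_rain = 0.8   # effective rainfall [mm/day] (assumed)

ETc = Kc * ET0                            # crop evapotranspiration [mm/day]
net_irrigation = max(ETc - eff_rain, 0.0) # water the irrigation must supply
print(f"ETc = {ETc:.1f} mm/day, net need = {net_irrigation:.1f} mm/day")
```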
Procedia PDF Downloads 70
23550 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia
Authors: Tim Nedyalkov
Abstract:
A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services. Collecting, managing, and retaining large amounts of data in cloud environments makes information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) (NDB) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.
Keywords: cloud compliance, cloud security, data governance, privacy protection
Procedia PDF Downloads 116
23549 Heteromolecular Structure Formation in Aqueous Solutions of Ethanol, Tetrahydrofuran and Dimethylformamide
Authors: Sh. Gofurov, O. Ismailova, U. Makhmanov, A. Kokhkharov
Abstract:
The refractometric method has been used to determine the concentration dependence of the optical properties of aqueous solutions of ethanol, tetrahydrofuran and dimethylformamide at room temperature. Changes in the dielectric permittivity of aqueous solutions of ethanol, tetrahydrofuran and dimethylformamide over a wide range of concentrations (0÷1.0 molar fraction) have been studied using the molecular dynamics method. The concentration curves of the experimental excess refractive indices and of the excess dielectric permittivity were compared. It has been shown that stable heteromolecular complexes in binary solutions are formed in the concentration range of 0.3÷0.4 mole fraction. The real and imaginary parts of the dielectric permittivity were obtained from the dipole-dipole autocorrelation functions of the molecules. At concentrations of C = 0.3÷0.4 m.f., heteromolecular structures with hydrogen bonds are formed. This is confirmed by the extremum values of the excess dielectric permittivity and the excess refractive index of the aqueous solutions.
Keywords: refractometric method, aqueous solution, molecular dynamics, dielectric constant
Procedia PDF Downloads 262
23548 Investigation of Surface Properties of Nanostructured Carbon Films
Authors: Narek Margaryan, Zhozef Panosyan
Abstract:
Due to their unique properties, carbon nanofilms have become the object of broad attention and intensive research, and it is particularly important to study the surface properties of these films. It is also important to study the processes by which these films form, which are accompanied by self-organization at the nano and micro levels. For a more detailed investigation, we examined diamond-like carbon (DLC) layers deposited by the chemical vapor deposition (CVD) method on a Ge substrate and hydrogenated graphene layers obtained on the surface of a colloidal solution using a grouping method. In this report, the surface transformation of these CVD nanolayers is studied by atomic force microscopy (AFM) as a function of deposition time. AFM can also be successfully used to study the surface properties of self-assembled graphene layers; in turn, it is possible to sketch out their boundary line, which gives an idea of the peculiarities of the formation of these layers. Images obtained by AFM were treated as mathematical sets of numbers, and fractal and roughness analyses were performed. The fractal dimension, Regne’s fractal coefficient, histograms, Fast Fourier transforms, etc. were obtained. The dependence of the fractal parameters on the deposition duration for the CVD films and on the solution temperature for the tribolayers was revealed. As an important surface parameter of our carbon films, the surface energy was calculated as a function of Regne’s fractal coefficient. The surface potential was also measured with the Kelvin probe method using semi-contact AFM; its dependence on the deposition duration for the CVD films and on the solution temperature for the hydrogenated graphene was found as well. Results obtained by the fractal analysis method were related to purely experimental results for a number of samples.
Keywords: nanostructured films, self-assembled graphene, diamond-like carbon, surface potential, Kelvin probe method, fractal analysis
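One standard way to extract a fractal dimension from an AFM scan is box counting on a thresholded height map, sketched below; this illustrates the general technique and is not necessarily the estimator behind Regne's coefficient.

```python
import numpy as np

def box_counting_dimension(binary_image):
    """Box-counting fractal dimension of a binarized (square) image."""
    sizes = [2, 4, 8, 16, 32, 64]
    counts = []
    n = binary_image.shape[0]
    for s in sizes:
        # count boxes of side s that contain at least one foreground pixel
        m = n - n % s
        blocks = binary_image[:m, :m].reshape(m // s, s, -1, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    # slope of log(count) versus log(1/size) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

surface = np.random.rand(256, 256) > 0.5   # stand-in for a thresholded AFM scan
print(box_counting_dimension(surface))     # ~2.0 for space-filling noise
```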
Procedia PDF Downloads 268
23547 A Literature Review on Bladder Management in Individuals with Spinal Cord Injury
Authors: Elif Ates, Naile Bilgili
Abstract:
Background: One of the most important medical complications that individuals with spinal cord injury (SCI) face is neurogenic bladder. Objectives: To review the methods used for management of neurogenic bladder and their effects. Methods: The study was conducted by searching the CINAHL, EBSCOhost, MEDLINE, Science Direct, Ovid, ProQuest, Web of Science, and ULAKBİM National Databases for studies published between 2005 and 2015. Key words used during the search included ‘spinal cord injury’, ‘bladder injury’, ‘nursing care’, ‘catheterization’ and ‘intermittent urinary catheter’. After examination of 551 studies, 21 studies which met the inclusion criteria were included in the review. Results: The mean age of individuals across the study samples was 42 years. The most commonly used bladder management method was clean intermittent catheterization (CIC). Compliance with CIC was found to be significantly related to spasticity, maximum cystometric capacity, and the person performing catheterization (p < .05). The main reason for changing the existing bladder management method was urinary tract infections (UTI). Individuals who performed CIC by themselves and who voided spontaneously had better quality of life. Patient age, occupational status, and whether patients performed CIC by themselves were found to be significantly associated with depression level (p ≤ .05). Conclusion: As the most commonly used method for bladder management, CIC is a reliable and effective method and reduces the risk of UTI development. Individuals with neurogenic bladder have a higher prevalence of depression symptoms than the general population.
Keywords: bladder management, catheterization, nursing, spinal cord injury
Procedia PDF Downloads 174
23546 Performance Analysis of BLDC Motors for Flywheel Energy Storage Applications with Nonmagnetic vs. Magnetic Core Stator using Finite Element Time Stepping Method
Authors: Alok Kumar Pasa, Krs Raghavan
Abstract:
This paper presents a comparative analysis of Brushless DC (BLDC) motors for flywheel applications, with a focus on the choice of stator core materials. The study employs the Finite Element Method (FEM) in the time domain to investigate the performance characteristics of BLDC motors equipped with nonmagnetic and magnetic stator core materials. Preliminary results reveal significant differences in motor efficiency, torque production, and electromagnetic properties between the two configurations. This research sheds light on the advantages of utilizing nonmagnetic materials in BLDC motors for flywheel applications, offering potential gains in efficiency, weight reduction and cost-effectiveness.
Keywords: finite element time stepping method, high-speed BLDC motor, flywheel energy storage system, coreless BLDC motors
Procedia PDF Downloads 4
23545 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit
Authors: Ahmed Elrewainy
Abstract:
Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share the spectral pixels in different amounts called “abundances”. Unmixing the data cube is an important task for identifying the endmembers present in the cube for the analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets
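A minimal version of the iterative soft-thresholding solver for the l1-regularized (LASSO) form of basis pursuit is sketched below; the dictionary size, noise level and regularization weight are placeholders rather than the paper's settings.

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=500):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        z = x - grad / L                   # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 120))             # dictionary (e.g., spectral library)
x_true = np.zeros(120)
x_true[[3, 40, 77]] = [1.0, 0.5, 2.0]      # sparse abundances of one pixel
y = A @ x_true + 0.01 * rng.normal(size=50)
print(np.nonzero(np.round(ista(A, y), 2))[0])  # recovers the sparse support
```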
Procedia PDF Downloads 195
23544 Creation of GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) Nanoparticles Using Pulse Laser Ablation Method
Authors: Yong Pan, Li Wang, Xue Qiong Su, Dong Wen Gao
Abstract:
Nanomaterials have received extensive attention over the years because of their wide range of applications. Various nanomaterials such as nanoparticles, nanowires, nanorings, nanostars and other nanostructures have begun to be studied systematically. The preparation of these materials by chemical methods is not only costly, but also involves long cycles and high toxicity. At the same time, the preparation of nanoparticles of multi-doped composites has been limited due to the special structure of the materials. In order to prepare multi-doped composites with the same structure as the macro-materials and to simplify the preparation method, GaxCo1-xZnSe0.4 (x = 0.1, 0.3, 0.5) nanoparticles were prepared by the Pulse Laser Ablation (PLA) method. The particle composition and structure were systematically investigated by X-ray diffraction (XRD) and Raman spectroscopy, which confirm the success of our preparation and the same composition of the nanoparticles (NPs) as the target. The morphology of the NPs, characterized by Transmission Electron Microscopy (TEM), indicates that the prepared particles are circular in shape. The fluorescence properties are reflected in the PL spectra, which demonstrate the best performance for the Ga0.3Co0.3ZnSe0.4 concentration. Therefore, all the results suggest that PLA is promising for preparing multi-doped NPs, since it can modulate the performance of the NPs.
Keywords: PLA, physics, nanoparticles, multi-doped
Procedia PDF Downloads 170
23543 Golden Brain Theory (GBT) for Language Learning
Authors: Tapas Karmaker
Abstract:
Centuries ago, we came to know about the ‘Golden Ratio’, also known as the Golden Angle, and the idea of this research is based on that theme. The researcher perceives ‘The Golden Ratio’ in terms of harmony, meaning that every single item in the universe follows a harmonic behavior. In the case of human beings, the brain responds easily and quickly to this harmony to help memorization. In this theory, harmony means a link. The study was carried out on a segment of school students and a segment of common people over a period of three years, from 2003 to 2006, and was intended to determine the impact of harmony on the brains of these people. It was found that students and common people can increase their memorization capacity by as much as 70 times by applying this method. The method works faster and better between the ages of 8 and 30 years. This result was achieved through tests assessing memorization capacity using tools like words, rhymes, texts, math and drawings. The research concludes that this harmonic method can be applied to improve the capacity for learning languages, to improve quality of life, and in other areas of life as well as in professional activity.
Keywords: language, education, golden brain, learning, teaching
Procedia PDF Downloads 200
23542 Analysing Tertiary Lecturers’ Teaching Practices and Their English Major Students’ Learning Practices with Information and Communication Technology (ICT) Utilization in Promoting Higher-Order Thinking Skills (HOTs)
Authors: Malini Ganapathy, Sarjit Kaur
Abstract:
Maximising learning with higher-order thinking skills through Information and Communications Technology (ICT) has been deep-rooted and emphasised in various developed countries such as the United Kingdom, the United States of America and Singapore. The transformation of the education curriculum in the Malaysia Education Development Plan (PPPM) 2013-2025 focuses on the concept of Higher Order Thinking (HOT) skills, which aims to produce knowledgeable students who are critical and creative in their thinking and can compete at the international level. HOT skills encourage students to apply, analyse, evaluate and think creatively in and outside the classroom. In this regard, the National Education Blueprint (2013-2025) is grounded in high-performing systems which promote a transformation of the Malaysian education system in line with the vision of Malaysia’s National Philosophy of achieving educational outcomes of world-class status. This study was designed to investigate ESL students’ learning practices, with an emphasis on promoting HOTs while using ICT in their curricula. Data were collected using stratified random sampling, whereby 100 participants were selected to take part in the study. The respondents were undergraduate students who undertook ESL courses at a public university in Malaysia. A three-part questionnaire consisting of demographic information, students’ learning experience, and ICT utilization practices was administered in the data collection process. Findings from this study provide several important insights into students’ learning experiences and ICT utilization in developing HOT skills.
Keywords: English as a second language students, critical and creative thinking, learning, information and communication technology, higher order thinking skills
Procedia PDF Downloads 490
23541 An Application-Driven Procedure for Optimal Signal Digitization of Automotive-Grade Ultrasonic Sensors
Authors: Mohamed Shawki Elamir, Heinrich Gotzig, Raoul Zoellner, Patrick Maeder
Abstract:
In this work, a methodology is presented for identifying the optimal digitization parameters for the analog signal of ultrasonic sensors. These digitization parameters are the resolution of the analog-to-digital conversion and the sampling rate. This is accomplished through the derivation of characteristic curves based on the Fano inequality and the calculation of the mutual information content over a given dataset. The mutual information is calculated between the examples in the dataset and the corresponding variation in the feature that needs to be estimated. The optimal parameters are identified in a manner that ensures optimal estimation performance while avoiding the inefficiency of using unnecessarily powerful analog-to-digital converters.
Keywords: analog to digital conversion, digitization, sampling rate, ultrasonic
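The core computation can be sketched as follows: quantize a toy echo signal at several bit depths and estimate how much mutual information each quantization preserves about the target feature. The signal model, noise level and bit depths are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
feature = rng.uniform(0, 1, 20000)                 # e.g., obstacle distance
signal = feature + 0.05 * rng.normal(size=20000)   # noisy analog echo amplitude

# discretize the feature so MI can be estimated from a contingency table
f_bins = np.digitize(feature, np.linspace(0, 1, 64))

for bits in (2, 4, 6, 8, 10):
    levels = 2 ** bits
    quantized = np.floor(np.clip(signal, 0, 1) * (levels - 1))  # ADC output
    mi = mutual_info_score(f_bins, quantized)
    print(f"{bits:2d}-bit ADC: I(feature; signal) = {mi:.3f} nats")
# MI grows with resolution and then saturates once quantization noise
# falls below the analog noise floor, which marks the useful bit depth.
```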
Procedia PDF Downloads 207
23540 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference
Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov
Abstract:
Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for carrying out the power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, nonparametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA (Supervisory Control and Data Acquisition) system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, with a measurement interval from January 1, 2016, to December 31, 2016. In the end, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
Keywords: canonical correlation analysis, power curve, power performance, wind energy
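As one simple nonparametric check in this spirit (an illustration, not necessarily the authors' exact test), the sketch below compares SCADA power readings in a single wind-speed bin against the warranted power-curve value without assuming normality; all numbers are synthetic.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
warranted = 1500.0                                 # kW expected in this bin
# synthetic SCADA sample: turbine running ~3% below the warranted value
measured = warranted * (0.97 + 0.03 * rng.standard_normal(200))

# Wilcoxon signed-rank test: are the deviations symmetric about zero?
stat, p = wilcoxon(measured - warranted)
print(f"W = {stat:.0f}, p = {p:.4g}")
# a small p-value flags a significant departure from the warranted curve
# in this bin; repeating per bin builds the full verification picture.
```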
Procedia PDF Downloads 336
23539 A Variant of a Double Structure-Preserving QR Algorithm for Symmetric and Hamiltonian Matrices
Authors: Ahmed Salam, Haithem Benkahla
Abstract:
Recently, an efficient backward-stable algorithm for computing the eigenvalues and eigenvectors of a symmetric and Hamiltonian matrix has been proposed. The method preserves the symmetric and Hamiltonian structures of the original matrix during the whole process. In this paper, we revisit the method. We derive a way of implementing the reduction of the matrix to the appropriate condensed form. Then, we construct a novel version of the implicit QR algorithm for computing the eigenvalues and eigenvectors.
Keywords: block implicit QR algorithm, preservation of a double structure, QR algorithm, symmetric and Hamiltonian structures
Procedia PDF Downloads 409
23538 Citation Analysis on the Articles published in Bayero Journal of Pure and Applied Sciences (BAJOPAS), from 2008-2020: An International Journal in Bayero University, Kano, Nigeria
Authors: G. A. Babalola, Yusuf Muhammad
Abstract:
An analysis was carried out on 19,759 citations appended to the References sections of 881 research articles published in the Bayero Journal of Pure and Applied Sciences. It was found that journal publications were the most cited source of information among pure and applied science researchers, with 12,090 citations (61.2%). The study also revealed that researchers in the field of pure and applied sciences used very current and up-to-date information sources in writing their articles, with 10,091 (51.1%) such citations and an average of 11.1 per article in the journal.
Keywords: citation analysis, BAJOPAS, journal article, Bayero University Kano, Nigeria
Procedia PDF Downloads 165
23537 Setting Control Limits For Inaccurate Measurements
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart, and the traditional control limits can lead to incorrect conclusions if applied carelessly. This study looks into the limitations of classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method of establishing the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate, and only requires the use of two straightforward tables.
Keywords: quality control, process control, round-off, measurement, rounding error
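The effect the abstract warns about can be demonstrated with a short simulation: apply classical Shewhart X̄ limits to subgroup means of rounded measurements and watch the false-alarm rate drift from its nominal value. The process parameters and rounding resolution below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, n_samples = 10.0, 0.2, 5, 100000
round_to = 0.5                                  # coarse measurement resolution

x = rng.normal(mu, sigma, size=(n_samples, n))  # subgroups of size n
xbar_exact = x.mean(axis=1)
xbar_round = (np.round(x / round_to) * round_to).mean(axis=1)

ucl = mu + 3 * sigma / np.sqrt(n)               # classical Shewhart limits
lcl = mu - 3 * sigma / np.sqrt(n)
for name, xb in (("exact", xbar_exact), ("rounded", xbar_round)):
    rate = np.mean((xb > ucl) | (xb < lcl))
    print(f"{name:8s} false-alarm rate: {rate:.4f}")  # nominal ~0.0027
# Rounding inflates the subgroup-mean variance, so the rounded chart
# signals far more often than the nominal 0.27% in-control rate.
```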
Procedia PDF Downloads 99
23536 Flood Simulation and Forecasting for Sustainable Planning of Response in Municipalities
Authors: Mariana Damova, Stanko Stankov, Emil Stoyanov, Hristo Hristov, Hermand Pessek, Plamen Chernev
Abstract:
We will present one of the first use cases on the DestinE platform, a joint initiative of the European Commission, the European Space Agency and EUMETSAT providing access to global earth observation, meteorological and statistical data, and emphasize the good practice of intergovernmental agencies acting in concert. Further, we will discuss the importance of space-based disruptive solutions for coping with the ever-increasing water-related disasters caused by climate change and for minimizing their economic and societal impact. The use case focuses on forecasting floods and estimating the impact of flood events on the urban environment and the ecosystems in the affected areas, with the purpose of helping municipal decision-makers analyze and plan resource needs, and of forging human-environment relationships by providing farmers with insightful information for improving their agricultural productivity. For the forecast, we adopt an EO4AI method of our platform ISME-HYDRO, in which we employ a pipeline of neural networks applied to in-situ measurements and satellite data of meteorological factors influencing the hydrological and hydrodynamic status of rivers and dams, such as precipitation, soil moisture, vegetation index and snow cover, to model flood events and their span. The ISME-HYDRO platform is an e-infrastructure for water resources management based on linked data, extended with further intelligence that generates forecasts with the method described above, raises alerts, formulates queries, provides superior interactivity and drives communication with the users. It provides synchronized visualization of table views, graph views and interactive maps, and it will be federated with the DestinE platform.
Keywords: flood simulation, AI, Earth observation, e-infrastructure, flood forecasting, flood areas localization, response planning, resource estimation
Procedia PDF Downloads 21
23535 Improvement in Safety Profile of Semecarpus Anacardium Linn by Shodhana: An Ayurvedic Purification Method
Authors: Umang H. Gajjar, K. M. Khambholja, R. K. Patel
Abstract:
Semecarpus anacardium shows the presence of bioflavonoids, phenolic compounds, bhilawanols, minerals, vitamins and amino acids. Detoxified S. anacardium and its oils are considered to have anti-inflammatory properties and are used in nervous debility, neuritis, rheumatism and leprous nodules. If used without purification, S. anacardium causes toxic skin inflammation because it contains a toxic phenolic oil. During the Shodhana process, an Ayurvedic purification method, this toxic phenolic oil was removed, which had a marked effect on the phytoconstituent concentrations and the antioxidant activity of S. anacardium. The total phenolic content decreased by up to 70% (from 28.9 %w/w to 8.94 %w/w), while there was a negligible effect on the total flavonoid (7.51 %w/w to 7.43 %w/w) and total carbohydrate (0.907 %w/w to 0.853 %w/w) content. The IC50 values of the S. anacardium extract before and after purification are 171.7 and 314.3, while the EC50 values are 280 μg/ml and 304 μg/ml, which shows that the antioxidant activity of S. anacardium decreased but the safety profile of the drug increased, as the toxic phenolic oil was removed during Shodhana.
Keywords: Semecarpus anacardium, Shodhana process, safety profile, improvement
Procedia PDF Downloads 257
23534 Detecting of Crime Hot Spots for Crime Mapping
Authors: Somayeh Nezami
Abstract:
The management of police financial and human resources in metropolitan areas requires extensive information and exact plans to reduce the crime rate and increase the safety of society. Geographical Information Systems have an important role in providing crime maps and analyzing them. By using them to identify crime hot spots and to present the results spatially, it is possible to allocate optimal resources while providing effective methods for decision-making and preventive solutions. In this paper, we explain and compare some of the methods of hot spot analysis, such as Mode, Fuzzy Mode and Nearest Neighbour Hierarchical spatial clustering (NNH). Then the spots with the highest rates of drug smuggling crime are obtained for one province of Iran bordering Afghanistan. We show that among these three methods, NNH leads to the best result.
Keywords: GIS, hot spots, nearest neighbour hierarchical spatial clustering, NNH, spatial analysis of crime
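Nearest Neighbour Hierarchical clustering can be approximated with single-linkage agglomerative clustering cut at a distance threshold, keeping only clusters above a minimum size, as sketched below; the coordinates, threshold and minimum size are synthetic placeholders.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
hotspot = rng.normal([52.0, 61.0], 0.01, size=(40, 2))           # dense cluster
noise = rng.uniform([51.5, 60.5], [52.5, 61.5], size=(60, 2))    # scattered incidents
points = np.vstack([hotspot, noise])

Z = linkage(points, method="single")           # nearest-neighbour (single) linkage
labels = fcluster(Z, t=0.02, criterion="distance")
min_points = 10                                # minimum cluster size to report
for lab in np.unique(labels):
    members = points[labels == lab]
    if len(members) >= min_points:
        print(f"hot spot: {len(members)} incidents around {members.mean(axis=0)}")
```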
Procedia PDF Downloads 329
23533 The Relation between Learning Styles and English Achievement in the Language Training Centre
Authors: Nurul Yusnita
Abstract:
Many studies have been conducted to help students achieve well in English learning; they range from teaching methods to psychological approaches. One psychological topic in educational research is learning style, which can in some ways affect student achievement. This study aimed to examine four learning styles and their relations to English achievement among students learning English in the Language Training Center of Universitas Muhammadiyah Yogyakarta (LTC UMY). The method of this study was descriptive analytical. The sample consisted of 39 Accounting students in LTC UMY. The data were collected through Likert-scale questionnaires, and achievement was measured by the students' grades. To analyze the questionnaires and to examine the relation between learning styles and student achievement, correlational analysis in the SPSS statistical software was used. The results showed that the visual and auditory styles had the same share of 35.9% (14 students each); 3 students (7.7%) had a kinaesthetic learning style and 8 students (20.5%) had combined visual and auditory styles. Meanwhile, 5 students (12.8%) with a visual learning style increased their grades, and only 1 student (2.5%) with the combined visual and auditory style improved his grade. Besides grade increases, there were also grade decreases: for 3 students (7.7%) with the visual style, 5 students (12%) with the auditory style, 4 students (10.2%) with the combined visual and auditory style, and 1 student (2.5%) with the kinaesthetic style. In conclusion, there was no significant relationship between learning style and English achievement. Most of the good achievers were students with visual and auditory learning styles, and most of them preferred the visual method. The implication is that teachers and material designers could improve their methods through visual materials to achieve effective English teaching and learning.
Keywords: accounting students, English achievement, language training centre, learning styles
Procedia PDF Downloads 271
23532 Developing a Quality Mentor Program: Creating Positive Change for Students in Enabling Programs
Authors: Bianca Price, Jennifer Stokes
Abstract:
Academic and social support systems are critical for students in enabling education; these support systems have the potential to enhance the student experience whilst also serving a vital role in student retention. In the context of international moves toward widening university participation, Australia has developed enabling programs designed to support underrepresented students in accessing higher education. The purpose of this study is to examine the effectiveness of a mentor program based within an enabling course. This study evaluates how the mentor program supports new students in developing social networks, improving retention, and increasing satisfaction with the student experience. Guided by Social Learning Theory (SLT), this study highlights the benefits that can be achieved when students engage in peer-to-peer mentoring for both social and learning support. Whilst traditional peer mentoring programs are heavily based on face-to-face contact, the present study explores the difference between face-to-face mentoring and mentoring that takes place in a virtual space, specifically via a virtual community in the shape of a Facebook group. This paper explores the differences between these two methods of mentoring within an enabling program. The first method involves traditional face-to-face mentoring provided by alumni students who willingly return to the learning community to provide social support and guidance for new students. The second method requires alumni mentor students to voluntarily join a Facebook group specifically designed for enabling students. Using this virtual space, alumni students provide advice, support and social commentary on how to be successful within an enabling program. Whilst vastly different, both of these mentoring approaches provide students with the support tools needed to enhance their student experience and improve their transition into university. To evaluate the impact of each mode, this study uses mixed methods, including a focus group with mentors, in-depth interviews, and netnography of the Facebook group ‘Wall’. Netnography is an innovative qualitative research method used to interpret information that is available online in order to better understand and identify the needs and influences that affect the users of the online space. Through examining the data, this research reflects upon best practice for engaging students in enabling programs. Findings support having both face-to-face and online mentoring available to assist enabling students in making a positive transition into university undergraduate studies.
Keywords: enabling education, mentoring, netnography, social learning theory
Procedia PDF Downloads 121