Search results for: air data system
33465 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models in modeling over-dispersed medical count data with extra variations caused by extra zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB consistently provide a better fit than the ordinary Poisson and negative binomial models for over-dispersed medical count data. In this study, we propose the use of Zero Inflated Inverse Trinomial (ZIIT), Zero Inflated Poisson Inverse Gaussian (ZIPIG) and Zero Inflated Strict Arcsine (ZISA) models in modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that the three suggested models can serve as alternatives in modeling over-dispersed medical count data, as supported by their application to a real-life medical data set. Inverse trinomial, Poisson inverse Gaussian, and strict arcsine are discrete distributions with cubic variance functions of the mean. Therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit
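As a reference point, the two baseline models the abstract benchmarks against can be fitted directly in Python. The sketch below is a minimal, hypothetical example on synthetic counts (the ZIIT, ZIPIG and ZISA distributions are not available in standard libraries); a lower AIC indicates the better-fitting baseline.

```python
import numpy as np
from statsmodels.discrete.count_model import (
    ZeroInflatedPoisson, ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(0)
n = 500
# Synthetic over-dispersed counts with excess zeros (stand-in for medical data)
y = np.where(rng.random(n) < 0.4, 0, rng.negative_binomial(2, 0.3, n))
X = np.ones((n, 1))  # intercept-only model

zip_fit = ZeroInflatedPoisson(y, X).fit(disp=False)
zinb_fit = ZeroInflatedNegativeBinomialP(y, X).fit(disp=False)

# Lower AIC indicates the better-fitting baseline model
print("ZIP  AIC:", zip_fit.aic)
print("ZINB AIC:", zinb_fit.aic)
```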
Procedia PDF Downloads 544
33464 Epileptic Seizures in Patients with Multiple Sclerosis
Authors: Anat Achiron
Abstract:
Background: Multiple sclerosis (MS) is a chronic autoimmune disease that affects the central nervous system in young adults. It involves the immune system attacking the protective covering of nerve fibers (myelin), leading to inflammation and damage. MS can result in various neurological symptoms, such as muscle weakness, coordination problems, and sensory disturbances. Seizures are not common in MS, and their frequency is estimated at between 0.4% and 6.4% over the disease course. Objective: To investigate the frequency of seizures in individuals with multiple sclerosis and to identify associated risk factors. Methods: We evaluated the frequency of seizures in a large cohort of 5686 MS patients followed at the Sheba Multiple Sclerosis Center and studied associated risk factors and comorbidities. Our research was based on data collection using a cohort study design. We applied logistic regression analysis to assess the strength of associations. Results: We found that younger age at onset, longer disease duration, and prolonged time to immunomodulatory treatment initiation were associated with increased risk for seizures. Conclusions: Our findings suggest that seizures in people with MS are directly related to the demyelination process and not associated with other factors like medication side effects or comorbid conditions. Therefore, initiating immunomodulatory treatment early in the disease course could reduce not only disease activity but also seizure risk.
Keywords: epilepsy, seizures, multiple sclerosis, white matter, age
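The association analysis described here is a standard logistic regression. The following sketch fits such a model with statsmodels on hypothetical synthetic predictors named after the reported risk factors (age at onset, disease duration, time to treatment initiation); it is illustrative only and does not use the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
# Hypothetical predictors mirroring the study's reported risk factors
age_at_onset = rng.normal(30, 8, n)
disease_duration = rng.gamma(2.0, 5.0, n)
time_to_treatment = rng.gamma(1.5, 2.0, n)
logit = -4 + 0.05 * disease_duration + 0.1 * time_to_treatment - 0.02 * age_at_onset
seizure = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([age_at_onset, disease_duration, time_to_treatment]))
fit = sm.Logit(seizure.astype(int), X).fit(disp=False)
print(fit.summary(xname=["const", "age_at_onset", "duration", "time_to_dmt"]))
```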
Procedia PDF Downloads 71
33463 Monotone Rational Trigonometric Interpolation
Authors: Uzma Bashir, Jamaludin Md. Ali
Abstract:
This study is concerned with the visualization of monotone data using a piecewise C1 rational trigonometric interpolating scheme. Four positive shape parameters are incorporated in the structure of the rational trigonometric spline. Conditions are derived on two of these parameters to preserve the monotonicity of monotone data, while the other two are left free. Figures are used widely to exhibit that the proposed scheme produces graphically smooth monotone curves.
Keywords: trigonometric splines, monotone data, shape preserving, C1 monotone interpolant
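The paper's rational trigonometric spline is not part of standard numerical libraries, but SciPy ships a comparable C1 shape-preserving (monotone) interpolant, PCHIP, which can serve as a reference point. The sketch below uses made-up monotone data.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Monotone sample data (hypothetical)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 0.1, 0.5, 2.0, 2.1])

interp = PchipInterpolator(x, y)          # C1, monotonicity-preserving
xs = np.linspace(0, 4, 41)
assert np.all(np.diff(interp(xs)) >= 0)   # the interpolant stays monotone
```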
Procedia PDF Downloads 271
33462 A Study of the Interactions between the Inter-City Traffic System and the Spatial Structure Evolution in the Yangtze River Delta from Time and Space Dimensions
Authors: Zhang Cong, Cai Runlin, Jia Fengjiao
Abstract:
The evolution of the urban agglomeration spatial structure requires strong support from the inter-city traffic system, and the inter-city traffic system can not only meet the transportation demand of the urban agglomeration but also guide its economic development. Correctly understanding the relationship between inter-city traffic planning and the urban agglomeration can help the urban agglomeration develop in coordination with the inter-city traffic system. The Yangtze River Delta is one of the most representative urban agglomerations in China, with strong economic vitality, high city levels, diversified urban spatial form, and well-developed transport infrastructure. With the promotion of industrial division in the Yangtze River Delta and the regional travel facilitation brought by inter-city traffic, the urban agglomeration is characterized by rapidly increasing inter-city transportation demand, the urbanization of regional traffic, adjacent regional transportation links breaking administrative boundaries, networked channels, and so on. Therefore, the development of the inter-city traffic system presents new trends and challenges. This paper studies the interactions between the inter-city traffic system and regional economic growth, regional factor flow, and regional spatial structure evolution in the Yangtze River Delta from the two dimensions of time and space. On this basis, the adaptability of the inter-city traffic development mode and the urban agglomeration spatial structure is analyzed. First of all, the coordination between urban agglomeration planning and inter-city traffic planning is judged at the planning level. Secondly, the coordination between inter-city traffic elements and industry and population distributions is judged from the perspective of space. Finally, the coordination of the cross-regional planning and construction of the inter-city traffic system is judged. The conclusions can provide an empirical reference for inter-city traffic planning in the Yangtze River Delta region and other urban agglomerations, and they are also of great significance for optimizing the spatial allocation of urban agglomerations and their overall operational efficiency.
Keywords: evolution, interaction, inter-city traffic system, spatial structure
Procedia PDF Downloads 311
33461 Improving Power Quality in Wind Power Generation System
Authors: A. Omeiri, A. Djellad, P. O. Logerais, O. Riou, J. F. Durastanti
Abstract:
With the growth of electrical energy demand, wind power capacity has experienced tremendous growth in the past decade, thanks to wind power’s environmental benefits. A direct-driven permanent magnet synchronous generator (PMSG) with a full-size back-to-back converter set is one of the promising technologies employed in wind power generation. Wind grid integration brings the problems of voltage fluctuation and harmonic pollution. In the present study, a filter is placed between the wind system and the network to reduce the total harmonic distortion (THD) and enhance power quality during disturbances. The models of the wind turbine, PMSG, power electronic converters and the filter are implemented in the MATLAB/SIMULINK environment.
Keywords: wind energy conversion system, PMSG, PWM, THD, power quality, passive filter
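THD itself reduces to a simple ratio of harmonic to fundamental magnitudes. The sketch below is a minimal NumPy illustration on a synthetic 50 Hz waveform (all values made up), not the paper's MATLAB/SIMULINK model.

```python
import numpy as np

def thd(signal, fs, f0):
    """Total harmonic distortion of `signal` with fundamental f0 (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    fund = spectrum[np.argmin(np.abs(freqs - f0))]
    harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 11)]
    return np.sqrt(np.sum(np.square(harmonics))) / fund

fs, f0 = 10_000, 50                        # sampling rate, grid frequency
t = np.arange(0, 1, 1 / fs)
v = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 5 * f0 * t)
print(f"THD = {100 * thd(v, fs, f0):.1f}%")   # ~5% from the 5th harmonic
```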
Procedia PDF Downloads 648
33460 Building Information Modelling-Based Diminished Reality Visualisation to Facilitate Building Renovation Projects
Authors: Roghieh Eskandari, Ali Motamedi
Abstract:
There is a significant demand for renovation as built assets age. To plan for a desirable and comfortable indoor environment, stakeholders use simulation techniques to assess potential renovation scenarios with innovative designs. Diminished Reality (DR), a technique for visually removing unwanted objects from the real-world scene in real-time, can contribute to renovation design visualization for stakeholders by removing existing structures and assets from the scene. Using DR, the objects to be demolished or changed are visually removed from the scene for a better understanding of the intended design scenarios. This research proposes an integrated system for renovation plan visualization using Building Information Modelling (BIM) data and mixed reality (MR) technologies. It presents a BIM-based DR method that utilizes a textured BIM model of the environment to accurately register the virtual model of the occluded background to the physical world in real-time. This system can facilitate the simulation of the renovation plan by visually diminishing building elements in an indoor environment.
Keywords: diminished reality, building information modelling, mixed reality, stock renovation
Procedia PDF Downloads 114
33459 Optimal Protection Coordination in Distribution Systems with Distributed Generations
Authors: Abdorreza Rabiee, Shahla Mohammad Hoseini Mirzaei
Abstract:
The advantages of distributed generations (DGs) based on renewable energy sources (RESs) lead to a high penetration level of DGs in distribution networks. With the incorporation of DGs in distribution systems, system reliability and security, as well as the voltage profile, are improved. However, the protection of such systems is still challenging. In this paper, the related literature is first reviewed, and then a practical scheme is proposed for the coordination of over-current relays (OCRs) in a distribution system with DGs. The coordination problem is formulated as a nonlinear programming (NLP) optimization problem with the objective function of minimizing the total operating time of the OCRs. The proposed method is studied on a simple test system. The optimization problem is solved by the General Algebraic Modeling System (GAMS) to calculate the optimal time dial setting (TDS) and also the pickup current setting of the OCRs. The results show the effectiveness of the proposed method and its applicability.
Keywords: distributed generation, DG, distribution network, over current relay, OCR, protection coordination, pickup current, time dial setting, TDS
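The same kind of NLP can be posed outside GAMS. The sketch below is a deliberately tiny two-relay stand-in using the IEC standard inverse characteristic; the fault-current multiples, TDS bounds and 0.3 s coordination time interval are assumed values, not the paper's test-system data.

```python
import numpy as np
from scipy.optimize import minimize

# IEC standard inverse characteristic: t = TDS * 0.14 / (M**0.02 - 1)
def op_time(tds, M):
    return tds * 0.14 / (M ** 0.02 - 1)

M_primary, M_backup = 10.0, 5.0    # fault-current multiples seen by each relay
CTI = 0.3                          # required coordination time interval (s)

def total_time(tds):
    return op_time(tds[0], M_primary) + op_time(tds[1], M_backup)

# Backup relay must operate at least CTI seconds after the primary relay
cons = [{"type": "ineq",
         "fun": lambda tds: op_time(tds[1], M_backup) - op_time(tds[0], M_primary) - CTI}]
res = minimize(total_time, x0=[0.5, 0.5], bounds=[(0.05, 1.1)] * 2, constraints=cons)
print("Optimal TDS (primary, backup):", res.x)
```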
Procedia PDF Downloads 138
33458 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels
Authors: Joshua Buli, David Pietrowski, Samuel Britton
Abstract:
Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view, to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow for SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization
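The per-voxel sum described above can be written in a few lines. The sketch below is a serial NumPy toy (random pulses, arbitrary geometry) that makes the parallel structure explicit: each voxel's accumulation is independent of every other voxel, which is exactly what maps onto one GPU thread per voxel.

```python
import numpy as np

def backproject(pulses, pulse_pos, voxels, fs, c=3e8):
    """Sum range-compressed pulse returns at each 3D voxel (serial sketch;
    on a GPU each voxel would map to one thread)."""
    image = np.zeros(len(voxels), dtype=complex)
    for pulse, pos in zip(pulses, pulse_pos):
        rng = np.linalg.norm(voxels - pos, axis=1)       # sensor-to-voxel range
        delay_bin = np.round(2 * rng / c * fs).astype(int)
        valid = delay_bin < pulse.size
        image[valid] += pulse[delay_bin[valid]]          # accumulate reflectivity
    return image

# Toy data: 4 pulses of 100 range bins, 8 voxels (all values invented)
pulses = np.random.randn(4, 100) + 1j * np.random.randn(4, 100)
pulse_pos = np.array([[0, 0, 1000.0]] * 4)
voxels = np.random.rand(8, 3) * 50
print(np.abs(backproject(pulses, pulse_pos, voxels, fs=2e6)))
```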
Procedia PDF Downloads 86
33457 The Influence of Atmospheric Air on the Health of the Population Living in Oil and Gas Production Area in Aktobe Region, Kazakhstan
Authors: Perizat Aitmaganbet, Kerbez Kimatova, Gulmira Umarova
Abstract:
As part of the medical check-up conducted within the framework of this research study, the health status of the population living in the oil-producing regions, namely the Sarkul and Kenkiyak villages in the Aktobe region, was evaluated. With the help of Spearman correlation, the connection between the level of hazardous chemical substances in the atmosphere and the health of the population living in oil and gas producing regions was estimated. Background and objective: The oil and gas resource-extraction industries play an important role in improving the economic conditions of the Republic of Kazakhstan, especially for the oil-producing administrative regions. However, environmental problems may adversely affect the health of people living in those areas. Thus, the aim of the study is to evaluate the exposure to negative environmental factors of the adult population living in Sarkul and Kenkiyak villages, the oil and gas producing areas in the Aktobe region. Methods: A single cross-sectional study was conducted after a medical check-up among the population of Sarkul and Kenkiyak villages. The population consisted of 372 randomly sampled adults (181 males and 191 females). Atmospheric air probes were also taken to measure the level of hazardous chemical substances in the air. Nonparametric Spearman correlation analysis was performed between the mean concentrations of substances exceeding the Maximum Permissible Concentration (MPC) and the classes of newly diagnosed diseases. Selection and analysis of air samples were carried out according to the developed research protocol; the qualitative-quantitative analysis was carried out on the HANK-4 gas analyzer. Findings: The medical examination of the population identified the following diseases: the two dominant classes were diseases of the circulatory and digestive systems, with diseases of the genitourinary system in third place, and the nervous system and diseases of the ear and mastoid process in fourth and fifth places. Moreover, significant pollution of atmospheric air by carbon monoxide (MPC 5.0 mg/m3), benzopyrene (MPC 1 mg/m3), dust (MPC 0.5 mg/m3) and phenol (MPC 0.035 mg/m3) was identified in places. Correlation dependencies between these air pollutants and the diseases of the population were established: diseases of the circulatory system (r = 0.7), ear and mastoid process (r = 0.7), nervous system (r = 0.6) and digestive organs (r = 0.6); between the concentration of carbon monoxide and diseases of the circulatory system (r = 0.6), the digestive system (r = 0.6), the genitourinary system (r = 0.6) and the musculoskeletal system; between nitric oxide and diseases of the digestive system (r = 0.7) and the circulatory system (r = 0.6); and between benzopyrene and diseases of the digestive system (r = 0.6), the genitourinary system (r = 0.6) and the nervous system (r = 0.4). Conclusion: A positive correlation was found between air pollution and the health of the population living in Sarkul and Kenkiyak villages. To enhance the reliability of the results, we intend to continue this study further.
Keywords: atmospheric air, chemical substances, oil and gas, public health
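For reference, the Spearman statistic used throughout these findings is available directly in SciPy. The sketch below runs it on made-up pollutant/incidence pairs, not the study's measurements.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired observations: pollutant level vs. disease incidence
carbon_monoxide = np.array([4.1, 5.3, 6.8, 5.9, 7.2, 6.1, 8.0, 7.5])
circulatory_cases = np.array([12, 15, 21, 18, 24, 19, 27, 25])

rho, p_value = spearmanr(carbon_monoxide, circulatory_cases)
print(f"Spearman r = {rho:.2f}, p = {p_value:.3f}")
```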
Procedia PDF Downloads 114
33456 Instructional Information Resources
Authors: Parveen Kumar
Abstract:
This article discusses institutional information resources. Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message; information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Conceptually, information is the message being conveyed. This concept has numerous other meanings in different contexts. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.
Keywords: institutions, information institutions, information services for mission-oriented institute, pattern
Procedia PDF Downloads 376
33455 The Evaluation of a Mobile Proximity Payment Application through Its Legitimacy and Social Acceptability
Authors: Intissar Abbes, Yousra Hallem, Jean-michel Sahut
Abstract:
The purpose of this research is to explore the legitimacy of a proximity mobile payment (PMP) system by taking into account the social aspects related to its use (social acceptability). We have chosen to focus on the acceptability process of a PMP application (‘Flashplay’), from the first testing to adoption in a service context. This PMP solution is a pilot program developed as part of a global strategy of disintermediation in various sectors (retail, catering, and entertainment). This case is particularly interesting for two reasons: the context and environment are, as in other African countries, suited to the adoption of payment innovations, and it offers the possibility of studying different stages of the social acceptability process of a PMP system. Neo-institutional theory is mobilized to identify the three pillars of legitimacy: cognitive, normative and regulatory. A longitudinal qualitative study was conducted with 27 customers using the PMP service. Results highlighted the importance of the consumption system and the service provider as institutions. Recommendations are thus proposed to PMP service providers to rethink the design and implementation strategies of their PMP systems to ensure their adoption and promote the institutionalization of this type of consumption practice.
Keywords: legitimacy, payment, acceptability, mobility
Procedia PDF Downloads 182
33454 The Attitude of Education College Students Towards Using the Web Portal of the Academic System
Authors: Ibrahim Alhumaidan
Abstract:
As King Saud University believes in the critical role played by technology and its effectiveness in achieving quality, speed of achievement, and ease of follow-up, and in enhancing the undertaking of responsibility, the university is keen on activating its e-services to attain the primary requirements of achievement and perfection. The web portal of the students' academic system is one of the most important practices in the technological and e-transaction aspects. It enables students to carry out their processes – registration, addition, evaluation, viewing their results and scholastic accomplishments, etc. – through the relevant web portal. The aim of this study is to recognize the attitude of College of Education students – as one of King Saud University's colleges – regarding the usage of the academic system web portal, its effectiveness in saving time and effort, and its efficiency in enhancing students' planning skills. The study population is all students of the College of Education at King Saud University, and the sample has been chosen randomly from them. The study tool is a questionnaire designed to learn about students' views on using the web portal, as the researcher used the survey methodology to achieve the aim of the study.
Keywords: web portal, academic system, education faculty, students, planning skills
Procedia PDF Downloads 290
33453 Impact Analysis of Quality Control Practices in Veterinary Diagnostic Labs in Lahore, Pakistan
Authors: Faiza Marrium, Masood Rabbani, Ali Ahmad Sheikh, Muhammad Yasin Tipu Javed Muhammad, Sohail Raza
Abstract:
More than 75% of the diseases that have spread in the human population globally in the past 10 years are linked to the veterinary sector. Veterinary diagnostic labs are a powerful ally for the diagnosis, prevention and monitoring of animal diseases in any country. In order to avoid the detrimental effects of errors in disease diagnosis and biorisk management, there is a dire need to establish a quality control system. In the current study, 3 private- and 6 public-sector veterinary diagnostic labs were selected for the survey. A questionnaire based on the biorisk management guidelines of CWA 15793 was designed to find quality control breaches in lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring and customer care. The data were analyzed statistically through frequency distributions using SPSS version 18.0. A non-significant difference was found in all parameters of lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring and customer care, with average percentages of 46.6, 57.77, 52.7, 55.5, 54.44, 48.88 and 60, respectively. A non-significant difference among all nine labs was found, with the average compliance percentages across all parameters, from highest to lowest, being Lab 2 (78.13), Lab 3 (70.56), Lab 5 (57.51), Lab 6 (56.37), Lab 4 (55.02), Lab 9 (49.58), Lab 7 (47.76), Lab 1 (41.01) and Lab 8 (36.09). This study shows that in the Lahore district, veterinary diagnostic labs are not giving proper attention to the quality of their systems, and there is no significant difference between the setups of private- and public-sector laboratories. These results show that most parameters fall between 50 and 80 percent, which needs work and improvement as per WHO criteria.
Keywords: veterinary lab, quality management system, accreditation, regulatory body, disease identification
Procedia PDF Downloads 146
33452 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data
Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy
Abstract:
This document constitutes a resumption of work carried out in the field of complex data warehouses (DW) relating to the management and formalization of knowledge and metadata. It offers a methodological approach for integrating two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of the technique of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications, which will lead to gains in the performance of a complex DW. Three essential aspects of this work are expected: the representation of knowledge in description logics, the translation of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications, and the use of XML as a pivot. The field of application is large but will be adapted to systems with heterogeneous, complex and unstructured content, and moreover requiring a great (re)use of knowledge, such as medical data warehouses.
Keywords: data warehouse, description logics, integration, knowledge, metadata
Procedia PDF Downloads 138
33451 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System
Authors: Hassan Qandil
Abstract:
Using a statistical algorithm implemented in MATLAB, four types of non-imaging Fresnel lenses are designed: spot-flat, linear-flat, dome-shaped and semi-cylindrical-shaped. The optimization employs a statistical ray-tracing methodology for the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the poly-methyl-methacrylate (PMMA) prisms, the main target is to maximize the ray intensity on the receiver's aperture and therefore achieve higher values of heat flux. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to the COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens' and the receiver's apertures while setting conditions as per the Dallas, TX weather data. Once the lenses' characterization is finalized, receivers are designed based on the optimized aperture size. Several cavity shapes, including triangular, arc-shaped and trapezoidal, are tested while coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL while coupled with the optimized lens. A lab-scale prototype of the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including that of a photovoltaic-thermal cogeneration system and a solar furnace system. Finally, some future research work is pointed out, including the coupling of the collector-receiver system with an end-user power generator, and the use of a multi-layered genetic algorithm for comparative studies.
Keywords: COMSOL, concentrator, energy, fresnel, optics, renewable, solar
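At the core of any such ray tracer is refraction at each prism facet. The sketch below is a minimal vector-form Snell's law step in Python (not the paper's MATLAB code), with an assumed PMMA refractive index of about 1.49 and a made-up incidence angle.

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Snell's law in vector form; returns None on total internal reflection."""
    i = incident / np.linalg.norm(incident)
    cos_i = -np.dot(normal, i)                     # normal points against the ray
    sin2_t = (n1 / n2) ** 2 * (1 - cos_i ** 2)
    if sin2_t > 1:
        return None                                # total internal reflection
    return (n1 / n2) * i + (n1 / n2 * cos_i - np.sqrt(1 - sin2_t)) * normal

# Ray hitting a PMMA prism facet (n ~ 1.49) from air at 20 degrees
theta = np.radians(20)
ray_in = np.array([np.sin(theta), -np.cos(theta)])
out = refract(ray_in, np.array([0.0, 1.0]), 1.0, 1.49)
print("refracted direction:", out)
```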
Procedia PDF Downloads 155
33450 Data Analytics in Energy Management
Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair
Abstract:
With increasing energy costs and their impact on business, sustainability has today evolved from a social expectation into an economic imperative. Therefore, finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut costs. However, energy management has been a challenge because it requires a change in old habits and in legacy systems followed for decades. Today, exorbitant levels of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. It must be noted that for quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets, but is also instrumental in the transformation from old approaches to energy management to new ones. This in turn assists in effective decision-making for implementation. Organizations require an established corporate strategy for reducing operational costs through visibility and optimization of energy usage. Energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is extensively used today in different scenarios, such as reducing operational costs, predicting energy demands, optimizing network efficiency, asset maintenance, improving customer insights and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.
Keywords: energy analytics, energy management, operational data, business intelligence, optimization
Procedia PDF Downloads 364
33449 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data
Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model is widely used for frequent itemset mining over data streams due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is based on a fixed number of transactions. This model effectively supposes that all transactions arrive at a constant rate, which is not suited to real-time applications, and its use in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent and infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.
Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query
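To make the distinction concrete, the sketch below implements a toy timestamp-based window: transactions expire by age rather than by count, so the window naturally holds more transactions during bursts and fewer during lulls. The class name, window length and support threshold are all made up, and it tracks only 1- and 2-itemsets with a flat counter rather than the paper's tree structure.

```python
from collections import Counter, deque
from itertools import combinations

class TimestampWindowMiner:
    """Maintain itemset counts over a timestamp-based sliding window."""
    def __init__(self, window_seconds, min_support):
        self.window, self.min_support = window_seconds, min_support
        self.buffer = deque()            # (timestamp, transaction)
        self.counts = Counter()

    def _count(self, txn, delta):
        for size in (1, 2):              # 1- and 2-itemsets only, for brevity
            for itemset in combinations(sorted(txn), size):
                self.counts[itemset] += delta

    def add(self, timestamp, txn):
        self.buffer.append((timestamp, txn))
        self._count(txn, +1)
        while self.buffer and self.buffer[0][0] <= timestamp - self.window:
            _, old = self.buffer.popleft()   # expire by age, not by count
            self._count(old, -1)

    def frequent(self):
        return {s: c for s, c in self.counts.items() if c >= self.min_support}

miner = TimestampWindowMiner(window_seconds=10, min_support=2)
for ts, txn in [(1, {"a", "b"}), (3, {"a", "c"}), (5, {"a", "b"}), (9, {"b"})]:
    miner.add(ts, txn)
print(miner.frequent())    # e.g. {('a',): 3, ('b',): 3, ('a', 'b'): 2}
```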
Procedia PDF Downloads 161
33448 Disaster Management Using Wireless Sensor Networks
Authors: Akila Murali, Prithika Manivel
Abstract:
Disasters are defined as a serious disruption of the functioning of a community or a society involving widespread human, material, economic or environmental impacts. The number of people suffering food crises as a result of natural disasters has tripled in the last thirty years. The economic losses due to natural disasters have increased by a factor of eight over the past four decades, caused by the increased vulnerability of global society and also by an increase in the number of weather-related disasters. Efficient disaster detection and alerting systems could reduce the loss of life and property. In the event of a disaster, another important issue is a good search and rescue system with high levels of precision, timeliness and safety for both the victims and the rescuers. Wireless Sensor Network technology has the capability of quickly capturing, processing, and transmitting critical data in real-time with high resolution. This paper studies the capacity of sensors and a Wireless Sensor Network to collect, collate and analyze valuable and worthwhile data in an ordered manner to help with disaster management.
Keywords: alerting systems, disaster detection, Ad Hoc network, WSN technology
Procedia PDF Downloads 404
33447 Design of New Alloys from Al-Ti-Zn-Mg-Cu System by in situ Al3Ti Formation
Authors: Joao Paulo De Oliveira Paschoal, Andre Victor Rodrigues Dantas, Fernando Almeida Da Silva Fernandes, Eugenio Jose Zoqui
Abstract:
With the adoption of high-pressure die-casting technologies for the production of automotive bodies in the well-known giga castings, the technology of processing metal alloys in the semi-solid state (SSM) becomes interesting because it allows for higher product quality, such as lower porosity and fewer shrinkage voids. However, the alloys currently processed are derived from the foundry industry and are based on the Al-Si-(Cu-Mg) system. High-strength alloys, such as those of the Al-Zn-Mg-Cu system, are not usually processed, but the benefits of using this system, which is susceptible to heat treatments, can be combined with the advantages obtained by processing in the semi-solid state, promoting new possibilities for production routes and improving product performance. The current work proposes a new range of alloys to be processed in the semi-solid state through the modification of aluminum alloys of the Al-Zn-Mg-Cu system by the in-situ formation of the Al3Ti intermetallic. These alloys presented the thermodynamic stability required for semi-solid processing, with a sensitivity below 0.03 °C⁻¹ over a wide temperature range. Furthermore, these alloys presented high hardness after aging heat treatment, reaching 190 HV. Therefore, they are excellent candidates for the manufacture of parts that require low levels of defects and high mechanical strength.
Keywords: aluminum alloys, semisolid metals processing, intermetallics, heat treatment, titanium aluminide
Procedia PDF Downloads 13
33446 The Extent of Big Data Analysis by the External Auditors
Authors: Iyad Ismail, Fathilatul Abdul Hamid
Abstract:
This research mainly investigated the extent of big data analysis by external auditors. The paper adopts grounded theory as a framework for conducting a series of semi-structured interviews with eighteen external auditors. The research findings cover the extent of big data availability and of big data analysis usage by external auditors in the Gaza Strip, Palestine. The study's outcomes lead to a series of auditing procedures intended to improve external auditing techniques, which in turn lead to a high-quality audit process. This research is also valuable for auditing firms, giving an insight into their mechanisms and identifying the most important strategies that help in achieving competitive audit quality. The results aim to instruct auditing academic and professional institutions in developing big data analysis techniques for external auditors. The paper provides appropriate information for the decision-making process and a source of future information that affects technological auditing.
Keywords: big data analysis, external auditors, audit reliance, internal audit function
Procedia PDF Downloads 70
33445 Magnetic Navigation in Underwater Networks
Authors: Kumar Divyendra
Abstract:
Underwater Sensor Networks (UWSNs) have wide applications in areas such as water quality monitoring, marine wildlife management, etc. A typical UWSN system consists of a set of sensors deployed randomly underwater which communicate with each other using acoustic links. RF communication does not work underwater, and GPS is not available underwater either. Additionally, Automated Underwater Vehicles (AUVs) are deployed to collect data from some special nodes called Cluster Heads (CHs). These CHs aggregate data from their neighboring nodes and forward it to the AUVs using optical links when an AUV is in range. This helps reduce the number of hops covered by data packets and helps conserve energy. We consider the three-dimensional model of the UWSN. Nodes are initially deployed randomly underwater. They attach themselves to the surface using a rod and can only move upwards or downwards using a pump-and-bladder mechanism. We use graph theory concepts to maximize the coverage volume while every node maintains connectivity with at least one surface node. We treat the surface nodes as landmarks, and each node finds its hop distance from every surface node. We treat these hop distances as coordinates and use them for AUV navigation: an AUV intending to move closer to a node with given coordinates moves hop by hop through nodes that are closest to it in terms of these coordinates. In the absence of GPS, multiple different approaches, such as Inertial Navigation Systems (INS), Doppler Velocity Log (DVL), computer vision-based navigation, etc., have been proposed, but these systems have their own drawbacks: INS accumulates error with time, and vision techniques require prior information about the environment. We propose a method that makes use of the earth's magnetic field values for navigation and combines it with other methods that simultaneously increase the coverage volume under the UWSN. The AUVs are fitted with magnetometers that measure the magnetic intensity (I), horizontal inclination (H), and declination (D). The International Geomagnetic Reference Field (IGRF) is a mathematical model of the earth's magnetic field, which provides the field values for geographical coordinates on earth. Researchers have developed an inverse deep learning model that takes the magnetic field values and predicts the location coordinates; we make use of this model within our work. We combine this with the hop-by-hop movement described earlier so that the AUVs move in a sequence that trains the deep learning predictor as quickly and precisely as possible. We run simulations in MATLAB to prove the effectiveness of our model with respect to other methods described in the literature.
Keywords: clustering, deep learning, network backbone, parallel computing
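The hop-distance coordinate idea can be illustrated with a small graph. The sketch below builds a toy eight-node topology with networkx (the topology and node numbering are invented), assigns each node a coordinate vector of hop counts to the surface landmarks, and takes one greedy hop-by-hop step toward a target coordinate.

```python
import networkx as nx

# Toy UWSN: nodes 0-2 are surface nodes (landmarks), the rest are submerged
G = nx.Graph([(0, 3), (1, 3), (1, 4), (2, 5), (3, 6), (4, 6), (5, 6), (6, 7)])
landmarks = [0, 1, 2]

def hop_coordinates(G, landmarks, node):
    """A node's coordinate = vector of hop distances to each surface landmark."""
    return tuple(nx.shortest_path_length(G, node, lm) for lm in landmarks)

coords = {n: hop_coordinates(G, landmarks, n) for n in G}
print(coords[7])   # hop distances from node 7 to surface nodes 0, 1, 2

# Greedy hop-by-hop AUV step toward a target coordinate
def next_hop(G, current, target_coord):
    dist = lambda n: sum(abs(a - b) for a, b in zip(coords[n], target_coord))
    return min(G[current], key=dist)

print(next_hop(G, 7, coords[4]))   # neighbor of 7 closest to node 4's coordinate
```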
Procedia PDF Downloads 98
33444 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases
Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar
Abstract:
The farming community in India, as in other parts of the world, is one of the most highly stressed communities, due to reasons such as increasing input costs (the cost of seeds, fertilizers, and pesticides), droughts, and reduced revenue, leading to farmer suicides. The lack of an integrated farm advisory system in India adds to farmers' problems. Farmers need the right information during the early stages of the crop's lifecycle to prevent damage and loss of revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases using images taken by farmers with their smartphones. The research work leads to building a smart assistant, using analytics and big data, which could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNNs) has been successfully used to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers and dropouts (to avoid overfitting). The models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to modify the weights of parameters learnt on the ImageNet dataset and apply them to crop diseases, which reduces the number of epochs needed for learning. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift and blurring are used to improve accuracy with images taken from farms. Models built using a combination of these techniques are more robust for deployment in the real world. Our model is validated using the tomato crop. In India, tomato is affected by 10 different diseases, and our model achieves an accuracy of more than 95% in correctly classifying them. The main contribution of our research is to create a personal assistant for farmers for managing plant disease; although the model was validated using the tomato crop, it can be easily extended to other crops. The advancement of technology in computing and the availability of large data sets have made possible the success of deep learning applications in computer vision, natural language processing, image recognition, etc. With these robust models and huge smartphone penetration, the feasibility of implementing these models is high, resulting in timely advice to farmers and thus increasing farmers' income and reducing input costs.
Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning
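The building blocks named here (transfer learning from ImageNet, augmentation, dropout, a softmax head) assemble directly in Keras. The sketch below is a hypothetical configuration, not the authors' network: the MobileNetV2 backbone, input size, augmentation strengths and 10-class head are assumptions chosen to mirror the 10 tomato diseases mentioned.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Transfer learning: ImageNet weights, new head for the disease classes
base = tf.keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                         input_shape=(224, 224, 3))
base.trainable = False                       # reuse ImageNet features

model = models.Sequential([
    layers.RandomRotation(0.1),              # rotation augmentation
    layers.RandomZoom(0.1),                  # zoom augmentation
    layers.RandomTranslation(0.1, 0.1),      # shift augmentation
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),                     # dropout to limit overfitting
    layers.Dense(10, activation="softmax"),  # e.g. 10 tomato disease classes
])
model.build((None, 224, 224, 3))
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```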
Procedia PDF Downloads 119
33443 Knowledge, Attitudes and Readiness of Students towards Higher Order Thinking Skills
Authors: Mohd Aderi Che Noh, Tuan Rahayu Tuan Lasan
Abstract:
Higher order thinking skills (HOTS) are important in the Malaysian education system to produce a knowledgeable generation able to think critically and creatively in order to face the challenges of the future. The educational challenges of the 21st century require all students to have HOTS. Therefore, this study aims to identify the level of knowledge, attitude and readiness of students towards HOTS. The respondents were 127 Form Four students from schools in the Federal Territory of Putrajaya. This study is a quantitative survey using a questionnaire to collect data. Data were analyzed using the Statistical Package for the Social Sciences (SPSS) 23.0. The results showed that the knowledge, attitudes and readiness of students towards HOTS were at a high level. Inferential analysis showed that there was a significant relationship between knowledge and both attitude and readiness towards HOTS. This study provides information to schools and teachers to improve teaching and learning, to increase students' HOTS, and to fulfil the hope of the Ministry of Education to produce human capital that can be globally competitive.
Keywords: high order thinking skills, teaching, education, Malaysia
Procedia PDF Downloads 212
33442 Optimized Deep Learning-Based Facial Emotion Recognition System
Authors: Erick C. Valverde, Wansu Lim
Abstract:
Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress), as well as better human social interactions with smart technologies. The FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies the human expression into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. This system requires intensive research to address issues with human diversity, various unique human expressions, and the variety of human facial features due to age differences. These issues generally affect the ability of the FER system to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms like K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy due to their inability to extract significant features of the various human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, like convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational costs of a DL-based FER system are often disregarded in exchange for higher accuracy results. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to enable lower computational costs and reduce memory usage, respectively. To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to incorporate advanced optimization techniques into the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient and real-time FER system.
Keywords: deep learning, face detection, facial emotion recognition, network optimization methods
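Pruning and post-training quantization of an Xception backbone can be prototyped with TensorFlow and the TensorFlow Model Optimization toolkit. The sketch below is an assumed configuration (50% constant sparsity on the head, default TFLite quantization, a 7-class softmax output), not the authors' pipeline, and it omits the Dlib face-detection stage and the DL compiler step.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Xception backbone with a pruned 7-class emotion head (angry ... neutral)
base = tf.keras.applications.Xception(include_top=False, weights="imagenet",
                                      input_shape=(299, 299, 3), pooling="avg")
pruned_head = tfmot.sparsity.keras.prune_low_magnitude(
    tf.keras.layers.Dense(7, activation="softmax"),
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0))
model = tf.keras.Sequential([base, pruned_head])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Training would go here, with the tfmot UpdatePruningStep() callback.

# Strip pruning wrappers, then post-training quantization via TFLite
export_model = tfmot.sparsity.keras.strip_pruning(model)
converter = tf.lite.TFLiteConverter.from_keras_model(export_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()
print(f"Quantized model size: {len(tflite_bytes) / 1e6:.1f} MB")
```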
Procedia PDF Downloads 118
33441 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, diagnosing fiber-optic quality and faults in real-time has attracted wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method, combined with Empirical Mode Decomposition (EMD), is applied to denoise the optical time domain reflectometer (OTDR) signal. Then a method based on the Gabor representation is used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed LabVIEW-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
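The wavelet-threshold stage, on its own, is a few lines with PyWavelets. The sketch below applies universal soft thresholding to a synthetic OTDR-like decay trace (all signal parameters invented); the EMD stage the paper combines with it is omitted.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

# Synthetic OTDR-like trace: exponential decay plus a reflection event and noise
x = np.linspace(0, 1, 2048)
clean = np.exp(-3 * x) + 0.3 * (x > 0.6) * np.exp(-40 * (x - 0.6))
noisy = clean + 0.05 * np.random.randn(x.size)
denoised = wavelet_denoise(noisy)[: x.size]
print("residual noise std:", np.std(denoised - clean))
```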
Procedia PDF Downloads 123
33440 Investigation of Complexity Dynamics in a DC Glow Discharge Magnetized Plasma Using Recurrence Quantification Analysis
Authors: Vramori Mitra, Bornali Sarma, Arun K. Sarma
Abstract:
Recurrence is a ubiquitous feature of any real dynamical system: the states in the phase space trajectory of a system have an inherent tendency to return to the same state, or a state close to it, after a certain time lapse. The recurrence quantification analysis technique, based on this fundamental feature of dynamical systems, detects the evolution of states under variation of a control parameter of the system. The paper presents an investigation of the nonlinear dynamical behavior of plasma floating potential fluctuations, obtained by using a Langmuir probe, in different magnetic fields under variation of the discharge voltage. The main measures of recurrence quantification analysis considered are determinism, linemax and entropy. The increase of the DET and linemax variables indicates that the predictability and periodicity of the system are increasing. The linemax variable indicates that the chaoticity diminishes as the magnetic field drops, while increasing the magnetic field enhances the chaotic behavior. The fractal property of the plasma time series, estimated by the detrended fluctuation analysis (DFA) technique, reflects that the long-range correlation of the plasma fluctuations decreases while the fractal dimension increases with the enhancement of the magnetic field, which corroborates the RQA analysis.
Keywords: detrended fluctuation analysis, chaos, phase space, recurrence
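The determinism (DET) measure referred to here can be computed from a binary recurrence plot. The sketch below is a simplified NumPy illustration on a 1-D periodic signal (no time-delay embedding, arbitrary threshold), where DET is the fraction of recurrent points falling on diagonal lines of length at least two.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when states i and j are within eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    diag_points, all_points = 0, R.sum() - np.trace(R)   # exclude the main diagonal
    for k in range(1, n):                                # scan each off-diagonal
        line = 0
        for v in np.diagonal(R, k):
            line = line + 1 if v else 0
            if v and line == lmin:
                diag_points += 2 * lmin                  # count both symmetric halves
            elif v and line > lmin:
                diag_points += 2
    return diag_points / all_points if all_points else 0.0

x = np.sin(np.linspace(0, 20 * np.pi, 400))              # periodic signal -> high DET
R = recurrence_matrix(x, eps=0.1)
print(f"DET = {determinism(R):.2f}")
```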
Procedia PDF Downloads 328
33439 Silicon-Photonic-Sensor System for Botulinum Toxin Detection in Water
Authors: Binh T. T. Nguyen, Zhenyu Li, Eric Yap, Yi Zhang, Ai-Qun Liu
Abstract:
Silicon photonic sensor systems are an emerging class of analytical technologies that use the evanescent field to sensitively measure slight differences in the surrounding environment. The wavelength shift induced by a local refractive index change is used as the indicator in the system. These devices can serve as sensors for a wide variety of chemical or biomolecular detection tasks in clinical and environmental fields. In our study, a system including a silicon-based micro-ring resonator, a microfluidic channel, and optical processing is designed and fabricated for biomolecule detection. The system is demonstrated to detect Clostridium botulinum type A neurotoxin (BoNT) in different water sources. BoNT is one of the most toxic substances known and is relatively easily obtained from a cultured bacterial source. The toxin is extremely lethal, with an LD50 of about 0.1 µg/70 kg intravenously, 1 µg/70 kg by inhalation, and 70 µg/kg orally. These factors make botulinum neurotoxins primary candidates as bioterrorism or biothreat agents, so a sensing system is required that can detect BoNT in a short time, with high sensitivity, and automatically. For BoNT detection, the silicon-based micro-ring resonator is modified with a linker for the immobilization of the anti-botulinum capture antibody. An enzymatic reaction is employed to amplify the signal and hence gain sensitivity. As a result, a detection limit of 30 pg/mL is achieved by our silicon photonic sensor within a short period of 80 min. The sensor also shows high specificity versus other botulinum types. In the future, by designing a multifunctional waveguide array with a fully automatic control system, it will be simple to simultaneously detect multiple biomolecules at low concentrations within a short period. The system has great potential for online, real-time and highly sensitive label-free biomolecular rapid detection.
Keywords: biotoxin, photonic, ring resonator, sensor
Procedia PDF Downloads 117
33438 An Intelligent Thermal-Aware Task Scheduler in Multiprocessor System on a Chip
Authors: Sina Saadati
Abstract:
Multiprocessor systems-on-chip (MPSoCs) are widely used in modern computers to execute sophisticated software and applications. These systems include different processors for distinct aims. Most of the proposed task schedulers attempt to improve energy consumption; in some schedulers, the processor temperature is considered in order to increase the system's reliability and performance. In this research, we propose a new method for thermal-aware task scheduling based on an artificial neural network (ANN). This method enables us to consider a variety of factors in the scheduling process. Factors like ambient temperature, season (which is important for some embedded systems), processor speed, and the computing type of tasks have a complex relationship with the final temperature of the system; this issue can be solved using a machine learning algorithm. Another point is that our solution makes the system intelligent, so that it can be adaptive. We have also shown that the computational complexity of the proposed method is low. As a consequence, it is also suitable for battery-powered systems.
Keywords: task scheduling, MPSoC, artificial neural network, machine learning, architecture of computers, artificial intelligence
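A minimal version of such a temperature predictor can be trained with scikit-learn. In the sketch below, every feature, the synthetic "ground truth" temperature model, and the candidate-core table are invented for illustration; the scheduler simply assigns the task to the core with the lowest predicted temperature.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n = 2000
# Features: ambient temp (C), season code, clock speed (GHz), task type code
X = np.column_stack([rng.uniform(10, 40, n), rng.integers(0, 4, n),
                     rng.uniform(0.8, 3.0, n), rng.integers(0, 3, n)])
# Hypothetical ground truth: core temperature rises with ambient and frequency
y = 20 + 0.9 * X[:, 0] + 8 * X[:, 2] + 3 * X[:, 3] + rng.normal(0, 1.5, n)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:1500], y[:1500])
print("held-out R^2:", model.score(X[1500:], y[1500:]))

# Scheduler sketch: among candidate cores, pick the lowest predicted temperature
candidates = np.array([[25, 1, 1.2, 0], [25, 1, 2.8, 0], [25, 1, 2.0, 0]])
print("chosen core:", np.argmin(model.predict(candidates)))
```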
Procedia PDF Downloads 103
33437 A Model of Teacher Leadership in History Instruction
Authors: Poramatdha Chutimant
Abstract:
The objective of the research was to propose a model of teacher leadership in history instruction for utilization. Everett M. Rogers' Diffusion of Innovations theory is applied as the theoretical framework. A qualitative method is used in the study, with an interview protocol as the instrument to collect primary data from best-practice teachers awarded by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as the secondary data used to support the summarizing process (content analysis). A dendrogram is key to interpreting and synthesizing the primary data, with the secondary data serving a supportive role in explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The focal point is finally to validate a draft model in terms of future utilization.
Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership
Procedia PDF Downloads 280
33436 Soft Computing Approach for Diagnosis of Lassa Fever
Authors: Roseline Oghogho Osaseri, Osaseri E. I.
Abstract:
Lassa fever is an epidemic hemorrhagic fever caused by the Lassa virus, an extremely virulent arenavirus. This highly fatal disorder kills 10% to 50% of its victims, but those who survive its early stages usually recover and acquire immunity to secondary attacks. One of the major challenges in giving proper treatment is the lack of fast and accurate diagnosis, due to the multiplicity of symptoms associated with the disease, which can resemble other clinical conditions and make early diagnosis difficult. This paper proposes an Adaptive Neuro-Fuzzy Inference System (ANFIS) for the prediction of Lassa fever. In the design of the diagnostic system, four main attributes were considered as the input parameters, with one output parameter for the system. The input parameters are Temperature on Admission (TA), White Blood Count (WBC), Proteinuria (P) and Abdominal Pain (AP). Sixty-one percent of the datasets were used in training the system, while fifty-nine were used in testing. Experimental results from this study gave a reliable and accurate prediction of Lassa fever when compared with clinically confirmed cases. In this study, we have proposed a Lassa fever diagnostic system to aid surgeons and medical healthcare practitioners in health care facilities who do not have ready access to Polymerase Chain Reaction (PCR) diagnosis to predict possible Lassa fever infection.
Keywords: anfis, lassa fever, medical diagnosis, soft computing
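The fuzzy-rule side of an ANFIS can be prototyped with scikit-fuzzy. The sketch below is a hand-written Mamdani-style toy using two of the four inputs; the membership ranges, rules and risk scale are all assumptions for illustration, whereas a real ANFIS would learn them from training data.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Two of the study's four inputs, with hypothetical membership ranges
temp = ctrl.Antecedent(np.arange(36.0, 42.1, 0.1), "temperature")
pain = ctrl.Antecedent(np.arange(0, 11, 1), "abdominal_pain")
risk = ctrl.Consequent(np.arange(0, 101, 1), "lassa_risk")

temp["normal"] = fuzz.trimf(temp.universe, [36.0, 36.8, 37.8])
temp["high"] = fuzz.trimf(temp.universe, [37.5, 39.5, 42.0])
pain["mild"] = fuzz.trimf(pain.universe, [0, 0, 5])
pain["severe"] = fuzz.trimf(pain.universe, [4, 10, 10])
risk["low"] = fuzz.trimf(risk.universe, [0, 0, 50])
risk["high"] = fuzz.trimf(risk.universe, [50, 100, 100])

rules = [ctrl.Rule(temp["high"] & pain["severe"], risk["high"]),
         ctrl.Rule(temp["normal"] | pain["mild"], risk["low"])]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["temperature"] = 39.8
sim.input["abdominal_pain"] = 8
sim.compute()
print(f"predicted Lassa fever risk: {sim.output['lassa_risk']:.0f}/100")
```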
Procedia PDF Downloads 269