Search results for: and model-based techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6754


3994 Impact Assessment of Tropical Cyclone Hudhud on Visakhapatnam, Andhra Pradesh

Authors: Vivek Ganesh

Abstract:

Tropical cyclones are among the most damaging natural events. They occur in yearly cycles and affect coastal populations with three dangerous effects: heavy rain, strong wind and storm surge. In order to estimate the area and the population affected by a cyclone, all three types of physical impact must be taken into account. Storm surge is an abnormal rise of water above the astronomical tide, generated by strong winds and a drop in atmospheric pressure. The main aim of the study is to identify the impact by comparing data from three different months. The technique used here is NDVI classification for change detection, together with other techniques such as storm surge modelling for estimating the tide height. The current study emphasizes the recent very severe cyclonic storm Hudhud, a category 3 hurricane, which developed on 8 October 2014 and hit the coast on 12 October 2014, causing significant changes to the land and coast of Visakhapatnam, Andhra Pradesh. In the present study, we have used Remote Sensing and GIS tools for investigating and quantifying the changes in vegetation and settlement.
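The NDVI change-detection step described in the abstract can be sketched as follows. This is a minimal illustration, not the study's implementation: the per-pixel band values and the 0.2 change threshold are assumptions for demonstration only.

```python
# NDVI = (NIR - Red) / (NIR + Red); pre/post-cyclone maps are differenced
# to flag pixels with vegetation loss. Toy 1-D "images" stand in for rasters.

def ndvi(red, nir):
    """NDVI for one pixel, defined as 0 when both bands are 0."""
    return 0.0 if (nir + red) == 0 else (nir - red) / (nir + red)

def change_map(red_before, nir_before, red_after, nir_after, threshold=0.2):
    """Flag pixels whose NDVI dropped by more than `threshold` (vegetation loss)."""
    flags = []
    for rb, nb, ra, na in zip(red_before, nir_before, red_after, nir_after):
        flags.append(ndvi(rb, nb) - ndvi(ra, na) > threshold)
    return flags

# Toy 4-pixel scene: healthy vegetation before, pixel 1 damaged after.
red_b, nir_b = [0.1, 0.1, 0.2, 0.3], [0.6, 0.5, 0.6, 0.4]
red_a, nir_a = [0.1, 0.4, 0.2, 0.3], [0.6, 0.4, 0.6, 0.4]
damage = change_map(red_b, nir_b, red_a, nir_a)
```

In practice the same differencing would run over every pixel of the pre- and post-cyclone satellite scenes.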

Keywords: inundation map, NDVI map, storm tide map, track map

Procedia PDF Downloads 268
3993 Challenges and Prospects of Small and Medium Scale Enterprises in Somolu Local Government Area

Authors: A. A. Akharayi, B. E. Anjola

Abstract:

The economic development of a country depends greatly on internally generated revenue. Small and Medium-scale Enterprises (SMEs) contribute to economic buoyancy as they provide employment for the teeming population and encourage job creation, both by youths who believe in themselves and by others who have gathered enough finance to invest in growable ventures. SMEs, however, face several challenges. The study investigates the role and challenges of SMEs in Somolu Local Government Area. Simple random sampling techniques were used to select entrepreneurs (SME owners and managers). One hundred and fifty (150) registered SMEs were selected across the LGA for data collection with the use of a well-structured questionnaire. The data collected were analysed using the Statistical Package for the Social Sciences (SPSS) version 21. The result of the analysis indicated that marketing, finance, social facilities and indiscriminate taxes were among the main challenges; a high level of available funds significantly (p < 0.05) increased firm capacity, while marketing showed a significant (p < 0.05) relationship with profit level.

Keywords: challenge, development, economic, small and medium scale enterprise

Procedia PDF Downloads 243
3992 Robust Medical Image Watermarking Using Frequency Domain and Least Significant Bits Algorithms

Authors: Volkan Kaya, Ersin Elbasi

Abstract:

Watermarking and steganography have been gaining importance recently because of copyright protection and authentication. In watermarking, we embed a stamp, logo, noise or image into multimedia elements such as images, video, audio, animation and text. Several works have been done in watermarking for different purposes. In this research work, we used watermarking techniques to embed patient information into medical magnetic resonance (MR) images. Two classes of methods have been used: frequency domain (Discrete Wavelet Transform-DWT, Discrete Cosine Transform-DCT, and Discrete Fourier Transform-DFT) and spatial domain (Least Significant Bits-LSB). Experimental results show that embedding in the frequency domain resists one group of attacks, while embedding in the spatial domain resists another group of attacks. Peak Signal-to-Noise Ratio (PSNR) and Similarity Ratio (SR) are the two measurement values used for testing. These two values give very promising results for information hiding in medical MR images.
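The spatial-domain (LSB) embedding mentioned in the abstract can be sketched as follows. This is an illustrative toy, not the paper's implementation: the 8-bit grayscale pixel row and the sample patient string are assumptions.

```python
# LSB watermarking: each bit of the patient record replaces the least
# significant bit of one pixel, so no pixel changes by more than 1.

def embed_lsb(pixels, message_bits):
    """Replace the LSB of each pixel with one message bit."""
    out = list(pixels)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_lsb(pixels, n_bits):
    return [p & 1 for p in pixels[:n_bits]]

def to_bits(text):
    return [(ord(c) >> k) & 1 for c in text for k in range(7, -1, -1)]

def from_bits(bits):
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int("".join(map(str, b)), 2)) for b in chunks)

cover = list(range(100, 180))    # toy 80-pixel "MR image" row
bits = to_bits("Patient #A")     # hypothetical patient record, 80 bits
stego = embed_lsb(cover, bits)
recovered = from_bits(extract_lsb(stego, len(bits)))
```

Because only the lowest bit of each pixel is touched, the PSNR of the stego image against the cover stays very high, which is why LSB methods are imperceptible but fragile to certain attacks.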

Keywords: watermarking, medical image, frequency domain, least significant bits, security

Procedia PDF Downloads 288
3991 Using Artificial Vision Techniques for Dust Detection on Photovoltaic Panels

Authors: Gustavo Funes, Eduardo Peters, Jose Delpiano

Abstract:

It is widely known that photovoltaic technology has been massively deployed over the last decade despite its low efficiency ratio. Dust deposition reduces this efficiency even further, lowering the energy production and module lifespan. In this work, we developed an artificial vision algorithm based on the CIELAB color space to identify dust over panels in an autonomous way. We performed several experiments photographing three different types of panels: 30 W, 340 W and 410 W. These panels were soiled artificially with uniformly and non-uniformly distributed dust. The proposed algorithm uses statistical tools to generate a simulation of a 100% soiled panel and then performs a comparison to obtain the percentage of dirt in the experimental data set. The simulation uses a seed obtained by taking a dust sample from the region with the maximum amount of dust in the dataset. The final result is the dirt percentage and the possible distribution of dust over the panel. Dust deposition is a key factor for plant owners to determine cleaning cycles or to identify non-uniform depositions that could lead to module failure and hot spots.
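The comparison step, estimating a dirt percentage between a clean reference and a simulated 100% soiled panel, can be sketched as below. This is a strong simplification for illustration: it works on scalar grayscale intensities rather than CIELAB channels, and all pixel values are invented.

```python
# Dirt percentage as linear interpolation of the test panel's mean intensity
# between the clean-panel mean (0% dirt) and the fully soiled mean (100% dirt).

def mean(values):
    return sum(values) / len(values)

def dirt_percentage(test_pixels, clean_pixels, soiled_pixels):
    """Position of the test mean between clean (0%) and fully soiled (100%)."""
    clean_m = mean(clean_pixels)
    soiled_m = mean(soiled_pixels)
    frac = (mean(test_pixels) - clean_m) / (soiled_m - clean_m)
    return 100.0 * min(max(frac, 0.0), 1.0)   # clamp to [0, 100]

clean = [20, 22, 21, 19]        # dark PV cells photograph as low intensity
soiled = [120, 118, 122, 120]   # a dust layer raises the measured intensity
test = [70, 71, 69, 70]         # panel under inspection
estimate = dirt_percentage(test, clean, soiled)
```

A per-pixel version of the same interpolation would also yield the spatial dust distribution mentioned in the abstract.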

Keywords: dust detection, photovoltaic, artificial vision, soiling

Procedia PDF Downloads 50
3990 AI Tutor: A Computer Science Domain Knowledge Graph-Based QA System on JADE platform

Authors: Yingqi Cui, Changran Huang, Raymond Lee

Abstract:

In this paper, we propose an AI Tutor that uses ontology and natural language processing techniques to generate a computer science domain knowledge graph and answer users’ questions based on that knowledge graph. We define eight types of relations to extract relationships between entities from computer science domain text. The AI Tutor is separated into two agents, a learning agent and a Question-Answer (QA) agent, and is developed on the JADE multi-agent platform. The learning agent is responsible for reading text to extract information and generate a corresponding knowledge graph using the defined patterns. The QA agent can understand users’ questions and answer them based on the knowledge graph generated by the learning agent.
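The two-agent split can be sketched as follows. The patterns, relation names and sentences below are illustrative assumptions; the actual system defines eight relation types and runs as JADE agents rather than plain functions.

```python
# "Learning agent": extract (subject, relation, object) triples by pattern
# matching. "QA agent": answer look-up questions against the triple set.
import re

PATTERNS = [
    (re.compile(r"(\w[\w ]*?) is a (\w[\w ]*)"), "is_a"),
    (re.compile(r"(\w[\w ]*?) is part of (\w[\w ]*)"), "part_of"),
]

def learn(sentences):
    """Learning agent: build the knowledge graph as a set of triples."""
    graph = set()
    for s in sentences:
        for pattern, relation in PATTERNS:
            m = pattern.search(s)
            if m:
                graph.add((m.group(1).strip(), relation, m.group(2).strip()))
    return graph

def answer(graph, subject, relation):
    """QA agent: all objects linked to `subject` by `relation`."""
    return sorted(obj for s, r, obj in graph if s == subject and r == relation)

kg = learn(["queue is a data structure", "ALU is part of CPU"])
```

In the real system, the QA agent would first parse a free-form question into such a (subject, relation) query.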

Keywords: artificial intelligence, natural language processing, knowledge graph, intelligent agents, QA system

Procedia PDF Downloads 187
3989 A Less Complexity Deep Learning Method for Drones Detection

Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar

Abstract:

Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems. The proposed paradigm is a hybrid of the AdderNet deep learning paradigm and the Single Shot Detector (SSD) paradigm. The goal was to minimize the number of multiplication operations in the filtering layers within the proposed system and, hence, reduce its complexity. Some standard machine learning techniques, such as SVM, are also tested and compared to the deep learning systems. The data sets used for training and testing were either complete or filtered in order to remove the images with small objects. The types of data were RGB or IR data. Comparisons were made between all these types, and conclusions are presented.
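The multiplication-free idea behind AdderNet filtering can be sketched in one dimension. This is a conceptual illustration, not the paper's network: a filter response is computed as the negative L1 distance between the filter and each input patch, using only additions and subtractions, instead of a multiply-accumulate convolution. The signal and kernel values are toy assumptions.

```python
# AdderNet-style 1-D filtering: response = -sum(|patch - kernel|), so the
# strongest (least negative) response marks the patch most similar to the
# kernel, with no multiplications in the inner loop.

def adder_filter_1d(signal, kernel):
    """Slide `kernel` over `signal` using only additions/subtractions."""
    k = len(kernel)
    responses = []
    for i in range(len(signal) - k + 1):
        patch = signal[i:i + k]
        responses.append(-sum(abs(p - w) for p, w in zip(patch, kernel)))
    return responses

signal = [0, 1, 2, 1, 0, 5]
kernel = [1, 2, 1]
resp = adder_filter_1d(signal, kernel)
best = resp.index(max(resp))   # index of the patch most similar to the kernel
```

In the hybrid described in the abstract, layers of this kind would replace the multiplicative convolutions inside an SSD-style detector.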

Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet

Procedia PDF Downloads 182
3988 Building 1-Well-Covered Graphs by Corona, Join, and Rooted Product of Graphs

Authors: Vadim E. Levit, Eugen Mandrescu

Abstract:

A graph is well-covered if all its maximal independent sets are of the same size. A well-covered graph is 1-well-covered if deletion of every vertex of the graph leaves it well-covered. It is known that a graph without isolated vertices is 1-well-covered if and only if every two disjoint independent sets are included in two disjoint maximum independent sets. Well-covered graphs are related to combinatorial commutative algebra (e.g., every Cohen-Macaulay graph is well-covered, while each Gorenstein graph without isolated vertices is 1-well-covered). Our intent is to construct several infinite families of 1-well-covered graphs using the following known graph operations: corona, join, and rooted product of graphs. Adopting some known techniques used to advantage for well-covered graphs, one can prove that: if the graph G has no isolated vertices, then the corona of G and H is 1-well-covered if and only if H is a complete graph of order at least two; the join of the graphs G and H is 1-well-covered if and only if G and H have the same independence number and both are 1-well-covered; if H satisfies the property that every three pairwise disjoint independent sets are included in three pairwise disjoint maximum independent sets, then the rooted product of G and H is 1-well-covered, for every graph G. These findings show not only how to generate some more families of 1-well-covered graphs, but also that, to this aim, sometimes, one may use graphs that are not necessarily 1-well-covered.
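As a small illustration of the two definitions above, the sketch below checks well-coveredness and 1-well-coveredness by brute-force enumeration of maximal independent sets. The brute force is only feasible for tiny graphs and is an illustrative assumption, not the constructive techniques of the paper; the 4-cycle C4 is used as a worked example (it is well-covered but not 1-well-covered, since deleting a vertex leaves a path with maximal independent sets of sizes 1 and 2).

```python
# Well-covered: all maximal independent sets have the same size.
# 1-well-covered: well-covered, and deleting any vertex keeps it well-covered.
from itertools import combinations

def maximal_independent_sets(vertices, edges):
    def independent(s):
        return not any((u, v) in edges or (v, u) in edges
                       for u, v in combinations(s, 2))
    sets = [set(s) for r in range(len(vertices) + 1)
            for s in combinations(sorted(vertices), r) if independent(s)]
    return [s for s in sets if not any(s < t for t in sets)]  # maximal only

def well_covered(vertices, edges):
    return len({len(s) for s in maximal_independent_sets(vertices, edges)}) == 1

def one_well_covered(vertices, edges):
    return well_covered(vertices, edges) and all(
        well_covered(vertices - {v}, {e for e in edges if v not in e})
        for v in vertices)

c4_vertices = {0, 1, 2, 3}
c4_edges = {(0, 1), (1, 2), (2, 3), (3, 0)}
```

Such a checker can sanity-test small instances produced by the corona, join, and rooted product constructions.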

Keywords: maximum independent set, corona, concatenation, join, well-covered graph

Procedia PDF Downloads 208
3987 A Generic Approach to Reuse Unified Modeling Language Components Following an Agile Process

Authors: Rim Bouhaouel, Naoufel Kraïem, Zuhoor Al Khanjari

Abstract:

The Unified Modeling Language (UML) is considered one of the most widespread modeling languages standardized by the Object Management Group (OMG). Therefore, the model-driven engineering (MDE) community attempts to reuse UML diagrams rather than construct them from scratch. A UML model appears according to a specific software development process. Existing model generation methods have focused on different transformation techniques without considering the development process. Our work aims to construct a UML component from fragments of UML diagrams based on an agile method. We define a UML fragment as a portion of a UML diagram that expresses a business target. To guide the generation of fragments of UML models using an agile process, we need a flexible approach that adapts to agile changes and covers all of its activities. We use the software product line (SPL) approach to derive a fragment of the agile process method. This paper explains our approach, named RECUP, to generate UML fragments following an agile process, and overviews its different aspects. In this paper, we present the approach and define its different phases and artifacts.

Keywords: UML, component, fragment, agile, SPL

Procedia PDF Downloads 397
3986 Automated Process Quality Monitoring and Diagnostics for Large-Scale Measurement Data

Authors: Hyun-Woo Cho

Abstract:

Continuous monitoring of industrial plants is one of the necessary tasks when it comes to ensuring high-quality final products. In terms of monitoring and diagnosis, it is quite critical and important to detect incipient abnormal events in manufacturing processes in order to improve the safety and reliability of the operations involved and to reduce related losses. In this work, a new multivariate statistical online diagnostic method is presented using a case study. For building reference models, an empirical discriminant model is constructed based on various past operation runs. When a fault is detected online, an online diagnostic module is initiated. Finally, the status of the current operating conditions is compared with the reference model to make a diagnostic decision. The performance of the presented framework is evaluated using a dataset from complex industrial processes. It has been shown that the proposed diagnostic method outperforms other techniques, especially in terms of incipient detection of faults.
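The reference-model-and-compare loop can be sketched as below. This is a simplified stand-in for the paper's empirical discriminant model: the reference is summarized by per-variable means and standard deviations of past normal runs, a new sample is scored by its squared standardized distance, and the variables, data, and the limit of 9.0 are all illustrative assumptions.

```python
# Build a reference model from past normal runs, then flag a new sample
# when its squared standardized deviation (a simplified T^2-like score)
# exceeds a control limit.

def build_reference(runs):
    """Per-variable mean and standard deviation over past normal operation."""
    n, dim = len(runs), len(runs[0])
    means = [sum(r[j] for r in runs) / n for j in range(dim)]
    stds = [(sum((r[j] - means[j]) ** 2 for r in runs) / n) ** 0.5
            for j in range(dim)]
    return means, stds

def fault_statistic(sample, means, stds):
    """Sum of squared standardized deviations from the reference model."""
    return sum(((x - m) / s) ** 2 for x, m, s in zip(sample, means, stds))

normal_runs = [[10.0, 5.0], [10.2, 5.1], [9.8, 4.9], [10.0, 5.0]]
means, stds = build_reference(normal_runs)
score_ok = fault_statistic([10.1, 5.0], means, stds)      # near normal
score_fault = fault_statistic([12.0, 6.5], means, stds)   # abnormal sample
is_fault = score_fault > 9.0                              # assumed limit
```

A real implementation would use the full covariance structure (or a discriminant projection) rather than independent per-variable scaling.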

Keywords: data mining, empirical model, on-line diagnostics, process fault, process monitoring

Procedia PDF Downloads 401
3985 Hotel Guests’ Service Fulfillment: Bangkok, Thailand

Authors: Numtana Ladplee, Cherif Haberih

Abstract:

The value of service evaluation depends critically on guests’ understanding of the evaluation objectives and their roles. The present research presents a three-phase investigation of the impact of evaluating participants’ theories about their roles: (a) identifying the theories, (b) testing the process consequences of participants’ role theories, and (c) gaining insights into the impact of participants’ role theories by testing key moderators. The findings of this study will hopefully indicate that (a) when forewarned of an upcoming evaluation task, consumers tend to believe that the evaluation objective is to identify aspects that need improvement, (b) this expectation produces a conscious attempt to identify negative aspects, although the encoding of attribute information is not affected, and (c) cognitive load during the evaluation experience greatly decreases the negativity of expected evaluations. The present study can be applied to other market research techniques and thereby improve our understanding of consumer inputs derived from market research. Such insights can help diminish biases produced by participants’ correct or incorrect theories regarding their roles.

Keywords: fulfillment, hotel guests, service, Thailand

Procedia PDF Downloads 276
3984 Groundwater Quality Assessment Using Water Quality Index and Geographical Information System Techniques: A Case Study of Busan City, South Korea

Authors: S. Venkatramanan, S. Y. Chung, S. Selvam, E. E. Hussam, G. Gnanachandrasamy

Abstract:

The quality of groundwater was evaluated based on major ion concentrations around Busan city, South Korea. The groundwater samples were collected from 40 wells. The order of abundance of major cations in the groundwater is Na > Ca > Mg > K, while for anions it is Cl > HCO₃ > SO₄ > NO₃ > F. Based on the Piper diagram, Ca(HCO₃)₂, CaCl₂, and NaCl are the leading groundwater types, while the Gibbs diagram suggests that most of the groundwater samples belong to the rock-weathering zone. The hydrogeochemical condition of groundwater in this city is influenced by evaporation, ion exchange and the dissolution of minerals. The Water Quality Index (WQI) revealed that 86% of the samples belong to the excellent category, 2% good, 4% poor to very poor and 8% unsuitable. The results of the sodium adsorption ratio (SAR), Permeability Index (PI), Residual Sodium Carbonate (RSC) and Magnesium Hazard (MH) show that most of the groundwater samples are suitable for domestic and irrigation purposes.
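A weighted-arithmetic WQI of the kind used for such classifications can be sketched as follows. The permissible limits, weights, and category thresholds below are generic illustrative assumptions, not the parameters of this study.

```python
# WQI = sum_i w_i * 100 * (C_i / S_i), where C_i is the measured
# concentration, S_i the permissible limit, and w_i the relative weight.

LIMITS = {"Na": 200.0, "Cl": 250.0, "SO4": 250.0, "NO3": 45.0}   # mg/L, assumed
WEIGHTS = {"Na": 0.2, "Cl": 0.3, "SO4": 0.2, "NO3": 0.3}          # sum to 1

def wqi(sample):
    """Weighted-arithmetic water quality index over the measured ions."""
    return sum(WEIGHTS[ion] * 100.0 * conc / LIMITS[ion]
               for ion, conc in sample.items())

def category(index):
    """Assumed classification bands (excellent ... unsuitable)."""
    if index < 50:
        return "excellent"
    if index < 100:
        return "good"
    if index < 200:
        return "poor"
    if index < 300:
        return "very poor"
    return "unsuitable"

sample = {"Na": 40.0, "Cl": 60.0, "SO4": 30.0, "NO3": 10.0}  # mg/L, toy well
index = wqi(sample)
```

Running this per well and tallying the categories yields percentage breakdowns of the kind reported in the abstract.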

Keywords: WQI (Water Quality Index), saturation index, groundwater types, ion exchange

Procedia PDF Downloads 263
3983 Subband Coding and Glottal Closure Instant (GCI) Using SEDREAMS Algorithm

Authors: Harisudha Kuresan, Dhanalakshmi Samiappan, T. Rama Rao

Abstract:

In modern telecommunication applications, locating Glottal Closure Instants (GCIs) is important, and they are evaluated directly from the speech waveform. Here, we study GCI detection using the Speech Event Detection using Residual Excitation and the Mean Based Signal (SEDREAMS) algorithm. Speech coding uses parameter estimation based on audio signal processing techniques to model the speech signal, combined with generic data compression algorithms to represent the resulting modeled parameters in a compact bit stream. This paper proposes a sub-band coder (SBC), which is a type of transform coding, and evaluates its performance for GCI detection using SEDREAMS. In an SBC, the speech signal is divided into two or more frequency bands, and each sub-band signal is coded individually. The sub-bands, after being processed, are recombined to form the output signal, whose bandwidth covers the whole frequency spectrum. The signal is then decomposed into low- and high-frequency components, and decimation and interpolation are performed in the frequency domain. The proposed structure significantly reduces error, and precise locations of Glottal Closure Instants (GCIs) are found using the SEDREAMS algorithm.

Keywords: SEDREAMS, GCI, SBC, GOI

Procedia PDF Downloads 356
3982 Application of Refractometric Methodology for Simultaneous Determination of Alcohol and Residual Sugar Concentrations during Alcoholic Fermentation Bioprocess of Date Juice

Authors: Boukhiar Aissa, Halladj Fatima, Iguergaziz Nadia, Lamrani yasmina, Benamara Salem

Abstract:

Determining the alcohol content during an alcoholic fermentation bioprocess is of great importance; in fact, it is a key indicator for monitoring the bioprocess. Several methodologies (chemical, spectrophotometric, chromatographic) are used for the determination of this parameter. However, these techniques are time-consuming and require rigorous preparation, sometimes dangerous chemical reagents and/or expensive equipment. In the present study, date juice is used as the substrate of alcoholic fermentation. The extracted juice undergoes an alcoholic fermentation by Saccharomyces cerevisiae. The study of the possible use of refractometry as a sole means for the in situ control of alcoholic fermentation revealed a good correlation (R² = 0.98) between initial and final °Brix: °Brixf = 0.377 × °Brixi. In addition, the relationship between Δ°Brix and the alcohol content of the final product (A, %) has been determined: Δ°Brix/A = 1.1. The obtained results allowed us to establish iso-response abacuses, which can be used for the determination of alcohol and residual sugar content with a mean relative error (MRE) of 5.35%.
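The two empirical relations reported in the abstract can be applied directly, as sketched below; the input value of 24 °Brix is an illustrative assumption.

```python
# Reported relations: Brix_f = 0.377 * Brix_i (R^2 = 0.98), and
# delta Brix / A = 1.1, hence A = (Brix_i - Brix_f) / 1.1.

def final_brix(initial_brix):
    """Predicted final Brix from the reported correlation."""
    return 0.377 * initial_brix

def alcohol_percent(initial_brix, measured_final_brix):
    """Alcohol content (%) from the Brix drop during fermentation."""
    return (initial_brix - measured_final_brix) / 1.1

brix_i = 24.0                         # assumed initial reading, degrees Brix
brix_f = final_brix(brix_i)           # predicted final Brix
alcohol = alcohol_percent(brix_i, brix_f)
```

In practice one would feed the measured final refractometer reading, not the predicted one, into the second relation.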

Keywords: alcoholic fermentation, date juice, refractometry, residual sugar

Procedia PDF Downloads 341
3981 Deflection Effect on Mirror for Space Applications

Authors: Maamar Fatouma

Abstract:

Mirror optical performance can be affected by varying levels of stress and tolerances, which can have a notable impact on optical parametric systems. To ensure a proper optical figure and a mirror mounting position within design tolerances, it is crucial to have a robust support structure in place for optical systems. The optical figure tolerance determines the allowable deviation from the ideal form of the mirror, and the position tolerance determines the location and orientation of the optical axis of the optical system. A variety of factors influence the optical figure of the mirror, including self-weight (deflection), excitation from temperature change, temperature gradients and dimensional instability. This study employs an analytical approach and the finite element method to examine the effects of stress resulting from mirror mounting on the wavefront passing through the mirror. The combined effect of tolerance and deflection on mirror performance is represented by an error budget. A numerical mirror mounting example is presented to illustrate the performance techniques for space applications.

Keywords: opto-mechanical, bonded optic, tolerance, self-weight distortion, Rayleigh criteria

Procedia PDF Downloads 89
3980 Artificial Neural Networks Face to Sudden Load Change for Shunt Active Power Filter

Authors: Dehini Rachid, Ferdi Brahim

Abstract:

The shunt active power filter (SAPF) is intended not only to improve the power factor but also to compensate for the unwanted harmonic currents produced by nonlinear loads. This paper presents a SAPF with an identification and control method based on an artificial neural network (ANN). To identify harmonics, many techniques are used, among them the conventional p-q theory and the relatively recent artificial neural network method. It is difficult to obtain satisfactory identification and control characteristics using a standard ANN due to the nonlinearity of the system (SAPF + fast nonlinear load variations). This work is an attempt to undertake a systematic study of the problem and to equip the SAPF with a harmonics identification and DC link voltage control method based on an ANN. The latter has been applied to the SAPF under fast nonlinear load variations. The results of computer simulations and experiments are given, which confirm the feasibility of the proposed active power filter.

Keywords: artificial neural networks (ANN), p-q theory, harmonics, total harmonic distortion

Procedia PDF Downloads 386
3979 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share the spectra of the pixels in different amounts called “abundances”. Unmixing of the data cube is an important task for identifying the endmembers present in the cube for the analysis of these images. Unsupervised unmixing is performed with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem known as “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
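The iterative-thresholding (ISTA) solver for the l1-regularized problem min (1/2)||Ax − y||² + λ||x||₁ can be sketched on a tiny dense system. The 2×3 dictionary, λ, step size, and iteration count below are illustrative assumptions standing in for a real hyperspectral dictionary.

```python
# ISTA: gradient step on the quadratic data term, followed by the
# soft-thresholding proximal step of the l1 penalty.

def soft_threshold(x, t):
    """Prox of t*||.||_1: shrink each entry toward zero by t."""
    return [max(abs(v) - t, 0.0) * (1 if v > 0 else -1) for v in x]

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def ista(A, y, lam=0.1, step=0.25, iters=300):
    n = len(A[0])
    x = [0.0] * n
    At = [[A[i][j] for i in range(len(A))] for j in range(n)]  # transpose
    for _ in range(iters):
        residual = [r - yi for r, yi in zip(matvec(A, x), y)]
        grad = matvec(At, residual)                   # A^T (A x - y)
        x = soft_threshold([xi - step * g for xi, g in zip(x, grad)],
                           step * lam)
    return x

# y is generated exactly by the sparse code (1, 0, 0) over the dictionary A;
# with lam = 0.1 the l1 penalty shrinks the recovered coefficient to 0.9.
A = [[1.0, 0.5, 0.3],
     [0.0, 1.0, 0.7]]
y = [1.0, 0.0]
x_hat = ista(A, y)
```

In the unmixing setting, each pixel spectrum plays the role of y and the recovered sparse x gives the abundances of the few active endmembers.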

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 195
3978 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary in order to operate industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work tries to utilize process measurement data obtained on an on-line basis for the safe and fault-free operation of industrial processes. To this end, this work evaluated the proposed intelligent process data monitoring framework based on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work shows the results of using linear and nonlinear techniques for the monitoring purpose. It is shown that the nonlinear technique produced more reliable monitoring results and outperforms linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 640
3977 Dynamic Background Updating for Lightweight Moving Object Detection

Authors: Kelemewerk Destalem, Joongjae Cho, Jaeseong Lee, Ju H. Park, Joonhyuk Yoo

Abstract:

Background subtraction and temporal difference are often used for moving object detection in video. Both approaches are computationally simple and easy to deploy in real-time image processing. However, while background subtraction is highly sensitive to dynamic backgrounds and illumination changes, the temporal difference approach is poor at extracting the relevant pixels of a moving object and at detecting stopped or slowly moving objects in the scene. In this paper, we propose a moving object detection scheme based on adaptive background subtraction and temporal difference exploiting dynamic background updates. The proposed technique consists of histogram equalization and a linear combination of background and temporal difference, followed by novel frame-based and pixel-based background updating techniques. Finally, morphological operations are applied to the output images. Experimental results show that the proposed algorithm can overcome the drawbacks of both the background subtraction and temporal difference methods and can provide better performance than either method alone.
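The core update-and-subtract loop can be sketched as below. This is a simplified illustration of the general idea, not the paper's pipeline (no histogram equalization or morphology): 1-D "frames", the learning rate, and the thresholds are toy assumptions, and the background is refreshed pixel by pixel only where no motion is detected.

```python
# Combine background difference and temporal difference per pixel, and
# update the background as a running average where the pixel is static.

def detect(frames, alpha=0.5, bg_thresh=20, td_thresh=20):
    background = list(frames[0])
    prev = list(frames[0])
    masks = []
    for frame in frames[1:]:
        mask = []
        for i, pixel in enumerate(frame):
            bg_diff = abs(pixel - background[i]) > bg_thresh
            td_diff = abs(pixel - prev[i]) > td_thresh
            moving = bg_diff and td_diff
            mask.append(moving)
            if not moving:   # pixel-based update only where no motion
                background[i] = (1 - alpha) * background[i] + alpha * pixel
        prev = list(frame)
        masks.append(mask)
    return masks, background

frames = [
    [10, 10, 10, 10],
    [10, 10, 200, 10],   # object enters at pixel 2
    [10, 10, 10, 10],    # object leaves
]
masks, bg = detect(frames)
```

Because the moving pixel is excluded from the update, the background is not corrupted by the transient object.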

Keywords: background subtraction, background updating, real time, light weight algorithm, temporal difference

Procedia PDF Downloads 342
3976 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the development of the industrialized world in Indonesia has taken place very rapidly. This has accelerated the industrialization of Indonesian society, with the establishment of diverse companies and workplaces. The development of industry relates to the activities of workers, and these work activities carry the possibility of an accident affecting either the workers or a construction project. Causes of industrial accidents include electrical faults, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and is the form most commonly used in finding patterns in data collections. This research aims to discover the association relationships among occurrences of industrial accidents. Using association rule analysis, patterns were obtained from two-item sets (large 2-itemsets) pairing each industrial accident factor with a location: for West Jakarta, industrial accidents caused by electrical damage occurred with a support value of 0.2 and a confidence value of 1, and the reverse pattern with a support value of 0.2 and a confidence value of 0.75.
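The support and confidence measures behind the reported rules can be sketched as follows. The toy accident records are illustrative assumptions constructed to reproduce the reported numbers (support 0.2, forward confidence 1, reverse confidence 0.75); they are not the study's data.

```python
# support(X) = fraction of records containing X;
# confidence(X -> Y) = support(X and Y) / support(X).

def support(transactions, itemset):
    itemset = set(itemset)
    return sum(itemset <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))

# 15 hypothetical accident records: 3 in West Jakarta (all electrical),
# 1 electrical elsewhere, 11 other accidents.
records = (
    [{"West Jakarta", "electrical damage"}] * 3
    + [{"East Jakarta", "electrical damage"}]
    + [{"East Jakarta", "work procedure"}] * 11
)

sup = support(records, {"West Jakarta", "electrical damage"})
conf_fwd = confidence(records, {"West Jakarta"}, {"electrical damage"})
conf_rev = confidence(records, {"electrical damage"}, {"West Jakarta"})
```

Apriori itself would first prune candidate itemsets by a minimum support before computing rule confidences like these.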

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 299
3975 The Interfaith Dialogue by William Milne by the First Chinese Study Bible

Authors: Liu Yuan-Jian, Chou Fu-Chu

Abstract:

The study Bible was published in 1825, after Milne’s death, containing large amounts of paraphrasing, exhortations, notes, and commentaries to facilitate readers' scripture engagement. The methodologies employed include text analysis and discourse analysis. This study shows that, to enable Chinese readers, uninitiated in the Gospel and deeply influenced by Confucian ethics and paganism, to understand the Bible and apply it to their daily living, Milne not only paraphrased the verses but also used metaphors and rhetorical techniques for explaining the background information of the Bible, teaching biblical doctrine, combating paganism, and exhorting readers to believe in the Gospel. Moreover, Milne also tried to clarify the scripture in the context of Chinese culture, giving the readers a clear way to put the scripture into practice in their daily living. His exposition successfully made a breakthrough from the British and Foreign Bible Society's “Without Note or Comment” principle and provided a useful instrument for promoting interfaith dialogue.

Keywords: interfaith dialogue, William Milne, Chinese study Bible, exposition, “Without Note or Comment” principle

Procedia PDF Downloads 83
3974 Selenium Content in Agricultural Soils and Wheat from the Balkan Peninsula

Authors: S. Krustev, V. Angelova, P. Zaprjanova

Abstract:

Selenium (Se) is an essential micronutrient for humans and animals, but it is highly toxic at elevated levels. Its organic compounds play an important role in the biochemistry and nutrition of cells. Concentration levels of this element vary considerably between different regions of the world. This study aimed to compare the availability and levels of Se in some rural areas of the Balkan Peninsula and its relationship with the concentrations of other trace elements. For this purpose, soil samples and wheat grains from regions of Bulgaria, Serbia, North Macedonia, Romania, and Greece situated far from large industrial centers have been analyzed. The main methods for their determination were the atomic spectral techniques of atomic absorption and plasma atomic emission spectrometry. As a result of this study, data on microelement levels from the main grain-producing regions of the Balkan Peninsula were determined and systematized. The presented results confirm the low levels of Se in this region (0.222–0.962 mg.kg⁻¹ in soils and 0.001–0.005 mg.kg⁻¹ in wheat grains) and call for measures to offset the effects of this deficiency.

Keywords: agricultural soils, Balkan Peninsula, rural areas, selenium

Procedia PDF Downloads 132
3973 Novel Self-Healing Eco-Friendly Coatings with Antifouling and Anticorrosion Properties for Maritime Applications

Authors: K. N. Kipreou, E. Efthmiadou, G. Kordas

Abstract:

Biofouling represents one of the most crucial problems in the present maritime industries, and its control still challenges researchers all over the world. The present work concerns the synthesis and characterization of CeMo and Cu₂O nanocontainers for marine applications using a wide range of techniques, including scanning electron microscopy (SEM), X-ray diffraction (XRD) and thermogravimetric analysis (TGA). The above nanosystems will be loaded with active monomers and corrosion inhibitors, rendering a healing ability to marine paints. The objective of this project is their ability to provide self-healing, self-polishing and, finally, anti-corrosion activity. One of the driving forces for the exploration of CeMo is its unique anticorrosive behavior, which will be confirmed by electrochemical methodology. It should be highlighted that the Cu₂O nanocontainers, with the appropriate antibacterial inhibitor, will improve the hydrophobicity and the morphology of the coating surfaces, reducing water friction. In summary, both novel nanocontainers will increase the lifetime of the paints, releasing the antifouling agent in a controlled manner.

Keywords: marine paints, nanocontainer, antifouling, anticorrosion, copper, electrochemistry, coating, biofouling, inhibitors, copper oxide, SEM

Procedia PDF Downloads 338
3972 Reactive and Concurrency-Based Image Resource Management Module for iOS Applications

Authors: Shubham V. Kamdi

Abstract:

This paper aims to serve as an introduction to image resource caching techniques for iOS mobile applications. It explains how developers can break down multiple image-downloading tasks concurrently using state-of-the-art iOS frameworks, namely Swift Concurrency and Combine. The paper explains how developers can leverage SwiftUI to develop reactive view components and use declarative coding patterns. Developers will learn to bypass built-in image caching systems by following the procedure to implement a Swift-based LRU cache system. The paper provides a full architectural overview of the system, helping readers understand how mobile applications are designed professionally. It covers technical discussion, helping readers understand the low-level details of threads and how to switch between them, as well as the significance of the main and background threads when requesting HTTP services in mobile applications.
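The LRU eviction policy behind such an image cache can be sketched as below. This is written in Python with `OrderedDict` purely for illustration; the paper's cache is implemented in Swift, and the capacity of 2, URLs, and placeholder "bytes" strings are toy assumptions.

```python
# LRU cache: get/put move the touched key to the most-recently-used end;
# when capacity is exceeded, the least-recently-used entry is evicted.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

cache = LRUCache(capacity=2)
cache.put("url/a.png", "bytes-of-a")
cache.put("url/b.png", "bytes-of-b")
cache.get("url/a.png")                       # a becomes most recently used
cache.put("url/c.png", "bytes-of-c")         # evicts b, the LRU entry
```

In the iOS setting, gets and puts would be serialized off the main thread so that image decoding and caching never block UI rendering.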

Keywords: main thread, background thread, reactive view components, declarative coding

Procedia PDF Downloads 25
3971 'Low Electronic Noise' Detector Technology in Computed Tomography

Authors: A. Ikhlef

Abstract:

Image noise in computed tomography is mainly caused by statistical noise, system noise and reconstruction algorithm filters. Over the last few years, low-dose X-ray imaging has become more and more desirable and is viewed as a technically differentiating capability among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. Also, in this study, we show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols with radiation as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.

Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector

Procedia PDF Downloads 126
3970 Challenges in Employment and Adjustment of Academic Expatriates Based in Higher Education Institutions in the KwaZulu-Natal Province, South Africa

Authors: Thulile Ndou

Abstract:

The purpose of this study was to examine the challenges encountered in attracting and recruiting academic expatriates, who in turn face their own obstacles in adjusting to and settling in their host country, host academic institutions and host communities. The absence of literature on the attraction, placement and management of academic expatriates in the South African context has been acknowledged. Moreover, Higher Education Institutions in South Africa have voiced concerns about delayed and prolonged recruitment and selection processes in the employment of academic expatriates. Once employed, academic expatriates should be supported and acquainted with their surroundings and the local communities, and assisted in establishing working relations with colleagues, in order to facilitate their adjustment and integration. Hence, an employer should play a critical role in facilitating the adjustment of academic expatriates. This mixed methods study was located in four Higher Education Institutions in the KwaZulu-Natal province, South Africa, and deployed an explanatory sequential design. The chief merit of this approach is that it employs both quantitative and qualitative techniques of inquiry, examining the subject from multiple vantage points and combining the strengths of both techniques for a more durable articulation and understanding of the subject. A 5-point Likert scale questionnaire was used to collect quantitative data on interaction adjustment, general adjustment and work adjustment from academic expatriates. One hundred and forty-two (142) academic expatriates participated in the quantitative study.
Qualitative data on the employment process and the support offered to academic expatriates were collected through a structured questionnaire and semi-structured interviews. A total of 48 respondents, including line managers, human resources practitioners and academic expatriates, participated in the qualitative study. Independent t-tests, ANOVA and descriptive statistics were used to analyse and interpret the quantitative data, and thematic analysis was used for the qualitative data. The qualitative results revealed that academic talent is sourced from outside the country's borders because of an academic skills shortage in almost all disciplines, especially those associated with Science, Engineering and Accounting. However, delays in the work permit application process made it difficult to finalise the recruitment and selection process on time. Furthermore, the quantitative results revealed that academic expatriates experience general and interaction adjustment challenges associated with the use of the local language and understanding of the local culture. Female academic expatriates, however, were found to be better adjusted in these two areas than their male counterparts. Moreover, significant mean differences were found between institutions, suggesting that academic expatriates based in rural areas experienced adjustment challenges differently from those based in urban areas. The study pointed to the need for policy revisions in the areas of immigration, human resources and academic administration.
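The independent-samples t-test applied to the Likert-scale adjustment scores can be sketched as follows; the scores and group labels below are hypothetical, purely to illustrate the computation (the study itself used SPSS):

```python
import math
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Student's independent-samples t statistic with pooled variance."""
    n1, n2 = len(sample_a), len(sample_b)
    s1, s2 = variance(sample_a), variance(sample_b)   # sample variances (n-1)
    pooled = ((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(pooled * (1/n1 + 1/n2))
    return t, n1 + n2 - 2   # t statistic and degrees of freedom

# Hypothetical 5-point Likert adjustment scores for two groups
female = [4, 5, 4, 4, 5, 3, 4]
male   = [3, 3, 4, 2, 3, 4, 3]
t, df = independent_t(female, male)   # positive t: female mean is higher
```

The resulting t is then compared against the t-distribution with `df` degrees of freedom to obtain the p-value.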

Keywords: academic expatriates, recruitment and selection, interaction and general adjustment, work adjustment

Procedia PDF Downloads 306
3969 Designing and Evaluating Pedagogic Conversational Agents to Teach Children

Authors: Silvia Tamayo-Moreno, Diana Pérez-Marín

Abstract:

In this paper, the possibility of children studying with an interactive learning technology called a Pedagogic Conversational Agent is presented. The main benefit is that the agent is able to adapt the dialogue to each student and to provide automatic feedback. Moreover, according to Math teachers, in many cases students are unable to solve problems even when they know the procedure to solve them, because they do not understand what they have to do. The hypothesis is that if students are helped to understand what they have to solve, they will be able to do it. Taking that into account, we have started the development of Dr. Roland, an agent to help students understand Math problems, following a User-Centered Design methodology. The use of this methodology is proposed, for the first time, to design pedagogic agents to teach any subject from Secondary down to Pre-Primary education. The reason for proposing a methodology is that, while working on this project, we noticed the lack of literature on designing and evaluating such agents. To cover this gap, we describe how User-Centered Design can be applied and which usability techniques can be used to evaluate the agent.

Keywords: pedagogic conversational agent, human-computer interaction, user-centered design, natural language interface

Procedia PDF Downloads 323
3968 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors

Authors: Yaxin Bi

Abstract:

Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored here to time-series data, to produce synthetic time series from Swarm satellite data, which will be used for detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and the challenges of applying deep learning to synthetic data generation, underscoring its potential for generating synthetic electromagnetic satellite data.
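As a from-scratch sketch of the LSTM gating math underlying the generative model (scalar weights and states with toy values; a real model would have vector states and be trained on the Swarm series, emitting its one-step-ahead prediction as the next synthetic sample):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step for scalar input/state (weights in dict w).

    Standard gate formulation:
      f = sigma(Wf*x + Uf*h + bf)              forget gate
      i = sigma(Wi*x + Ui*h + bi)              input gate
      o = sigma(Wo*x + Uo*h + bo)              output gate
      c = f*c_prev + i*tanh(Wc*x + Uc*h + bc)  cell state
      h = o*tanh(c)                            hidden state / output
    """
    f = sigmoid(w["Wf"] * x + w["Uf"] * h_prev + w["bf"])
    i = sigmoid(w["Wi"] * x + w["Ui"] * h_prev + w["bi"])
    o = sigmoid(w["Wo"] * x + w["Uo"] * h_prev + w["bo"])
    c = f * c_prev + i * math.tanh(w["Wc"] * x + w["Uc"] * h_prev + w["bc"])
    h = o * math.tanh(c)
    return h, c

# Roll the cell over a toy series with arbitrary fixed weights
w = {k: 0.5 for k in ("Wf", "Uf", "bf", "Wi", "Ui", "bi",
                      "Wo", "Uo", "bo", "Wc", "Uc", "bc")}
h, c = 0.0, 0.0
series = [0.1, 0.4, 0.35, 0.8]
outputs = []
for x in series:
    h, c = lstm_step(x, h, c, w)
    outputs.append(h)
```

The gates let the cell retain or discard long-range information, which is what makes LSTMs suited to the temporal dependencies in satellite time series.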

Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors

Procedia PDF Downloads 32
3967 A Hyperflexion Hallux Mallet Injury: A Case Report

Authors: Tan G. K. Y., Chew M. S. J., Sajeev S., Vellasamy A.

Abstract:

Injuries of the extensor hallucis longus (EHL) tendon are a rare phenomenon, with most occurring due to lacerations or penetrating injuries. Closed traumatic ruptures of the EHL are described as “mallet injuries of the toe”. These can be classified as bony or soft mallet injuries depending on the presence or absence of a fracture at the insertion site of the EHL tendon in the distal phalanx. We present the case of a 33-year-old woman with a hyperflexion injury to the left big toe and an inability to extend it. Ultrasound showed a complete rupture of the EHL tendon with retraction proximal to the hallucal interphalangeal joint. The patient was treated with transarticular pinning and repair using the Arthrex Mini Bio-Suture Tak with 2-0 FiberWire. Six months postoperatively, the patient had symmetrical EHL power and a full range of motion of the toe. The lesson to be drawn from this case report is that isolated hallux mallet injuries are rare and can be easily missed in the absence of penetrating wounds. Patients with such injuries should be investigated early with appropriate imaging, such as ultrasound or MRI, and treated surgically.

Keywords: hallux mallet, extensor hallucis longus tendon, extensor hallucis longus

Procedia PDF Downloads 79
3966 Microwave Assisted Extractive Desulfurization of Gas Oil Feedstock

Authors: Hamida Y. Mostafa, Ghada E. Khedr, Dina M. Abd El-Aty

Abstract:

The removal of sulfur compounds from petroleum fractions is a critical requirement of environmental protection. Solvent extraction, oxidative desulfurization, and hydrotreatment have traditionally been used as removal processes. While all of these methods can eliminate sulfur compounds at moderate rates, they have limitations, chief among them high running costs caused by prolonged operation times and high energy consumption. Therefore, new methods for removing sulfur are still necessary. In the current study, a simple assisted desulfurization system for a gas oil fraction has been successfully developed using acetonitrile and methanol as solvents under microwave irradiation. The key variables affecting sulfur removal were studied, including microwave power, irradiation time, and solvent-to-gas-oil volume ratio. The results show that the microwave-assisted extractive desulfurization method removed sulfur with high efficiency under suitable conditions.

Keywords: extractive desulfurization, microwave assisted extraction, petroleum fractions, acetonitrile and methanol

Procedia PDF Downloads 103
3965 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping

Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa

Abstract:

The artificial neural network is one of the techniques that has been used to advantage in modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which increases the number of neuron units in the last layer. To show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the number of neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation, which avoids the need for computers with high memory usage.
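As a generic sketch of the underlying task, the toy network below learns a one-dimensional input-output mapping (here f(x) = x^2, chosen arbitrarily) by gradient descent; it illustrates function modeling with a small network, not the authors' CANN coding scheme:

```python
import math, random

random.seed(0)

# Tiny 1-3-1 network trained to fit f(x) = x^2 on [-1, 1]
H = 3
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * hidden[j] for j in range(H)) + b2, hidden

data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.05
initial = mse()
for _ in range(2000):                      # per-sample gradient descent
    for x, y in data:
        out, hidden = forward(x)
        err = out - y
        for j in range(H):
            grad_h = err * w2[j] * (1 - hidden[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * hidden[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
        b2 -= lr * err
final = mse()                              # loss after training
```

The training loss drops by orders of magnitude, showing the input-output set has been fitted; the paper's contribution concerns how that set is encoded at the output layer, which this sketch does not attempt to reproduce.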

Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories

Procedia PDF Downloads 283