Search results for: real GDP
1972 Enhanced Flight Dynamics Model to Simulate the Aircraft Response to Gust Encounters
Authors: Castells Pau, Poetsch Christophe
Abstract:
The effect of gust and turbulence encounters on aircraft is a wide field of study which allows different approaches, from high-fidelity multidisciplinary simulations to more simplified models adapted to industrial applications. The typical main goal is to predict the gust loads on the aircraft in order to ensure a safe design and achieve certification. Another widely studied topic is gust loads reduction through an active control law. The impact of gusts on aircraft handling qualities is also of interest in the analysis of in-service events, so as to evaluate the aircraft response and the performance of the flight control laws. Traditionally, gust loads and handling qualities are addressed separately with different models adapted to the specific needs of each discipline. In this paper, an assessment of the differences between both models is presented, and a strategy to better account for the physics of gust encounters in a typical flight dynamics model is proposed based on the model used for gust loads analysis. The applied corrections aim to capture the gust unsteady aerodynamics and propagation as well as the effect of dynamic flexibility at low frequencies. Results from the gust loads model at different flight conditions and measurements from real events are used for validation. An assessment of a possible extension of steady aerodynamic nonlinearities to the low-frequency range is also addressed. The proposed corrections provide a meaningful means to evaluate the performance and possible adjustments of the flight control laws.
Keywords: flight dynamics, gust loads, handling qualities, unsteady aerodynamics
Procedia PDF Downloads 147
1971 Enhanced Planar Pattern Tracking for an Outdoor Augmented Reality System
Authors: L. Yu, W. K. Li, S. K. Ong, A. Y. C. Nee
Abstract:
In this paper, a scalable augmented reality framework for handheld devices is presented. The presented framework is enabled by a server-client data communication structure, in which the search for tracking targets among a database of images is performed on the server side, while pixel-wise 3D tracking is performed on the client side, which, in this case, is a handheld mobile device. Image search on the server side adopts a residual-enhanced image descriptor representation that gives the framework its scalability property. The tracking algorithm on the client side is based on a gravity-aligned feature descriptor, which takes advantage of a sensor-equipped mobile device, and an optimized intensity-based image alignment approach that ensures the accuracy of 3D tracking. Automatic content streaming is achieved by using a key-frame selection algorithm, client working-phase monitoring, and standardized rules for content communication between the server and client. The recognition accuracy test performed on a standard dataset shows that the method adopted in the presented framework outperforms the Bag-of-Words (BoW) method that has been used in some of the previous systems. Experimental tests conducted on a set of video sequences indicated real-time performance of the tracking system, with a frame rate of 15-30 frames per second. The presented framework is shown to be functional in practical situations with a demonstration application on a campus walk-around.
Keywords: augmented reality framework, server-client model, vision-based tracking, image search
Procedia PDF Downloads 275
1970 Reflections of Nocturnal Librarian: Attaining a Work-Life Balance in a Mega-City of Lagos State Nigeria
Authors: Oluwole Durodolu
Abstract:
The rationale for this study is to explore the adaptive strategies that librarians adopt in performing night shifts in a mega-city like Lagos State. Maslach Burnout Theory would be used to measure the three dimensions of burnout (emotional exhaustion, depersonalisation, and personal accomplishment) to scrutinise job-related burnout syndrome allied with longstanding, unresolved stress. A qualitative methodology guided by a phenomenological research paradigm, an approach that focuses on the commonality of real-life experience in a particular group, would be used, with focus group discussion adopted as the method of data collection from library staff who are involved in night shifts. Participants for the focus group discussion would be selected using a convenience sampling technique, in which staff at the cataloguing unit would be included in the sample because of the representative characteristics of the unit. This would be done to enable readers to understand the phenomena as they are experienced rather than from a remote perspective. The exploratory interviews, conducted using the focus group method, will shed light on issues relating to security, housing, transportation, budgeting, energy supply, employee duties, time management, information access, and sustaining professional levels of service, and on how all these variables affect the productivity and work-life balance of all 149 library staff.
Keywords: nightshift, work-life balance, mega-city, academic library, Maslach Burnout Theory, Lagos State, University of Lagos
Procedia PDF Downloads 132
1969 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, it is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference
Procedia PDF Downloads 108
1968 Hand Gesture Interface for PC Control and SMS Notification Using MEMS Sensors
Authors: Keerthana E., Lohithya S., Harshavardhini K. S., Saranya G., Suganthi S.
Abstract:
In an epoch of expanding human-machine interaction, the development of innovative interfaces that bridge the gap between physical gestures and digital control has gained significant momentum. This study introduces a distinct solution that leverages a combination of MEMS (Micro-Electro-Mechanical Systems) sensors, an Arduino Mega microcontroller, and a PC to create a hand gesture interface for PC control and SMS notification. The core of the system is an ADXL335 MEMS accelerometer sensor integrated with an Arduino Mega, which communicates with a PC via a USB cable. The ADXL335 provides real-time acceleration data, which is processed by the Arduino to detect specific hand gestures. These gestures, such as left, right, up, down, or custom patterns, are interpreted by the Arduino, and corresponding actions are triggered. In the context of SMS notifications, when a gesture indicative of a new SMS is recognized, the Arduino relays this information to the PC through the serial connection. The PC application, designed to monitor the Arduino's serial port, displays these SMS notifications in the serial monitor. This study offers an engaging and interactive means of interfacing with a PC by translating hand gestures into meaningful actions, opening up opportunities for intuitive computer control. Furthermore, the integration of SMS notifications adds a practical dimension to the system, notifying users of incoming messages as they interact with their computers. The use of MEMS sensors, Arduino, and serial communication serves as a promising foundation for expanding the capabilities of gesture-based control systems.
Keywords: hand gestures, multiple cables, serial communication, sms notification
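To make the serial workflow above concrete, here is a minimal sketch of a PC-side monitor using pyserial; the port name, baud rate, and the "GESTURE:"/"SMS:" message format are assumptions for illustration, not details taken from the abstract.

```python
# Hypothetical PC-side monitor: port, baud rate, and message format are assumed,
# not taken from the study; the Arduino side would print matching lines.
import serial  # pyserial

PORT = "COM3"   # assumed; e.g. "/dev/ttyACM0" on Linux
BAUD = 9600     # assumed to match the Arduino sketch

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        if line.startswith("SMS:"):
            print("New SMS notification:", line[4:])       # show incoming message
        elif line.startswith("GESTURE:"):
            print("Gesture detected:", line[8:])            # trigger PC action here
```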
Procedia PDF Downloads 69
1967 The Effect of Power of Isolation Transformer on the Lamps in Airfield Ground Lighting Systems
Authors: Hossein Edrisi
Abstract:
This study examines the impact of the power rating of the isolation transformer on the lamps in airfield ground lighting systems. A test was conducted at Persian Gulf International Airport; this airport is situated in the south of Iran and is one of the most cutting-edge airports, equipped with modern devices. Iran uses materials and auxiliary equipment made by the ADB Company from Belgium. Airfield ground lighting (AGL) systems are responsible for providing visual guidance to aircraft and helicopters on the runways. In an AGL system, a great number of lamps are connected in series circuits, and each ring has its own constant current regulator (CCR), through which energy is provided to the lamps. Control of the lamps is crucial for maintenance and operation in AGL systems. A Programmable Logic Controller (PLC), a cutting-edge technology, helps the system connect the elements from the substations and ATC (tower). For this purpose, a test under the real conditions of the airport was done for all elements used in the airport, such as isolation transformers of different power capacities and lamps of different power consumption and brightness. The data were analyzed with a lux meter and a multimeter. The results showed that an increase in the power of the transformer caused a significant increase in brightness. According to Ohm's law and voltage division, without changing the characteristics of the light bulb it is not possible to change the voltage; instead, the rating of the transformer that connects to the lamps must be changed. When the voltage is increased, the current through the bulb has to increase as well, because of Ohm's law, I = V/R: if V increases, so does I. The output voltage of the constant current regulator appears across the lamps and the transformers.
Keywords: AGL, CCR, lamps, transformer, Ohm's law
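As a worked illustration of the Ohm's law and voltage-division reasoning above, a minimal sketch follows; the lamp resistance and the transformer secondary voltages are assumed values, not measurements from the airport test.

```python
# Ohm's law / voltage-division illustration for a series-lamp ring.
# All numbers are assumed for illustration, not measurements from the study.
lamp_resistance = 30.0                    # ohms (assumed)
secondary_voltages = [6.6, 10.0, 15.0]    # V, assumed transformer secondary ratings

for v in secondary_voltages:
    i = v / lamp_resistance               # Ohm's law: I = V / R
    p = v * i                             # power delivered to the lamp: P = V * I
    print(f"V = {v:5.1f} V -> I = {i:5.2f} A, P = {p:6.1f} W")
```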
Procedia PDF Downloads 248
1966 Innovative In-Service Training Approach to Strengthen Health Care Human Resources and Scale-Up Detection of Mycobacterium tuberculosis
Authors: Tsegahun Manyazewal, Francesco Marinucci, Getachew Belay, Abraham Tesfaye, Gonfa Ayana, Amaha Kebede, Yewondwossen Tadesse, Susan Lehman, Zelalem Temesgen
Abstract:
In-service health trainings in Sub-Saharan Africa are mostly content-centered, with a large disconnection from the real practice in the facility. This study intended to evaluate an in-service training approach aimed at strengthening health care human resources. A combined web-based and face-to-face training was designed and piloted in Ethiopia with the diagnosis of tuberculosis. During the first part, which lasted 43 days, trainees accessed web-based material and read without leaving their work, while the second part comprised a one-day hands-on evaluation. Trainees' competency was measured using multiple-choice questions, written assignments, exercises, and hands-on evaluation. Of 108 participants invited, 81 (75%) attended the course and 71 (88%) of them successfully completed it. Of those who completed, 73 (90%) scored a grade from A to C. The approach was effective in transferring knowledge and turning it into practical skills. In-service health training should transform from a passive one-time event to a continuous behavioral change of participants and improvements in their actual work.
Keywords: Ethiopia, health care, Mycobacterium tuberculosis, training
Procedia PDF Downloads 504
1965 Tomato-Weed Classification by RetinaNet One-Step Neural Network
Authors: Dionisio Andujar, Juan lópez-Correa, Hugo Moreno, Angela Ri
Abstract:
The increased number of weeds in tomato crops greatly lowers yields. Weed identification with the aid of machine learning is important to carry out site-specific control. The latest advances in computer vision are a powerful tool to face the problem. The analysis of RGB (Red, Green, Blue) images through Artificial Neural Networks has developed rapidly in the past few years, providing new methods for weed classification. The development of the algorithms for crop and weed species classification looks for a real-time classification system using Object Detection algorithms based on Convolutional Neural Networks. The site study was located in commercial corn fields. The classification system has been tested. The procedure can detect and classify weed seedlings in tomato fields. The input to the Neural Network was a set of 10,000 RGB images with a natural infestation of Cyperus rotundus L., Echinochloa crus-galli L., Setaria italica L., Portulaca oleracea L., and Solanum nigrum L. The validation process was done with a random selection of RGB images containing the aforementioned species. The mean average precision (mAP) was established as the metric for object detection. The results showed agreement higher than 95%. The system will provide the input for an online spraying system. Thus, this work plays an important role in Site-Specific Weed Management by reducing herbicide use in a single step.
Keywords: deep learning, object detection, CNN, tomato, weeds
Procedia PDF Downloads 103
1964 Rapid, Label-Free, Direct Detection and Quantification of Escherichia coli Bacteria Using Nonlinear Acoustic Aptasensor
Authors: Shilpa Khobragade, Carlos Da Silva Granja, Niklas Sandström, Igor Efimov, Victor P. Ostanin, Wouter van der Wijngaart, David Klenerman, Sourav K. Ghosh
Abstract:
Rapid, label-free and direct detection of pathogenic bacteria is critical for the prevention of disease outbreaks. This paper for the first time attempts to probe the nonlinear acoustic response of a quartz crystal resonator (QCR) functionalized with specific DNA aptamers for direct detection and quantification of viable E. coli KCTC 2571 bacteria. DNA aptamers were immobilized, through biotin and streptavidin conjugation, onto the gold surface of the QCR to capture the target bacteria, and the detection was accomplished by the shift in amplitude of the peak 3f signal (3 times the drive frequency) upon binding, when driven near the fundamental resonance frequency. The developed nonlinear acoustic aptasensor system demonstrated better reliability than conventional resonance frequency shift and energy dissipation monitoring, which were recorded simultaneously. This sensing system could directly detect 10⁵ cells/mL of target bacteria within 30 min or less and had high specificity towards E. coli KCTC 2571 bacteria as compared to the same concentration of S. typhi bacteria. Aptasensor response was observed for bacterial suspensions ranging from 10⁵-10⁸ cells/mL. Conclusively, this nonlinear acoustic aptasensor is simple to use, gives real-time output, is cost-effective, and has the potential for rapid, specific, label-free direct detection of bacteria.
Keywords: acoustic, aptasensor, detection, nonlinear
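A minimal sketch of how the amplitude of the peak 3f signal can be read out with an FFT is given below; the drive frequency, sampling rate, and synthetic waveform are assumptions for illustration, not instrument settings from the study.

```python
# Extract the 3f-component amplitude of a resonator signal with an FFT.
# The signal is synthetic; in the study this value shifts upon bacterial binding.
import numpy as np

fs = 200e6                      # sampling rate (assumed)
f_drive = 14.3e6                # fundamental drive frequency (assumed)
t = np.arange(0, 2e-4, 1 / fs)
signal = (1.0 * np.sin(2 * np.pi * f_drive * t)
          + 0.02 * np.sin(2 * np.pi * 3 * f_drive * t))   # weak 3f component

spectrum = np.abs(np.fft.rfft(signal)) / (len(t) / 2)     # single-sided amplitude
freqs = np.fft.rfftfreq(len(t), d=1 / fs)
idx = np.argmin(np.abs(freqs - 3 * f_drive))
print(f"3f amplitude ~ {spectrum[idx]:.4f} (binding would shift this value)")
```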
Procedia PDF Downloads 567
1963 Development of an Auxetic Tissue Implant
Authors: Sukhwinder K. Bhullar, M. B. G. Jun
Abstract:
Developments in the biomedical industry have demanded the development of biocompatible, high-performance materials to meet higher engineering specifications. The general requirements of such materials are to provide a combination of high stiffness and strength with significant weight savings, resistance to corrosion, chemical resistance, low maintenance, and reduced costs. Auxetic materials, which come under the category of smart materials, offer huge potential through measured enhancements in mechanical properties. Their unique deformation mechanism, which provides cushioning on indentation, automatically adjusts strength and thickness in response to forces, and, having memory, returns to its neutral state on dissipation of stresses, makes them good candidates in the biomedical industry. As simple extension and compression of tissues is of fundamental importance in biomechanics, the elastic behaviour of an auxetic soft tissue implant is targeted in this paper; accordingly, the development and characterization of an auxetic soft tissue implant are studied. This represents a real-life configuration, where soft tissues such as the meniscus in knee replacement, ligaments, and tendons are often taken as transversely isotropic. Further, the composition of alternating polydisperse blocks of soft and stiff segments, combined with excellent biocompatibility, makes polyurethanes one of the most promising synthetic biomaterials. Hence, an auxetic polyurethane foam is selected, its functional characterization is performed, and it is compared with conventional polyurethane foam.
Keywords: auxetic materials, deformation mechanism, enhanced mechanical properties, soft tissues
Procedia PDF Downloads 459
1962 Progress in Combining Image Captioning and Visual Question Answering Tasks
Authors: Prathiksha Kamath, Pratibha Jamkhandi, Prateek Ghanti, Priyanshu Gupta, M. Lakshmi Neelima
Abstract:
Combining Image Captioning and Visual Question Answering (VQA) tasks has emerged as a new and exciting research area. The image captioning task involves generating a textual description that summarizes the content of the image. VQA aims to answer a natural language question about the image. Both these tasks involve computer vision and natural language processing (NLP) and require a deep understanding of the content of the image and the semantic relationships within the image, as well as the ability to generate a response in natural language. There has been remarkable growth in both these tasks with the rapid advancement of deep learning. In this paper, we present a comprehensive review of recent progress in combining image captioning and visual question answering (VQA) tasks. We first discuss both image captioning and VQA tasks individually and then the various ways in which both these tasks can be integrated. We also analyze the challenges associated with these tasks and ways to overcome them. We finally discuss the various datasets and evaluation metrics used in these tasks. This paper concludes with the need for generating captions based on the context, and captions that are able to answer the most likely asked questions about the image, so as to aid the VQA task. Overall, this review highlights the significant progress made in combining image captioning and VQA, as well as the ongoing challenges and opportunities for further research in this exciting and rapidly evolving field, which has the potential to improve the performance of real-world applications such as autonomous vehicles, robotics, and image search.
Keywords: image captioning, visual question answering, deep learning, natural language processing
Procedia PDF Downloads 73
1961 Propeller Performance Modeling through a Computational Fluid Dynamics Analysis Method
Authors: Maxime Alex Junior Kuitche, Ruxandra Mihaela Botez, Jean-Chirstophe Maunand
Abstract:
The evolution of aircraft is closely linked to the study and improvement of propulsion systems. Determining the propulsion performance is a real challenge in aircraft modeling and design. In addition to theoretical methodologies, experimental procedures are used to obtain a good estimation of the propulsion performance. For piston-propeller propulsion, the propeller needs several experimental tests, which could be extremely demanding in terms of time and money. This paper presents a new procedure to estimate the performance of a propeller from a numerical approach using computational fluid dynamics analysis. The propeller was initially scanned, and then its 3D model was represented using CATIA. A structured meshing and the Shear Stress Transport (SST) k-ω turbulence model were applied to describe accurately the flow pattern around the propeller. Thus, the Partial Differential Equations were solved using ANSYS FLUENT software. The method was applied on the UAS-S45's propeller designed and manufactured by Hydra Technologies in Mexico. An extensive investigation was performed for several flight conditions in terms of altitudes and airspeeds, with the aim of determining thrust coefficients, power coefficients, and the efficiency of the propeller. The Computational Fluid Dynamics results were compared with experimental data acquired from wind tunnel tests performed at the LARCASE Price-Paidoussis wind tunnel. The results of this comparison demonstrated that our approach was highly accurate.
Keywords: CFD analysis, propeller performance, unmanned aerial system propeller, UAS-S45
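For reference, the thrust coefficient, power coefficient, and propeller efficiency mentioned above are commonly defined as CT = T/(ρn²D⁴), CP = P/(ρn³D⁵), and η = J·CT/CP; the sketch below uses these textbook definitions with invented numbers, since the abstract does not give the exact formulation or the UAS-S45 data.

```python
# Standard propeller non-dimensional coefficients (textbook definitions assumed,
# not taken from the paper). J is the advance ratio, eta the propulsive efficiency.
def propeller_coefficients(thrust_N, power_W, v_inf, rpm, diameter_m, rho=1.225):
    n = rpm / 60.0                        # revolutions per second
    J = v_inf / (n * diameter_m)          # advance ratio
    ct = thrust_N / (rho * n**2 * diameter_m**4)
    cp = power_W / (rho * n**3 * diameter_m**5)
    eta = J * ct / cp
    return J, ct, cp, eta

# Illustrative numbers only (not UAS-S45 data):
print(propeller_coefficients(thrust_N=40.0, power_W=2200.0,
                             v_inf=25.0, rpm=5000, diameter_m=0.71))
```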
Procedia PDF Downloads 353
1960 Refined Edge Detection Network
Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni
Abstract:
Edge detection is one of the most challenging tasks in computer vision, due to the complexity of detecting edges or boundaries in real-world images that contain objects of different types and scales, like trees and buildings, as well as various backgrounds. Edge detection is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved the detection of edges compared with traditional methods like Sobel and Canny. However, images of complex scenes still represent a challenge for these methods. Also, the edges detected using the existing approaches suffer from unrefined results, and the output image contains many erroneous edges. To overcome this, in this paper, by using the mechanism of residual learning, a refined edge detection network (RED-Net) is proposed. By maintaining the high resolution of edges during the training process and conserving the resolution of the edge image during the network stages, we make the pooling outputs at each stage connected with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for the homogeneous regions in the image. The proposed method is evaluated using the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform previously designed edge detection networks in terms of performance metrics and quality of output images.
Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone
Procedia PDF Downloads 102
1959 Review of Theories and Applications of Genetic Programing in Sediment Yield Modeling
Authors: Adesoji Tunbosun Jaiyeola, Josiah Adeyemo
Abstract:
Sediment yield can be considered to be the total sediment load that leaves a drainage basin. Knowledge of the quantity of sediment present in a river at a particular time can lead to better flood capacity in reservoirs and consequently help to control over-bank flooding. Furthermore, as sediment accumulates in the reservoir, it gradually loses its ability to store water for the purposes for which it was built. The development of hydrological models to forecast the quantity of sediment present in a reservoir helps planners and managers of water resources systems to understand the system better in terms of its problems and alternative ways to address them. The application of artificial intelligence models and techniques to such real-life situations has proven to be an effective approach to solving complex problems. This paper makes an extensive review of literature relevant to the theories and applications of evolutionary algorithms, and most especially genetic programming. The successful applications of genetic programming as a soft computing technique were reviewed in sediment modelling and other branches of knowledge. Some fundamental issues such as benchmarking, generalization ability, bloat and over-fitting, and other open issues relating to the working principles of GP, which need to be addressed by the GP community, were also highlighted. This review aims to give GP theoreticians, researchers and the general GP community enough research direction and valuable guidance, and also to keep all stakeholders abreast of the issues which need attention during the next decade for the advancement of GP.
Keywords: benchmark, bloat, generalization, genetic programming, over-fitting, sediment yield
Procedia PDF Downloads 446
1958 A Mathematical Agent-Based Model to Examine Two Patterns of Language Change
Authors: Gareth Baxter
Abstract:
We use a mathematical model of language change to examine two recently observed patterns of language change: one in which most speakers change gradually, following the mean of the community change, and one in which most individuals use predominantly one variant or another, and change rapidly if they change at all. The model is based on Croft’s Utterance Selection account of language change, which views language change as an evolutionary process, in which different variants (different ‘ways of saying the same thing’) compete for usage in a population of speakers. Language change occurs when a new variant replaces an older one as the convention within a given population. The present model extends a previous simpler model to include effects related to speaker aging and interspeaker variation in behaviour. The two patterns of individual change (one more centralized and the other more polarized) were recently observed in historical language changes, and it was further observed that slower changes were more associated with the centralized pattern, while quicker changes were more polarized. Our model suggests that the two patterns of change can be explained by different balances between the preference of speakers to use one variant over another and the degree of accommodation to (propensity to adapt towards) other speakers. The correlation with the rate of change appears naturally in our model, and results from the fact that both differential weighting of variants and the degree of accommodation affect the time for change to occur, while also determining the patterns of change. This work represents part of an ongoing effort to examine phenomena in language change through the use of mathematical models. This offers another way to evaluate qualitative explanations that cannot be practically tested (or cannot be tested at all) in a real-world, large-scale speech community.
Keywords: agent based modeling, cultural evolution, language change, social behavior modeling, social influence
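A minimal agent-based sketch in the spirit of the model described above is shown below; the update rule, the accommodation and bias parameters, and all numbers are simplified assumptions for illustration rather than the actual Utterance Selection formulation.

```python
# Toy variant-competition model: each speaker holds a usage frequency x in [0, 1]
# for the incoming variant, accommodates towards an interlocutor, and applies a
# small bias b favouring that variant. This is an illustrative simplification.
import random

def simulate(n_speakers=100, steps=20000, accommodation=0.05, bias=0.002, seed=1):
    random.seed(seed)
    x = [0.05] * n_speakers                 # initial usage of the new variant
    for _ in range(steps):
        i, j = random.sample(range(n_speakers), 2)
        # speaker i hears speaker j, shifts towards j's usage, plus a weak bias
        x[i] += accommodation * (x[j] - x[i]) + bias * (1.0 - x[i])
        x[i] = min(max(x[i], 0.0), 1.0)
    return sum(x) / n_speakers

print("mean usage of new variant:", round(simulate(), 3))
```

Varying the balance between accommodation and bias in such a sketch is one way to probe whether individual trajectories look centralized or polarized.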
Procedia PDF Downloads 235
1957 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
Leaning on the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one of its applications, characterized as fast and real-time. This paper attempts to provide an application of a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within Object-Based Image Analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, thus improving the hierarchical class feature pattern, which allows unnecessary calculation to be skipped. Since detection is executed on the object-oriented platform, mathematical morphology and multi-level filter algorithms were established to effectively avoid the influence of noise, small distortions, and fluctuating image saturation that affect the recognition rate of features. Furthermore, the scheme is evaluated to assess its performance in different situations and to inspect the computational complexity of the algorithms. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. Also, the garnered results portray the potential and efficiency of the implemented algorithm under different lighting conditions.
Keywords: algorithm, LiDAR, object recognition, OBIA
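A minimal sketch of the normalized cross-correlation (NCC) criterion at the core of the matching scheme is given below; the image and template are synthetic stand-ins for an aerial image chip, not data from the study.

```python
# Brute-force template matching with the NCC criterion (synthetic data).
import numpy as np

def ncc(patch, template):
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    h, w = template.shape
    best, best_pos = -1.0, (0, 0)
    for r in range(image.shape[0] - h + 1):
        for c in range(image.shape[1] - w + 1):
            score = ncc(image[r:r + h, c:c + w], template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

rng = np.random.default_rng(0)
image = rng.random((60, 60))
template = image[20:30, 35:45].copy()      # plant the target in the image
print(match_template(image, template))     # -> ((20, 35), ~1.0)
```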
Procedia PDF Downloads 245
1956 Classification of Hyperspectral Image Using Mathematical Morphological Operator-Based Distance Metric
Authors: Geetika Barman, B. S. Daya Sagar
Abstract:
In this article, we propose a pixel-wise classification of hyperspectral images using mathematical morphology operator-based distance metrics called “dilation distance” and “erosion distance”. This method involves measuring the spatial distance between the spectral features of a hyperspectral image across the bands. The key concept of the proposed approach is that the “dilation distance” is the maximum distance a pixel can be moved without changing its classification, whereas the “erosion distance” is the maximum distance that a pixel can be moved before changing its classification. The spectral signature of the hyperspectral image carries unique class information and shape for each class. This article demonstrates how easily the dilation and erosion distances can measure spatial distance compared to other approaches. This property is used to calculate the spatial distance between hyperspectral image feature vectors across the bands. The dissimilarity matrix is then constructed using both measures extracted from the feature spaces. The measured distance metric is used to distinguish between the spectral features of various classes and to precisely distinguish each class. This is illustrated using both toy data and real datasets. Furthermore, we investigated the role of flat vs. non-flat structuring elements in capturing the spatial features of each class in the hyperspectral image. In order to validate the approach, we compared it to other existing methods and demonstrated empirically that mathematical morphology operator-based distance metric classification provided competitive results and outperformed some of them.
Keywords: dilation distance, erosion distance, hyperspectral image classification, mathematical morphology
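One possible morphological reading of the dilation distance is sketched below for two spectral signatures: the number of unit flat dilations of one signature needed until it covers the other at every band. This definition and the synthetic signatures are assumptions for illustration, and the authors' exact formulation may differ.

```python
# Illustrative "dilation distance" between two 1D spectral signatures (assumed
# definition): count flat dilations of a until dilated(a) >= b at every band.
import numpy as np
from scipy.ndimage import grey_dilation

def dilation_distance(a, b, max_steps=200):
    current = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    for k in range(max_steps + 1):
        if np.all(current >= b):
            return k
        current = grey_dilation(current, size=3)   # flat 3-band structuring element
    return max_steps                               # signatures too dissimilar

bands = np.linspace(0, 1, 50)
sig_a = np.exp(-((bands - 0.4) ** 2) / 0.01)       # synthetic class signature
sig_b = 0.9 * np.exp(-((bands - 0.5) ** 2) / 0.01)
print("dilation distance a->b:", dilation_distance(sig_a, sig_b))
```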
Procedia PDF Downloads 87
1955 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangulation laser probe and a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time statistical analysis of the different measured data. This structure was used to design a system integration program: the triangulation laser probe performs scattering- or reflection-based non-contact measurement, the captured signals are transferred to the computer through RS-232, and RS-485 is used to control the three-axis platform for a wide range of measurement. The data captured by the laser probe are formed into a 3D surface. This study constructed an optical measurement application program based on the concept of a visual programming language. First, the signals are transmitted to the computer through RS-232/RS-485, and then the signals are stored and recorded in the graphical interface in a timely manner. This programming concept analyzes the various messages, produces proper presentation graphs and data processing to provide the users with friendly graphical interfaces and data-processing state monitoring, and indicates graphically whether the present data are normal. The major functions of the measurement system developed by this study are thickness measurement, SPC, surface smoothness analysis, and analytical calculation of the trend line. A result report can be made and printed promptly. This study measured different heights and surfaces successfully, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
Procedia PDF Downloads 360
1954 Analysis of Accurate Direct-Estimation of the Maximum Power Point and Thermal Characteristics of High Concentration Photovoltaic Modules
Authors: Yan-Wen Wang, Chu-Yang Chou, Jen-Cheng Wang, Min-Sheng Liao, Hsuan-Hsiang Hsu, Cheng-Ying Chou, Chen-Kang Huang, Kun-Chang Kuo, Joe-Air Jiang
Abstract:
Performance-related parameters of high concentration photovoltaic (HCPV) modules (e.g., current and voltage) are required when estimating the maximum power point using numerical and approximation methods. The maximum power point on the characteristic curve of a photovoltaic module varies when temperature or solar radiation changes. It is also difficult to estimate the output performance and maximum power point (MPP) due to the special characteristics of HCPV modules. Based on p-n junction semiconductor theory, a new and simple method is presented in this study to directly evaluate the MPP of HCPV modules. The MPP of HCPV modules can be determined from an irradiated I-V characteristic curve, because there is a non-linear relationship between the temperature of a solar cell and solar radiation. Numerical simulations and field tests are conducted to examine the characteristics of HCPV modules during maximum output power tracking. The performance of the presented method is evaluated by examining the dependence of the MPP characteristics of HCPV modules on temperature and irradiation intensity. These results show that the presented method allows HCPV modules to achieve their maximum power and perform power tracking under various operating conditions. A 0.1% error is found between the estimated and the real maximum power point.
Keywords: energy performance, high concentrated photovoltaic, maximum power point, p-n junction semiconductor
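A minimal sketch of locating the MPP on a sampled I-V characteristic curve is shown below; the single-exponential synthetic curve and its parameters are assumptions for illustration, not HCPV measurement data.

```python
# Locate the maximum power point on a sampled I-V curve (synthetic curve).
import numpy as np

isc, voc = 5.0, 3.1                                  # assumed short-circuit current / open-circuit voltage
v = np.linspace(0.0, voc, 400)                       # voltage sweep (V)
i = np.clip(isc * (1.0 - np.exp((v - voc) / 0.15)), 0.0, None)   # synthetic current (A)
p = v * i                                            # power at each sample point
k = int(np.argmax(p))
print(f"MPP: V = {v[k]:.2f} V, I = {i[k]:.2f} A, P = {p[k]:.2f} W")
```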
Procedia PDF Downloads 584
1953 Rapid and Efficient Removal of Lead from Water Using Chitosan/Magnetite Nanoparticles
Authors: Othman M. Hakami, Abdul Jabbar Al-Rajab
Abstract:
The occurrence of heavy metals in water resources has increased in recent years, albeit at low concentrations. Lead (Pb(II)) is among the most important inorganic pollutants in ground and surface water, and the efficient removal of this toxic metal from water is of public and scientific concern. In this study, we developed a rapid and efficient method for the removal of lead from water using chitosan/magnetite nanoparticles. A simple and effective process has been used to prepare chitosan/magnetite nanoparticles (CS/Mag NPs) with an effect on the saturation magnetization value; the particles were strongly responsive to an external magnetic field, making separation from solution possible in less than 2 minutes using a permanent magnet, and the total Fe in solution was below the detection limit of ICP-OES (<0.19 mg L⁻¹). The hydrodynamic particle size distribution increased from an average diameter of ~60 nm for Fe3O4 NPs to ~75 nm after chitosan coating. The feasibility of the prepared NPs for the adsorption and desorption of Pb(II) from water was evaluated; the CS/Mag NPs showed a high removal efficiency for Pb(II) uptake, with 90% of Pb(II) removed during the first 5 minutes and equilibrium reached in less than 10 minutes. The maximum adsorption capacity for Pb(II), obtained at pH 6.0 and room temperature, was as high as 85.5 mg g⁻¹ according to the Langmuir isotherm model. Desorption of the Pb adsorbed on the CS/Mag NPs was evaluated using deionized water at different pH values ranging from 1 to 7, which was an effective eluent and did not result in the destruction of the NPs; they could subsequently be reused without any loss of their activity in further adsorption tests. Overall, our results showed the high efficiency of chitosan/magnetite nanoparticles in lead removal from water under controlled conditions, and further studies should be conducted under real field conditions.
Keywords: chitosan, magnetite, water, treatment
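A minimal sketch of fitting the Langmuir isotherm q_e = q_max·K_L·C_e/(1 + K_L·C_e) is given below; the equilibrium data points are synthetic placeholders, and only the reported capacity of 85.5 mg/g is borrowed to generate them.

```python
# Fit the Langmuir isotherm to (synthetic) equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    return qmax * kl * ce / (1.0 + kl * ce)

ce = np.array([2.0, 5.0, 10.0, 20.0, 50.0, 100.0])          # mg/L (assumed)
qe = langmuir(ce, 85.5, 0.08) + np.random.default_rng(0).normal(0, 1.0, ce.size)

(qmax_fit, kl_fit), _ = curve_fit(langmuir, ce, qe, p0=[80.0, 0.1])
print(f"fitted q_max = {qmax_fit:.1f} mg/g, K_L = {kl_fit:.3f} L/mg")
```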
Procedia PDF Downloads 404
1952 Investigating Dynamic Transition Process of Issues Using Unstructured Text Analysis
Authors: Myungsu Lim, William Xiu Shun Wong, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Namgyu Kim
Abstract:
The amount of real-time data generated through various mass media has been increasing rapidly. In this study, we performed topic analysis using unstructured text data distributed through news articles. As one of the most prevalent applications of topic analysis, the issue tracking technique investigates the changes in social issues identified through topic analysis. Currently, traditional issue tracking is conducted by identifying the main topics of documents that cover an entire period at the same time and analyzing the occurrence of each topic by period of occurrence. However, this traditional issue tracking approach has the limitation that it cannot discover the dynamic mutation process of complex social issues. The purpose of this study is to overcome the limitations of the existing issue tracking method. We first derive the core issues of each period, and then discover the dynamic mutation process of the various issues. In this study, we further analyze the mutation process from the perspective of issue categories, in order to figure out the pattern of issue flow, including the frequency and reliability of the pattern. In other words, this study allows us to understand the components of complex issues by tracking the dynamic history of issues. This methodology can facilitate a clearer understanding of complex social phenomena by providing the mutation history and related category information of the phenomena.
Keywords: data mining, issue tracking, text mining, topic analysis, topic detection, trend detection
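A minimal sketch of deriving per-period topics from news text is shown below; the use of LDA and the scikit-learn pipeline, as well as the toy documents, are assumptions for illustration, since the abstract does not name the specific topic model.

```python
# Per-period topic extraction with LDA (assumed model choice; toy documents).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

periods = {
    "2015-Q1": ["economy slows amid export decline", "trade dispute hits exports"],
    "2015-Q2": ["election campaign begins", "candidates debate economy and trade"],
}

for period, docs in periods.items():
    vec = CountVectorizer(stop_words="english")
    dtm = vec.fit_transform(docs)                      # document-term matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
    terms = vec.get_feature_names_out()
    for k, comp in enumerate(lda.components_):
        top = [terms[i] for i in comp.argsort()[-3:][::-1]]
        print(period, f"topic {k}:", ", ".join(top))   # per-period core issues
```

Tracking how the top terms of each period's topics overlap with those of the next period is one simple way to trace the mutation of an issue over time.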
Procedia PDF Downloads 408
1951 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences
Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi
Abstract:
Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend of designing office buildings with a high proportion of glazing, which correspondingly increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at a very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour in computer simulation to provide a comprehensive lighting analysis. In this research, a detailed computer simulation model has been made using Radiance and Daysim. Afterwards, this model was validated by measurements and user feedback. The case study building is the School of Science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort incorporating the effects of gaze shift patterns and views, with the goal of designing effective layouts for office spaces.
Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort
Procedia PDF Downloads 213
1950 Health Exposure Assessment of Sulfur Loading Operation
Authors: Ayman M. Arfaj, Jose Lauro M. Llamas, Saleh Y Qahtani
Abstract:
Sulfur loading operation (SLO) is an operation that poses a risk of exposure to toxic gases such as hydrogen sulfide and sulfur dioxide during molten sulfur loading. In this operation, molten sulfur is loaded into a truck tanker in a liquid state, and the temperature of the tanker must maintain the liquid sulfur within a 43-degree range, between 266 degrees and 309 degrees Fahrenheit, in order for safe loading and unloading to occur. Accordingly, in this study, the potential risk of occupational exposure to the airborne toxic gases was assessed at three sulfur loading facilities. The concentrations of toxic airborne substances such as hydrogen sulfide (H2S) and sulfur dioxide (SO2) were monitored during operations at the different locations within the sulfur loading facilities. In addition to extensive real-time monitoring, over one hundred and fifty samples were collected and analysed at internationally accredited laboratories. The concentrations of H2S and SO2 were all found to be well below their respective occupational exposure limits. Very low levels of H2S account for the odours observed intermittently during mixing and application operations but do not pose a considerable health risk, and hence these levels are considered a nuisance. These results were comparable to those reported internationally. Aside from observing the usual general safe work practices, such as wearing safety glasses, there are no specific occupational health-related concerns at the examined sulfur loading facilities.
Keywords: exposure assessment, sulfur loading operation, health risk study, molten sulfur, toxic airborne substances, air contaminants monitoring
Procedia PDF Downloads 77
1949 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark
Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos
Abstract:
This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing beforehand the effects of each factor becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results about the small influence of cores and the neutrality of memory and disks on total execution time, and the non-significant impact of data input scale on costs, although it notably impacts the execution time.
Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark
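A minimal sketch of fitting a linear predictor over two-level coded factors, as in a fractional factorial design, is given below; the eight runs and the execution times are invented placeholders, and only the modelling step mirrors the description above.

```python
# Least-squares fit of main effects on a two-level coded design (invented runs).
import numpy as np

# columns: data size, nodes, cores, memory, disks (coded -1 = low, +1 = high)
X = np.array([
    [-1, -1, -1, -1, -1],
    [ 1, -1, -1,  1,  1],
    [-1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1],
    [-1, -1,  1,  1,  1],
    [ 1, -1,  1, -1, -1],
    [-1,  1,  1, -1,  1],
    [ 1,  1,  1,  1, -1],
], dtype=float)
y = np.array([820, 1450, 600, 1180, 790, 1500, 610, 1150], dtype=float)  # seconds

A = np.hstack([np.ones((X.shape[0], 1)), X])          # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in zip(["intercept", "data", "nodes", "cores", "memory", "disks"], coef):
    print(f"{name:>9}: {c:8.1f}")
```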
Procedia PDF Downloads 120
1948 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach
Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed
Abstract:
Sign Languages (SL) are the most accomplished forms of gestural communication. Therefore, their automatic analysis is a real challenge, which is closely tied to their lexical and syntactic organization levels. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for visual recognition of complex, structured hand gestures such as are found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on Type-2 Fuzzy HMMs (T2FHMM) are presented. The features used as observables in the training as well as in the recognition phases are based on Singular Value Decomposition (SVD). SVD is an extension of eigendecomposition to suit non-square matrices, used to reduce multi-attribute hand gesture data to feature vectors. SVD optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators by adequate Type-2 fuzzy operators that permit us to relax the additive constraint of probability measures. Therefore, T2FHMMs are able to handle both random and fuzzy uncertainties existing universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals, besides a better classification performance than the classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov Model
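A minimal sketch of the SVD-based feature extraction step is shown below; the random stand-in image and the normalization choice are assumptions for illustration.

```python
# Reduce a hand-gesture image to a fixed-length vector of leading singular values.
import numpy as np

def svd_features(image, k=10):
    s = np.linalg.svd(np.asarray(image, dtype=float), compute_uv=False)
    s = s / (np.linalg.norm(s) + 1e-12)   # normalization is an assumed choice
    return s[:k]

gesture_image = np.random.default_rng(0).random((64, 64))   # stand-in image
print(svd_features(gesture_image).round(3))                  # observation vector for the HMM
```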
Procedia PDF Downloads 462
1947 Metaphorical Devices in Political Cartoons with Reference to Political Confrontation in Pakistan after Panama Leaks
Authors: Ayesha Ashfaq, Muhammad Ajmal Ashfaq
Abstract:
It has been assumed that metaphorical and symbolic contests are waged with metaphors, captions, and signs in political cartoons, which play a significant role in the image construction of political actors, situations, or events in the political arena. This paper is an effort to explore the metaphorical devices in political cartoons related to the political confrontation in Pakistan between the ruling party, Pakistan Muslim League Nawaz (PMLN), and the opposition parties, especially after the Panama leaks. For this purpose, political cartoons sketched by five renowned political cartoonists were selected on the basis of their affiliation with the most highly circulated mainstream English newspapers of Pakistan and their professional experience in the genre. The cartoons were analyzed through Barthes's model of semiotics under the umbrella of framing, the first level of agenda-setting theory. It was observed that metaphorical devices in political cartoons are one of the key weapons in the cartoonist's armory. These devices are used to attack the candidates and contribute to image and character building. It was found that all the selected political cartoonists used different forms of metaphors, including situational metaphors and embodying metaphors. Not only the physical stature but also the debates and activities of the political actors were depicted metaphorically in the cartoons, creating a comparison between the cartoons and the real political confrontation. Both forms of metaphor shed light on the cartoonists' perceptions and the newspapers' policies about political candidates, political parties, and particular events. In addition, it was found that zoomorphic metaphors and metaphors of diminishment were also predominantly used to depict the conflict between the two said political actors.
Keywords: metaphor, Panama leaks, political cartoons, political communication
Procedia PDF Downloads 307
1946 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity
Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib
Abstract:
The foundation design of a structure needs soil investigation to avoid failures due to settlements. This soil investigation is expensive and time-consuming. The development of new residential societies involves extensive leveling of large sites, which is accompanied by heavy land filling. Poor land-filling practices at deep depths cause differential settlements and consolidation of the underlying soil, which sometimes result in the collapse of structures. The extent of filling remains unknown to the individual developer unless a soil investigation is carried out. Soil investigation cannot be performed on every available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas. Geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches have been used to develop maps using extrapolation and interpolation techniques for bearing capacity, underground recharge, soil classification, geological hazards, landslide hazards, socio-economic factors, and soil liquefaction mapping. Standard penetration test (SPT) data of the surrounding sites were already available. Google Earth is used for digitization of the collected data. A few points were considered for data calibration and validation. The resulting geographic information system (GIS)-based guidance maps are helpful for anticipating the bearing capacity in the real estate industry.
Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function
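Since inverse distance weighting is listed among the keywords, a minimal sketch of IDW interpolation of bearing capacity from nearby SPT-derived points is given below; the coordinates and values are invented placeholders.

```python
# Inverse distance weighted (IDW) interpolation of allowable bearing capacity.
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    if np.any(d < 1e-9):                      # query coincides with a known point
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

pts = np.array([[0.0, 0.0], [120.0, 40.0], [60.0, 150.0], [200.0, 180.0]])  # m (assumed)
qa = np.array([95.0, 120.0, 80.0, 150.0])     # kPa, assumed SPT-derived capacities
print(idw(pts, qa, np.array([100.0, 100.0])), "kPa")
```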
Procedia PDF Downloads 135
1945 Impact of the Electricity Market Prices during the COVID-19 Pandemic on Energy Storage Operation
Authors: Marin Mandić, Elis Sutlović, Tonći Modrić, Luka Stanić
Abstract:
With the restructuring and deregulation of the power system, storage owners, generation companies, or private producers can offer their multiple services on various power markets and earn income in different types of markets, such as the day-ahead, real-time, and ancillary services markets. During the COVID-19 pandemic, electricity prices, as well as ancillary services prices, increased significantly. The optimization of the energy storage operation was performed using a suitable model for simulating the operation of a pumped storage hydropower plant under market conditions. The objective function maximizes the income earned through energy arbitrage, regulation-up, regulation-down, and spinning reserve services. The optimization technique used for solving the objective function is mixed integer linear programming (MILP). In numerical examples, the pumped storage hydropower plant operation has been optimized considering the realized hourly electricity market prices from Nord Pool for the pre-pandemic (2019) and pandemic (2020 and 2021) years. The impact of the electricity market prices during the COVID-19 pandemic on energy storage operation is shown through the analysis of income, operating hours, reserved capacity, and consumed energy for each service. The results indicate the role of energy storage during significant fluctuations in electricity and services prices.
Keywords: electrical market prices, electricity market, energy storage optimization, mixed integer linear programming (MILP) optimization
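A minimal MILP sketch of the energy arbitrage part of such a model, written with PuLP, is given below; the prices, efficiencies, and ratings are illustrative assumptions, and the paper's full model also co-optimizes regulation and spinning reserve services.

```python
# Day-ahead energy arbitrage MILP for a pumped-storage unit (illustrative data).
import pulp

prices = [30, 25, 20, 35, 60, 90, 80, 55]            # EUR/MWh over 8 hours (assumed)
p_max, e_max, eta = 100.0, 400.0, 0.75               # MW, MWh, pumping efficiency (assumed)

m = pulp.LpProblem("arbitrage", pulp.LpMaximize)
T = range(len(prices))
gen = [pulp.LpVariable(f"gen_{t}", 0, p_max) for t in T]        # generation (MW)
pump = [pulp.LpVariable(f"pump_{t}", 0, p_max) for t in T]      # pumping (MW)
mode = [pulp.LpVariable(f"mode_{t}", cat="Binary") for t in T]  # 1 = generate mode
soc = [pulp.LpVariable(f"soc_{t}", 0, e_max) for t in range(len(prices) + 1)]

m += pulp.lpSum(prices[t] * (gen[t] - pump[t]) for t in T)      # market income
m += soc[0] == 0.5 * e_max                                      # initial storage level
for t in T:
    m += gen[t] <= p_max * mode[t]                              # cannot pump and
    m += pump[t] <= p_max * (1 - mode[t])                       # generate at once
    m += soc[t + 1] == soc[t] + eta * pump[t] - gen[t]          # hourly energy balance

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("income:", pulp.value(m.objective))
```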
Procedia PDF Downloads 174
1944 Decoding WallStreetBets: The Impact of Daily Disagreements on Trading Volumes
Authors: F. Ghandehari, H. Lu, L. El-Jahel, D. Jayasuriya
Abstract:
Disagreement among investors is a fundamental aspect of financial markets, significantly influencing market dynamics. Measuring this disagreement has traditionally posed challenges, often relying on proxies like analyst forecast dispersion, which are limited by biases and infrequent updates. Recent movements in social media indicate that retail investors actively seek financial advice online and can influence the stock market. The evolution of the investing landscape, particularly the rise of social media as a hub for financial advice, provides an alternative avenue for real-time measurement of investor sentiment and disagreement. Platforms like Reddit offer rich, community-driven discussions that reflect genuine investor opinions. This research explores how social media empowers retail investors and the potential of leveraging textual analysis of social media content to capture daily fluctuations in investor disagreement. This study investigates the relationship between daily investor disagreement and trading volume, focusing on the role of social media platforms in shaping market dynamics, specifically using data from WallStreetBets (WSB) on Reddit. This paper uses data from 2020 to 2023 from WSB and analyses 4,896 firms with enough social media activity in WSB to define stock-day level disagreement measures. Consistent with traditional theories that disagreement induces trading volume, the results show significant evidence supporting this claim through different disagreement measures derived from WSB discussions.
Keywords: disagreement, retail investor, social finance, social media
Procedia PDF Downloads 39
1943 Mining Riding Patterns in Bike-Sharing System Connecting with Public Transportation
Authors: Chong Zhang, Guoming Tang, Bin Ge, Jiuyang Tang
Abstract:
With fast-growing road traffic and increasingly severe traffic congestion, more and more citizens choose to use public transportation for daily travel. Meanwhile, the shared bike provides a convenient option for the first and last mile to public transit. As of 2016, over one thousand cities around the world had deployed bike-sharing systems. The combination of these two modes of transportation has stimulated the development of each other and made a significant contribution to the reduction of the carbon footprint. A lot of work has been done on mining riding behaviors in various bike-sharing systems. Most of it, however, treated the bike-sharing system as an isolated system, and thus the results provide little reference for public transit construction and optimization. In this work, we treat bike-sharing and public transit as a whole and investigate customers' bike-and-ride behaviors. Specifically, we develop a spatio-temporal traffic delivery model to study the riding patterns between the two transportation systems and explore the traffic characteristics (e.g., distributions of customer arrivals/departures and traffic peak hours) from the time and space dimensions. During the model construction and evaluation, we make use of large open datasets from real-world bike-sharing systems (CitiBike in New York, GoBike in San Francisco, and BIXI in Montreal) along with the corresponding public transit information. The developed two-dimensional traffic model, as well as the mined bike-and-ride behaviors, can provide great help to the deployment of next-generation intelligent transportation systems.
Keywords: riding pattern mining, bike-sharing system, public transportation, bike-and-ride behavior
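A minimal sketch of the temporal side of such an analysis, peak arrival hours at stations near transit stops, is shown below; the trip records and column names are fabricated stand-ins, since each open dataset (CitiBike, GoBike, BIXI) uses its own schema.

```python
# Hourly arrival counts at stations flagged as near public transit (toy records).
import pandas as pd

trips = pd.DataFrame({
    "end_time": pd.to_datetime([
        "2016-06-01 08:05", "2016-06-01 08:40", "2016-06-01 09:10",
        "2016-06-01 17:50", "2016-06-01 18:20", "2016-06-02 08:15",
    ]),
    "end_station_near_transit": [True, True, False, True, True, True],  # assumed flag
})

arrivals = (trips[trips["end_station_near_transit"]]
            .assign(hour=lambda d: d["end_time"].dt.hour)
            .groupby("hour").size())
print(arrivals.sort_values(ascending=False).head(3))   # candidate peak hours
```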
Procedia PDF Downloads 781