Search results for: Consultative Committee for Space Data Systems (CCSDS) standards

30212 Identifying Environmental Adaptive Genetic Loci in Calotropis Procera (Estabragh): Population Genetics and Landscape Genetic Analyses

Authors: Masoud Sheidaei, Mohammad-Reza Kordasti, Fahimeh Koohdar

Abstract:

Calotropis procera (Aiton) W.T.Aiton (Apocynaceae) is an economically and medicinally important evergreen, perennial shrub growing in arid and semi-arid climates; it can tolerate very low annual rainfall (150 mm) and a dry season. The plant can also tolerate a temperature range of 20 to 30°C but is not frost tolerant. This species prefers free-draining sandy soils but can also grow in alkaline and saline soils. It is found at a range of altitudes, from exposed coastal sites to medium elevations up to 1300 m. Due to its morpho-physiological adaptations and its ability to tolerate various abiotic stresses, this taxon can compete with desirable pasture species and form dense thickets that interfere with stock management, particularly mustering activities. Calotropis procera grows only in the southern part of Iran, where it comprises a limited number of geographical populations. We used different population genetics and landscape genetic analyses to produce data on geographical populations of C. procera based on a molecular genetic study using SCoT molecular markers. First, we used spatial principal component analysis (sPCA), as it can analyze data in a reduced space and can be used for co-dominant markers as well as presence/absence data, as is the case for SCoT molecular markers. This method also carries out Moran's I and Mantel tests to reveal spatial autocorrelation and test for the occurrence of isolation by distance (IBD). We also performed random forest analysis to identify the importance of spatial and geographical variables on genetic diversity. Moreover, we used both RDA (redundancy analysis) and LFMM (latent factor mixed model) to identify the genetic loci significantly associated with geographical variables. A niche modelling analysis was carried out to predict the present potential distribution area of these plants and the area projected for the year 2050. The results obtained will be discussed in this paper.
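
The Mantel test mentioned above correlates a genetic distance matrix with a geographic one to test for IBD. As a hedged, minimal illustration of that step (the study presumably used dedicated landscape-genetics software; the matrices below are synthetic), a permutation-based version in Python:

```python
import numpy as np

def mantel_test(dist_a, dist_b, n_perm=999, seed=0):
    """Permutation-based Mantel test between two square distance matrices."""
    rng = np.random.default_rng(seed)
    n = dist_a.shape[0]
    iu = np.triu_indices(n, k=1)          # use upper-triangle entries only
    r_obs = np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r_perm = np.corrcoef(dist_a[p][:, p][iu], dist_b[iu])[0, 1]
        if r_perm >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
xy = rng.uniform(size=(20, 2))                      # fake sampling coordinates
geo = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
gen = geo + rng.normal(0, 0.1, geo.shape)           # fake IBD-like genetic distances
gen = (gen + gen.T) / 2                             # keep the matrix symmetric
print(mantel_test(gen, geo))                        # high r, small p expected
```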

Keywords: population genetics, landscape genetics, Calotropis procera, niche modeling, SCoT markers

Procedia PDF Downloads 77
30211 Expert-Driving-Criteria Based on Fuzzy Logic Approach for Intelligent Driving Diagnosis

Authors: Andrés C. Cuervo Pinilla, Christian G. Quintero M., Chinthaka Premachandra

Abstract:

This paper considers the diagnosis of people's driving skills under real driving conditions. In that sense, this research presents an approach that uses GPS signals, which have a direct correlation with driving maneuvers. In addition, a novel expert-driving-criteria approximation using fuzzy logic is presented, which seeks to analyze GPS signals in order to issue an intelligent driving diagnosis. Based on the above, this work first presents the intelligent driving diagnosis system approach in terms of its characteristic properties, explaining in detail significant considerations about how an expert-driving-criteria approximation must be developed. In the next section, the implementation of the developed system based on the proposed fuzzy logic approach is explained. Here, a proposed set of rules is presented, corresponding to a quantitative abstraction of traffic laws and safe driving techniques, seeking to approach an expert-driving-criteria approximation. Experimental testing has been performed in real driving conditions. The testing results show that the intelligent driving diagnosis system qualifies a driver's performance quantitatively with a high degree of reliability.
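
The abstract does not give the paper's actual rule base, so the sketch below is only a hedged illustration of how GPS-derived indicators could feed a small fuzzy rule set; the two input indicators, all membership breakpoints, and the 0-10 score scale are assumptions:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def diagnose(speed_over_limit, harsh_events_per_km):
    # Input memberships; all breakpoints are hypothetical, not from the paper.
    low = tri(speed_over_limit, -5, 0, 5)        # km/h above the posted limit
    med = tri(speed_over_limit, 0, 7, 15)
    high = tri(speed_over_limit, 7, 15, 30)
    rare = tri(harsh_events_per_km, -0.5, 0, 1)  # harsh accelerations per km
    freq = tri(harsh_events_per_km, 0.5, 2, 4)
    # Rule base (AND = min) mapping to scores on an assumed 0-10 scale.
    rules = [(min(low, rare), 9.0),    # lawful speed, smooth driving
             (min(med, rare), 6.0),
             (min(low, freq), 5.0),
             (min(med, freq), 4.0),
             (min(high, freq), 2.0)]   # speeding plus harsh maneuvers
    w = sum(mu for mu, _ in rules)
    # Weighted-average (Sugeno-style) defuzzification into a single score
    return sum(mu * s for mu, s in rules) / w if w else 5.0

print(diagnose(speed_over_limit=8.0, harsh_events_per_km=1.2))
```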

Keywords: driver support systems, intelligent transportation systems, fuzzy logic, real time data processing

Procedia PDF Downloads 493
30210 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text that is impressed onto paper, providing evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this work, we have concentrated on implementing watermarking for images. The main consideration for any watermarking scheme is its robustness to various attacks.
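
A minimal sketch of one common DWT-DCT embedding recipe, assuming PyWavelets and SciPy are available: take a one-level Haar DWT, apply a 2-D DCT to the LL subband, and additively perturb low-frequency coefficients by a strength alpha. The subband choice, alpha, and coefficient indexing are assumptions for illustration, not the paper's exact scheme:

```python
import numpy as np
import pywt
from scipy.fftpack import dct, idct

def dct2(a):
    return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(a):
    return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def embed(image, bits, alpha=8.0):
    """Embed watermark bits additively in DCT coefficients of the DWT LL subband."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), 'haar')
    C = dct2(LL)
    flat = C.ravel()
    for i, bit in enumerate(bits):
        flat[i + 1] += alpha if bit else -alpha   # skip the DC term at index 0
    return pywt.idwt2((idct2(flat.reshape(C.shape)), (LH, HL, HH)), 'haar')

img = np.random.randint(0, 256, (64, 64))     # stand-in for a grayscale image
marked = embed(img, bits=[1, 0, 1, 1, 0])
```

Extraction would repeat the same transforms on the marked image and compare coefficient signs against the original, which is why robustness to attacks that disturb these coefficients is the key design criterion.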

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 409
30209 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
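
As an illustration of the separation of concerns the architecture aims for, below is a minimal, AWS-agnostic sketch of the batch pipeline's stage boundaries; the stage names mirror the abstract, but the context object, paths, and print-stub implementations are purely hypothetical (a real deployment would express each stage as infrastructure-as-code and sequence them with an orchestrator such as Step Functions):

```python
from dataclasses import dataclass

@dataclass
class PipelineContext:
    raw_path: str
    feature_path: str
    model_path: str
    output_path: str

def source_data(ctx):      # owned by data engineering: pulls raw data into storage
    print(f"sourcing raw data into {ctx.raw_path}")

def build_features(ctx):   # owned by data engineering: validated, versioned features
    print(f"writing features to {ctx.feature_path}")

def train_model(ctx):      # owned by scientists: the only stage they maintain
    print(f"training model, artifact saved to {ctx.model_path}")

def vend_output(ctx):      # owned by data engineering: batch scoring into a store
    print(f"vending predictions to {ctx.output_path}")

def run_batch_pipeline(ctx):
    for stage in (source_data, build_features, train_model, vend_output):
        stage(ctx)         # an orchestrator would sequence and monitor these

run_batch_pipeline(PipelineContext("s3://raw", "s3://features",
                                   "s3://models", "s3://outputs"))
```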

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 47
30208 An Investigation of the Determinants of Discount Rate Manipulation in Swedish and Finnish Listed Companies

Authors: Fredrik Hartwig, Peter Lindberg

Abstract:

In 2004, the International Accounting Standards Board (IASB) issued new accounting standards for impairment testing of goodwill. IFRS 3 Business Combinations and IAS 36 Impairment of Assets prohibited amortization of acquired goodwill and instead required companies to test goodwill for impairment annually, or more often if necessary. The goodwill impairment test is based on management's judgement and estimations, making the impairment-only approach subjective and unreliable. Management can use this discretion opportunistically by managing goodwill impairments. The IASB's remedy to the reliability problem has been to demand transparent financial reports. IAS 36 paragraph 134 requires detailed disclosures regarding the impairment test in order to make potentially unreasonable assumptions and estimations visible. The disclosure requirements should thus (in theory) make it more difficult for management to 'choose' assumptions and estimations that suit an agenda. Whether the requirement to disclose detailed information regarding the impairment test leads to less opportunism is, however, an empirical question. This work analyses whether one of the required disclosures in IAS 36 paragraph 134, the reported discount rate, differs from an independently estimated risk-adjusted discount rate. Estimates of discount rates that are either lower or higher than the independently estimated discount rate are here defined as opportunism. In the former case, i.e. when the reported discount rate is lower, the objective may be to avoid profit-reducing impairment charges. In the latter case, i.e. when the reported discount rate is higher, the objective may be to reduce profits or take 'big baths'. This paper differs in one important respect from previous similar studies, the majority of which are based on purely descriptive statistics: we use multivariate regression analysis to analyze which factors affect deviations between disclosed discount rates and independently estimated discount rates. The sample consists of Swedish and Finnish listed companies, chosen since the accounting oversight bodies differ between the two countries. The results show that discount rate deviations in Swedish and Finnish listed companies are significantly related to accounting oversight, size, and industry, but not to financial risk, business risk, and goodwill intensity.
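
To make the regression setup concrete, the sketch below estimates an OLS model of discount-rate deviations on a few of the factors named above using statsmodels; the data are synthetic and the variable encodings are assumptions, purely to illustrate the multivariate approach (the paper's actual specification and estimates are not reproduced here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.integers(0, 2, n),     # oversight regime (0/1 by country), assumption
    rng.normal(0, 1, n),       # firm size (standardized log assets), assumption
    rng.integers(0, 5, n),     # industry code, assumption
    rng.normal(0, 1, n),       # financial risk proxy, assumption
])
# deviation = disclosed discount rate - independently estimated rate (fabricated)
deviation = -0.4 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.5, n)

model = sm.OLS(deviation, sm.add_constant(X)).fit()
print(model.summary())
```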

Keywords: discount rate, manipulation, goodwill impairment test, disclosures

Procedia PDF Downloads 110
30207 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

Authors: Amir Hajian, Sepehr Damavandinejadmonfared

Abstract:

In this paper, the issue of dimensionality reduction is investigated in finger vein recognition systems using kernel principal component analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions which can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated. The dimension of the feature vector in PCA-based algorithms is important, especially when it comes to real-world applications and usage of such algorithms: a fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and extract the features from them. Then a classifier is applied to classify the data and make the final decision. We analyze KPCA (with polynomial, Gaussian, and Laplacian kernels) in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
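
A minimal sketch of the dimension sweep the paper describes, using scikit-learn's KernelPCA with polynomial and Gaussian (RBF) kernels and a 1-NN classifier as a stand-in evaluator; the data are random placeholders for finger vein feature vectors, and the candidate dimensions are assumptions (a Laplacian kernel would have to be supplied as a precomputed kernel matrix):

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for finger vein features: X (n_samples, n_pixels), y subject IDs
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 400))
y = rng.integers(0, 10, size=200)

for kernel in ["poly", "rbf"]:
    for d in [10, 20, 40, 80]:            # candidate feature-vector dimensions
        Z = KernelPCA(n_components=d, kernel=kernel).fit_transform(X)
        acc = cross_val_score(KNeighborsClassifier(1), Z, y, cv=5).mean()
        print(f"{kernel} kernel, dimension {d}: accuracy {acc:.3f}")
```

On real vein features, the accuracy-versus-dimension curve typically rises and then flattens, and the knee of that curve is the "optimal feature extraction dimension" the paper investigates.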

Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)

Procedia PDF Downloads 347
30206 Improving Activity Recognition Classification of Repetitious Beginner Swimming Using a 2-Step Peak/Valley Segmentation Method with Smoothing and Resampling for Machine Learning

Authors: Larry Powell, Seth Polsley, Drew Casey, Tracy Hammond

Abstract:

Human activity recognition (HAR) systems have shown positive performance when recognizing repetitive activities like walking, running, and sleeping. Water-based activities are a reasonably new area for activity recognition. However, water-based activity recognition has largely focused on supporting the elite and competitive swimming population, which already has strong coordination and proper form. Beginner swimmers are not perfect, and activity recognition needs to support their individual motions. Activity recognition algorithms are traditionally built around short segments of timed sensor data. Using a time-window input can cause performance issues in the machine learning model: the window's size can be too small or too large, requiring careful tuning and precise data segmentation. In this work, we present a method that uses a time window as the initial segmentation, then separates the data based on the change in the sensor value. Our system uses a multi-phase segmentation method that pulls all peaks and valleys for each axis of an accelerometer placed on the swimmer's lower back. This results in high recognition performance using leave-one-subject-out validation on our study with 20 beginner swimmers, with our model optimized on our final dataset achieving an F-score of 0.95.
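
A minimal sketch of the two-step idea on one accelerometer axis, assuming SciPy is available: smooth the signal, then cut it at the peaks and valleys to obtain variable-length segments. The smoothing window, minimum peak distance, and synthetic signal are assumptions; the paper's resampling step for fixed-length model input is omitted:

```python
import numpy as np
from scipy.signal import find_peaks

def segment_axis(signal, window=5, min_distance=10):
    """Two-step segmentation: smooth, then split at peaks and valleys."""
    kernel = np.ones(window) / window
    smooth = np.convolve(signal, kernel, mode="same")        # step 1: smoothing
    peaks, _ = find_peaks(smooth, distance=min_distance)     # step 2: peaks...
    valleys, _ = find_peaks(-smooth, distance=min_distance)  # ...and valleys
    cuts = np.sort(np.concatenate([peaks, valleys]))
    return [smooth[a:b] for a, b in zip(cuts[:-1], cuts[1:])]

t = np.linspace(0, 10, 500)    # ~0.8 Hz stroke-like signal plus sensor noise
accel_z = np.sin(2 * np.pi * 0.8 * t) + 0.2 * np.random.default_rng(0).normal(size=t.size)
segments = segment_axis(accel_z)
print(f"{len(segments)} variable-length segments extracted")
```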

Keywords: time window, peak/valley segmentation, feature extraction, beginner swimming, activity recognition

Procedia PDF Downloads 106
30205 Tracking Performance Evaluation of Robust Back-Stepping Control Design for a ‎Nonlinear Electro-Hydraulic Servo System

Authors: Maria Ahmadnezhad, Mohammad Reza Soltanpour

Abstract:

Electrohydraulic servo (EHS) systems have been used in industry in a wide range of applications. Their dynamics ‎are highly nonlinear and also exhibit a large extent of model uncertainties and external disturbances. In this ‎work, a robust back-stepping control (RBSC) scheme is proposed to overcome the problem of ‎disturbances and system uncertainties effectively and to improve the tracking performance of EHS ‎systems. In order to implement the proposed control scheme, the system uncertainties in EHS systems ‎are considered as the total leakage coefficient and the effective oil volume. In addition, in order to obtain the ‎virtual controls for stabilizing the system, the update rule for the system uncertainty term is induced by the ‎Lyapunov control function (LCF). To verify the performance and robustness of the proposed control ‎system, computer simulation using MATLAB/Simulink software is ‎executed. From the computer simulation, it was found that the RBSC system produces the desired ‎tracking performance and is robust to the disturbances and system uncertainties of EHS systems.‎
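
The abstract does not give the EHS model, so as a hedged illustration of the back-stepping construction itself, the sketch below stabilizes a plain double integrator: a virtual control is designed for the first state, and the actual control cancels the resulting error dynamics so that a quadratic Lyapunov function decreases. Gains and the toy plant are assumptions:

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 2.0, 2.0   # design gains (assumed)

def closed_loop(t, x):
    x1, x2 = x
    z1 = x1
    alpha = -k1 * z1          # virtual control stabilizing the x1 subsystem
    z2 = x2 - alpha           # error between x2 and its virtual control
    # u chosen so that V = (z1^2 + z2^2)/2 gives Vdot = -k1*z1^2 - k2*z2^2
    u = -z1 - k2 * z2 - k1 * (z2 - k1 * z1)
    return [x2, u]            # plant: x1' = x2, x2' = u

sol = solve_ivp(closed_loop, (0, 5), [1.0, 0.0], dense_output=True)
print("final state:", sol.y[:, -1])   # converges toward the origin
```

The robust variant in the paper adds an update rule for the uncertain terms (leakage coefficient, oil volume) inside the same Lyapunov argument.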

Keywords: electro hydraulic servo system, back-stepping control, robust back-stepping control, Lyapunov redesign

Procedia PDF Downloads 279
30204 An Analysis System for Integrating High-Throughput Transcript Abundance Data with Metabolic Pathways in Green Algae

Authors: Han-Qin Zheng, Yi-Fan Chiang-Hsieh, Chia-Hung Chien, Wen-Chi Chang

Abstract:

As the most important non-vascular plants, algae have many research applications, owing to their high species diversity, use as biofuel sources, adsorption of heavy metals, and, following processing, use in health supplements. With the increasing availability of next-generation sequencing (NGS) data for algae genomes and transcriptomes, an integrated resource for retrieving gene expression data and metabolic pathways is essential for functional analysis and systems biology in algae. However, gene expression profiles and biological pathways are displayed separately in current resources, making it impossible to search current databases directly to identify cellular response mechanisms. Therefore, this work develops a novel AlgaePath database to efficiently retrieve gene expression profiles under various conditions across numerous metabolic pathways. AlgaePath, a web-based database, integrates gene information, biological pathways, and NGS datasets in Chlamydomonas reinhardtii and Neodesmus sp. UTEX 2219-4. Users can identify gene expression profiles and pathway information by using five query pages (i.e., Gene Search, Pathway Search, Differentially Expressed Genes (DEGs) Search, Gene Group Analysis, and Co-Expression Analysis). The gene expression data of 45 and 4 samples can be obtained directly on pathway maps in C. reinhardtii and Neodesmus sp. UTEX 2219-4, respectively. Genes that are differentially expressed between two conditions can be identified in Folds Search. Furthermore, the Gene Group Analysis of AlgaePath includes pathway enrichment analysis and can easily compare the gene expression profiles of functionally related genes in a map. Finally, Co-Expression Analysis provides co-expressed transcripts of a target gene. The analysis results provide a valuable reference for designing further experiments and elucidating critical mechanisms from high-throughput data. More than an effective interface to clarify transcript response mechanisms in different metabolic pathways under various conditions, AlgaePath is also a data mining system for identifying critical mechanisms based on high-throughput sequencing.

Keywords: next-generation sequencing (NGS), algae, transcriptome, metabolic pathway, co-expression

Procedia PDF Downloads 392
30203 Control of Base-Isolated Benchmark Using Combined Control Strategy with Fuzzy Algorithm Subjected to Near-Field Earthquakes

Authors: Hashem Shariatmadar, Mozhgansadat Momtazdargahi

Abstract:

The purpose of structural control against earthquakes is to dissipate the earthquake input energy to the structure and reduce the plastic deformation of structural members. There are different methods of structural control for reducing the structural response: active, semi-active, passive, and hybrid. In this paper, two different combined control systems are used: the first comprises a base isolator and multi-tuned mass dampers (BI & MTMD), and the second combines a hybrid base isolator and multi-tuned mass dampers (HBI & MTMD), for controlling an eight-story isolated benchmark steel structure. The active control force of the hybrid isolator is estimated by fuzzy logic algorithms. The influences of the combined systems on the responses of the benchmark structure under two near-field earthquakes (Newhall and El Centro) are evaluated by nonlinear dynamic time-history analysis. Combined control systems consisting of passive or active systems installed in parallel to base-isolation bearings have the capability of significantly reducing response quantities (relative and absolute displacement) of base-isolated structures. Therefore, in the design and control of irregular isolated structures using the proposed control systems, structural demands (relative and absolute displacement, etc.) in each direction must be considered separately.

Keywords: base-isolated benchmark structure, multi-tuned mass dampers, hybrid isolators, near-field earthquake, fuzzy algorithm

Procedia PDF Downloads 282
30202 Technological and Economic Investigation of Concentrated Photovoltaic and Thermal Systems: A Case Study of Iran

Authors: Moloud Torkandam

Abstract:

Cities must be designed and built in a way that minimizes their need for fossil fuel. The necessity of this principle is evident in the construction practices of previous eras; perhaps only because of the great diversity of materials and new technologies in the contemporary era has it been forgotten in buildings. The question of optimizing energy consumption in buildings has attracted a great deal of attention in many countries, and in this way they have been able to cut energy consumption by up to 30 percent. Energy consumption in Iran is remarkably higher than global standards, and the most important reason is the undesirable state of buildings from the standpoint of energy consumption. In addition to protecting natural and fuel resources for future generations, reducing the use of fossil energy may also bring about desirable outcomes such as a decrease in greenhouse gases (whose emissions cause global warming, the melting of polar ice, the rise in sea level, and climatic changes of the planet), a decrease in the destructive effects of contamination in residential complexes and especially urban environments, and preparation for national self-sufficiency, independence, and the preservation of national capital. This research recognizes that in this modern day and age, living sustainably is a prerequisite for ensuring a bright future and a high quality of life. In acquiring this living standard, we will maintain the functions and ability of our environment to serve and sustain our livelihoods. Electricity is now an integral part of modern life, a basic necessity. In the provision of electricity, we are committed to respecting the environment by reducing the use of fossil fuels through proven technologies that use local renewable and natural resources as energy sources. This research therefore argues that it is necessary to work on different types of energy production, such as solar energy and CPVT systems.

Keywords: energy, photovoltaic, thermal system, solar energy, CPVT

Procedia PDF Downloads 65
30201 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these data representations reduce the cost for the correct correspondence relative to other possible matches.
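
As a concrete illustration of the window-based cost such comparisons build on, here is a minimal sum-of-absolute-differences (SAD) sketch on grayscale data; swapping the arrays for individual colour channels (or other representations) is precisely what the comparison above varies. The images, window size, and disparity range are synthetic assumptions:

```python
import numpy as np

def sad_cost(left, right, row, col, disparity, half=3):
    """Sum-of-absolute-differences cost of one candidate match at a disparity."""
    lw = left[row - half:row + half + 1, col - half:col + half + 1]
    rw = right[row - half:row + half + 1,
               col - disparity - half:col - disparity + half + 1]
    return np.abs(lw.astype(float) - rw.astype(float)).sum()

def best_disparity(left, right, row, col, max_disp=16):
    costs = [sad_cost(left, right, row, col, d) for d in range(max_disp)]
    return int(np.argmin(costs))     # winner-take-all over the cost curve

rng = np.random.default_rng(0)
right_img = rng.integers(0, 256, (64, 64))
left_img = np.roll(right_img, 5, axis=1)            # synthetic 5-pixel disparity
print(best_disparity(left_img, right_img, 32, 40))  # expected: 5
```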

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 356
30200 Simulation of a Control System for an Adaptive Suspension System for Passenger Vehicles

Authors: S. Gokul Prassad, S. Aakash, K. Malar Mohan

Abstract:

In the process of coping with the challenges faced by the automobile industry in providing ride comfort, electronics and control systems play a vital role. The control systems in an automobile monitor various parameters and control the performance of the subsystems, thereby providing better handling characteristics. The automobile suspension system is one of the main systems that ensure the safety, stability, and comfort of the passengers. The system is solely responsible for isolating the entire automobile from harmful road vibrations. Thus, integrating control systems into the automobile suspension system enhances its performance. The diverse road conditions of India demand an efficient suspension system which can provide optimum ride comfort in all road conditions. For any passenger vehicle, the design of the suspension system plays a very important role in assuring ride comfort and handling characteristics. In recent years, the air suspension system has been preferred over conventional suspension systems to ensure ride comfort. In this article, the ride comfort of the adaptive suspension system is compared with that of the passive suspension system. The schema is created in the MATLAB/Simulink environment. The system is controlled by a proportional-integral-derivative (PID) controller. Tuning of the controller was done with the particle swarm optimization (PSO) algorithm, since it suited the problem best; the Ziegler-Nichols and modified Ziegler-Nichols tuning methods were also tried and compared. Both the static and dynamic responses of the systems were calculated. Various random road profiles as per the ISO 8608 standard are modelled in the MATLAB environment and their responses plotted. Open-loop and closed-loop responses to the random roads, various bumps, and potholes are also plotted. The simulation results of the proposed design are compared with the available passive suspension system. The obtained results show that the proposed adaptive suspension system is efficient in controlling the maximum overshoot, and the settling time of the system is substantially reduced.
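
As a hedged sketch of the PSO-based PID tuning step (not the paper's suspension model), the code below tunes (kp, ki, kd) for a simple second-order plant by minimizing the integral of squared error of a step response; the plant parameters, PSO coefficients, and bounds are all assumptions:

```python
import numpy as np

# Cost: integral of squared error of the unit-step response of a PID loop around a
# generic damped second-order plant (wn = 5 rad/s, zeta = 0.3 assumed), Euler-integrated.
def ise_cost(gains, dt=2e-3, t_end=2.0, wn=5.0, zeta=0.3):
    kp, ki, kd = gains
    x = v = integ = 0.0
    prev_err, cost = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - x
        integ += err * dt
        u = kp * err + ki * integ + kd * (err - prev_err) / dt
        v += (u - 2 * zeta * wn * v - wn**2 * x) * dt
        x += v * dt
        cost += err * err * dt
        prev_err = err
        if abs(x) > 1e6:                  # unstable gain set: bail out, large cost
            return 1e12
    return cost

def pso_tune(n=15, iters=30, lo=0.0, hi=50.0, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, (n, 3))
    vel = np.zeros_like(pos)
    pbest, pcost = pos.copy(), np.array([ise_cost(p) for p in pos])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, 3))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([ise_cost(p) for p in pos])
        better = cost < pcost
        pbest[better], pcost[better] = pos[better], cost[better]
        gbest = pbest[pcost.argmin()].copy()
    return gbest

print("tuned (kp, ki, kd):", pso_tune())
```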

Keywords: automobile suspension, MATLAB, control system, PID, PSO

Procedia PDF Downloads 279
30199 Visualisation in Health Communication: Taking Weibo Interaction in COVID-19 as an Example

Authors: Zicheng Zhang, Linli Zhang

Abstract:

As China's biggest social media platform, Weibo has taken on essential health communication responsibilities during the pandemic. This research takes 105 posters from 15 health-related official Weibo accounts as its analysis objects to explore COVID-19 health information communication and visualisation. First, the interaction between audiences and Weibo, including forwarding, comments, and likes, is statistically analysed. Comments about the information design are extracted manually, and sentiment analysis is then carried out to assess audiences' views of the posters' design. Forwarding and comments are quantified as an attention index, complementing the number of likes. In addition, this study designed an evaluation scale based on the Health Literacy Resource standards of the Centers for Medicare & Medicaid Services (US), and designers scored all selected posters one by one. Finally, combining the data from the two parts, the study concluded that: 1. to a certain extent, people think that the posters do not deliver substantive and practical information; 2. non-knowledge posters (i.e., cartoon posters), such as the 'Go, Wuhan' poster, gained more forwarding and likes; 3. COVID-19 posters remain mainly picture-oriented, mostly encouraging people to overcome difficulties; 4. posters for pandemic prevention usually contain more text and fewer illustrations, and do not clearly reflect cultural differences. In conclusion, health communication usually involves a great deal of professional knowledge, so visualising that knowledge in an accessible way for the general public is challenging. The posters analysed still suffer from a lack of effective communication, superficial design, and insufficient content accessibility.

Keywords: Weibo, visualisation, COVID-19 posters, poster design

Procedia PDF Downloads 111
30198 Fractal Analysis of Some Bifurcations of Discrete Dynamical Systems in Higher Dimensions

Authors: Lana Horvat Dmitrović

Abstract:

The main purpose of this paper is to study the box dimension as a fractal property of bifurcations of discrete dynamical systems in higher dimensions. The paper contains a fractal analysis of the orbits near hyperbolic and non-hyperbolic fixed points in discrete dynamical systems. It is already known that in the one-dimensional case the orbit near a hyperbolic fixed point has box dimension equal to zero, while the orbit near a non-hyperbolic fixed point has strictly positive box dimension, which is connected to the non-degeneracy condition of the corresponding bifurcation. One of the main results in this paper is the generalisation of results about box dimension near hyperbolic and non-hyperbolic fixed points to higher dimensions. In the process of determining box dimension, the restriction of systems to stable, unstable, and center manifolds, the Lipschitz property of box dimension, and the notion of projective box dimension are used. The analysis of bifurcations in higher dimensions with one multiplier on the unit circle is done by using the normal forms on one-dimensional center manifolds. This specific change in the box dimension of an orbit at the moment of bifurcation has already been explored for some bifurcations in one and two dimensions, where it was shown that specific values of box dimension are connected to appropriate bifurcations such as fold, flip, cusp, or Neimark-Sacker bifurcations. This paper further explores this connection between box dimension as a fractal property and some specific bifurcations in higher dimensions, such as fold-flip and flip-Neimark-Sacker. Furthermore, the application of the results to the unit-time map of a continuous dynamical system near hyperbolic and non-hyperbolic singularities is presented. In that way, box dimensions which are specific to certain bifurcations of continuous systems can be obtained. The approach to bifurcation analysis using box dimension as a specific fractal property of orbits can lead to a better understanding of the bifurcation phenomenon. It could also be useful in detecting the existence or nonexistence of bifurcations of discrete and continuous dynamical systems.
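
For reference, the box (box-counting) dimension used throughout admits the standard textbook definition (stated here for orientation, not quoted from the paper itself):

```latex
\dim_B S \;=\; \lim_{\varepsilon \to 0^{+}} \frac{\log N_{\varepsilon}(S)}{\log (1/\varepsilon)},
```

where N_ε(S) is the minimal number of boxes of side ε needed to cover the bounded set S (lower and upper box dimensions via lim inf and lim sup when the limit does not exist). An orbit converging geometrically to a hyperbolic fixed point can be covered efficiently enough that this limit is zero, whereas the slower, power-law convergence near a non-hyperbolic fixed point yields a strictly positive value, matching the dichotomy described above.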

Keywords: bifurcation, box dimension, invariant manifold, orbit near fixed point

Procedia PDF Downloads 233
30197 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing and noise data. When applying this data mining tool in real applications, running speed is important; the software employs table look-up techniques in its implementation to achieve reasonable running speed, based on performance testing results. We added several advanced features for application to an industrial chip design.
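
The report formats of commercial timing tools are proprietary, so the sketch below invents a minimal "path slack arrival" line format purely to illustrate the extract-filter-render-to-HTML-table flow the paper describes; the mined fields and the slack threshold are assumptions:

```python
import re

# Hypothetical report excerpt: one line per timing path (name, slack, arrival)
report = """\
path_1 -0.12 3.45
path_2 0.08 3.10
path_3 -0.31 3.88
"""

rows = [re.split(r"\s+", line.strip()) for line in report.strip().splitlines()]
critical = [r for r in rows if float(r[1]) < 0]       # keep violating paths only

cells = "".join(f"<tr><td>{p}</td><td>{s}</td><td>{a}</td></tr>"
                for p, s, a in critical)
html = ("<table border='1'><tr><th>Path</th><th>Slack (ns)</th>"
        f"<th>Arrival (ns)</th></tr>{cells}</table>")
print(html)
```

A dictionary-based look-up over pre-parsed rows (rather than rescanning the raw report) is what gives the table look-up approach its speed advantage during repeated design iterations.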

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 235
30196 Behaviour of Non-local Correlations and Quantum Information Theoretic Measures in Frustrated Molecular Wheels

Authors: Amit Tribedi

Abstract:

Genuine quantumness present in quantum systems is the resource for implementing quantum information and computation protocols which can outperform their classical counterparts. These quantumness measures encompass non-local ones known as quantum entanglement (QE) and quantum information theoretic (QIT) ones, e.g., quantum discord (QD). In this paper, some well-known measures of QE and QD in some wheel-like frustrated molecular magnetic systems have been studied. One of the systems has already been synthesized using coordination chemistry, and the other is hypothetical; in both, the dominant interaction is the spin-spin exchange interaction. Exact analytical methods and exact numerical diagonalization methods have been used. The behaviour of the correlations and the QIT measures reveals some counter-intuitive, non-trivial features, such as the non-monotonicity of quantum correlations with temperature and the persistence of multipartite entanglement over bipartite entanglement. The measures, being operational ones, can be used to realize the resource of quantumness in experiments.

Keywords: 0D magnets, discord, entanglement, frustration

Procedia PDF Downloads 211
30195 Feasibility Study of a Solar Solid Desiccant Cooling System in Algerian Areas

Authors: N. Hatraf, L. Merabeti, M. Abbas

Abstract:

Interest in air conditioning using renewable energies is increasing. Thermal energy produced from solar energy can be transformed into useful cooling and heating through thermochemical or thermophysical processes by using a thermally activated energy conversion system. Solid desiccant conditioning systems can represent a reliable alternative compared with other thermal cooling technologies. Their basic characteristics are the capability to regulate both temperature and humidity of the conditioned space on one side, and their potential for electrical energy saving on the other. The ambient air contains so much water that very high dehumidification rates are required. For continuous dehumidification of the process air, the water adsorbed on the desiccant material has to be removed, which is done by allowing hot air to flow through the desiccant material (regeneration). Basically, a solid desiccant cooling system transfers moisture from the inlet air to the silica gel by using two processes: the adsorption process and the regeneration process. The desiccant wheel, the most important device in the system, adsorbs moisture from the incoming air onto the desiccant material, in this case silica gel; the air then exchanges heat in a rotary heat exchanger, after which it passes through a humidifier to reach the required humidity before entering the space. The main aim of this paper is to study how the dehumidification rate, the regeneration temperature, and many other factors influence the efficiency of a solid desiccant system by using the TRNSYS software.

Keywords: desiccation, dehumidification, TRNSYS, efficiency

Procedia PDF Downloads 404
30194 Transportation and Urban Land-Use System for the Sustainability of Cities, a Case Study of Muscat

Authors: Bader Eddin Al Asali, N. Srinivasa Reddy

Abstract:

Cities are dynamic in nature and are characterized by concentrations of people, infrastructure, services, and markets, which offer opportunities for production and consumption. Often, growth and development in urban areas is not systematic and is directed by a number of factors such as natural growth, land prices, housing availability, job locations (the central business district, CBD), transportation routes, distribution of resources, geographical boundaries, and administrative policies. One-sided spatial and geographical development in cities leads to an unequal spatial distribution of population and jobs, resulting in high transportation activity. City development can be measured by parameters such as urban size, urban form, urban shape, and urban structure. Urban size is defined by the population of the city, and urban form is the location and size of economic activity (CBD) over the geographical space. Urban shape is the geometrical shape of the city over which the population and economic activity are distributed, and urban structure is the transport network within which the population and activity centers are connected by a hierarchy of roads. Among urban land-use systems, transportation plays a significant role and is one of the largest energy-consuming sectors. Transportation interaction among land uses is measured in passenger-km and mean trip length, and is often used as a proxy for energy consumption in the transportation sector. Among the trips generated in cities, work trips constitute more than 70 percent. Work trips originate from the place of residence and are destined for the place of employment. To understand the role of urban parameters in transportation interaction, theoretical cities of different sizes and urban specifications are generated through a building-block exercise using a specially developed interactive C++ programme, and land-use transportation modeling is carried out; a minimal sketch of this modeling step is given below. The land-use transportation modeling exercise helps in understanding the role of urban parameters and also in classifying cities by their urban form, structure, and shape. Muscat, the capital city of Oman, which underwent rapid urbanization over the last four decades, is taken as a case study for classification. A pilot survey was also carried out to capture urban travel characteristics. Analysis of land-use transportation modeling with field data classified Muscat as a linear city with a polycentric CBD. Conclusions are drawn and suggestions are given for policy-making for the sustainability of Muscat City.
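
Land-use transportation modeling of the kind referenced above typically distributes work trips between zones with a doubly constrained gravity model and reports the mean trip length; the sketch below shows that computation on a made-up three-zone example (the paper's own model is in C++; this illustration is in Python, and the zone totals, cost matrix, and deterrence parameter are all assumptions):

```python
import numpy as np

O = np.array([500.0, 300.0, 200.0])        # trips produced per zone (residences)
D = np.array([600.0, 250.0, 150.0])        # trips attracted per zone (jobs)
c = np.array([[2.0, 6.0, 9.0],
              [6.0, 3.0, 5.0],
              [9.0, 5.0, 2.0]])            # travel cost (km) between zones
f = np.exp(-0.3 * c)                       # deterrence function, beta = 0.3 assumed

A = np.ones(3)                             # balancing factors, solved iteratively
B = np.ones(3)
for _ in range(100):
    A = 1.0 / (f * (B * D)).sum(axis=1)
    B = 1.0 / (f.T * (A * O)).sum(axis=1)

T = (A * O)[:, None] * (B * D)[None, :] * f   # trip matrix T_ij
print("mean trip length (km):", (T * c).sum() / T.sum())
```

Running the same computation over cities of different size, form, shape, and structure is what lets the mean trip length (and passenger-km) serve as the energy-consumption proxy described above.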

Keywords: land-use transportation, transportation modeling, urban form, urban structure, urban rule parameters

Procedia PDF Downloads 254
30193 Characterization of the Groundwater Aquifers at El Sadat City by Joint Inversion of VES and TEM Data

Authors: Usama Massoud, Abeer A. Kenawy, El-Said A. Ragab, Abbas M. Abbas, Heba M. El-Kosery

Abstract:

Vertical electrical sounding (VES) and transient electromagnetic (TEM) surveys have been applied to characterize the groundwater aquifers at the El Sadat industrial area. El Sadat city is one of the most important industrial cities in Egypt. It was constructed more than three decades ago, about 80 km northwest of Cairo along the Cairo-Alexandria desert road. Groundwater is the main source of the water supplies required for domestic, municipal, and industrial activities in this area due to the lack of surface water sources, so it is important to maintain this vital resource in order to sustain the development plans of this city. In this study, VES and TEM data were identically measured at 24 stations along three profiles trending NE-SW with the elongation of the study area. The measuring points were arranged in a grid-like pattern with both inter-station spacing and line-line distance of about 2 km. After performing the necessary processing steps, the VES and TEM data sets were inverted individually to multi-layer models, followed by a joint inversion of both data sets. The joint inversion process succeeded in overcoming the model-equivalence problem encountered in the inversion of the individual data sets. The joint models were then used to construct a number of cross sections and contour maps showing the lateral and vertical distribution of the geo-electrical parameters in the subsurface medium. Interpretation of the obtained results and correlation with the available geological and hydrogeological information revealed two aquifer systems in the area. The shallow Pleistocene aquifer consists of sand and gravel saturated with fresh water and exhibits a large thickness exceeding 200 m. The deep Pliocene aquifer is composed of clay and sand and shows low resistivity values. The water-bearing layer of the Pleistocene aquifer and the upper surface of the Pliocene aquifer are continuous, and no structural features cut this continuity in the investigated area.

Keywords: El Sadat city, joint inversion, VES, TEM

Procedia PDF Downloads 354
30192 Data Presentation of Lane-Changing Event Trajectories Using the HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (highD), which consists of microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the highD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for processing these data. We analyze the involvement and relationship of the different variables for the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
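
The study itself uses R, but as a language-neutral illustration, here is a Python/pandas sketch of pulling lane-change events out of a highD-style tracks file by watching for changes in each vehicle's lane assignment; the column names follow the published highD schema but should be verified against the dataset documentation, and the one-second window at 25 fps is an assumption:

```python
import pandas as pd

tracks = pd.read_csv("01_tracks.csv")   # highD per-recording trajectory file

events = []
for vid, veh in tracks.sort_values("frame").groupby("id"):
    changed = veh["laneId"].diff().fillna(0) != 0   # lane assignment changed
    for frame in veh.loc[changed, "frame"]:
        window = veh[veh["frame"].between(frame - 25, frame)]  # ~1 s before
        events.append({
            "id": vid,
            "frame": frame,
            "mean_thw": window["thw"].mean(),        # time gap to the leader
            "mean_dhw": window["dhw"].mean(),        # distance to the leader
            "speed": window["xVelocity"].abs().mean(),
        })

print(pd.DataFrame(events).describe())
```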

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 242
30191 A Collaborative Learning Model in Engineering Science Based on a Cyber-Physical Production Line

Authors: Yosr Ghozzi

Abstract:

The Cyber-Physical Systems terminology has been well received by the industrial community and specifically appropriated in educational settings. Indeed, our latest educational activities are based on the development of experimental platforms on an industrial scale. We built a collaborative learning model following an international market study that led us to place ourselves at the heart of this technology. To align with these findings, a competency-based approach study was conducted, and program content was revised to reflect the project-based approach. Thus, this article deals with the development of educational devices according to a generated curriculum and specific educational activities, while respecting the repository of skills adopted for educational cyber-physical production systems and the laboratories that are compliant with and adapted to them. The implementation of these platforms was systematically carried out in the school's workshop spaces. The objective has been twofold, covering both research and teaching for the students in mechatronics and logistics of the electromechanical department. We act as trainers and industrial experts to involve students in the implementation of possible extension systems around multidisciplinary projects and to reconnect with industrial projects for better professional integration.

Keywords: education 4.0, competency-based learning, teaching factory, project-based learning, cyber-physical systems, industry 4.0

Procedia PDF Downloads 78
30190 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling calculation of monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semi-flex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was within 2%. In PDDs, the deviation increases at deeper depths. Similarly, profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 468
30189 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy.
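
As a heavily simplified, hedged illustration of the variable-fidelity idea (not the paper's gradient-enhanced recursive CoKriging), the sketch below fits a Gaussian process to cheap low-fidelity data, scales it, and fits a second GP to the residual at the few high-fidelity points; plain scikit-learn GPs stand in for Kriging, and the toy functions and sample sizes are assumptions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f_lo = lambda x: np.sin(8 * x)                    # cheap, biased model
f_hi = lambda x: np.sin(8 * x) + 0.3 * x          # expensive "truth"

X_lo = np.linspace(0, 1, 30)[:, None]             # many cheap samples
X_hi = np.linspace(0, 1, 6)[:, None]              # few expensive samples

gp_lo = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_lo, f_lo(X_lo.ravel()))
lo_at_hi = gp_lo.predict(X_hi)
y_hi = f_hi(X_hi.ravel())
rho = (lo_at_hi @ y_hi) / (lo_at_hi @ lo_at_hi)   # least-squares scaling factor
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_hi, y_hi - rho * lo_at_hi)

X_test = np.linspace(0, 1, 5)[:, None]
pred = rho * gp_lo.predict(X_test) + gp_delta.predict(X_test)
print(np.c_[pred, f_hi(X_test.ravel())])          # prediction vs. truth
```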

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 539
30188 A Qualitative Research of Online Fraud Decision-Making Process

Authors: Semire Yekta

Abstract:

Many online retailers set up manual review teams to overcome the limitations of automated online fraud detection systems. This study critically examines the strategies they adopt in their decision-making process to set apart fraudulent individuals from non-fraudulent online shoppers. The study uses a mixed-methods research approach: 32 in-depth interviews were conducted alongside participant observation and auto-ethnography. The study found that all steps of the decision-making process are significantly affected by a level of subjectivity, personal understandings of online fraud, preferences, and judgments, and not necessarily by objectively identifiable facts. Rather than clearly knowing who the fraudulent individuals are, the team members have to predict whether they think the customer might be a fraudster. Common strategies used are relying on the classifications and fraud scorings in the automated fraud detection systems, weighing up arguments for and against the customer before making a decision, using cancellation to test customers' reactions, and making use of personal experiences and 'the sixth sense'. The interaction in the team also plays a significant role, given that some decisions turn into a group discussion. While customer data represent the basis for the decision-making, fraud management teams frequently make use of Google Search and Google Maps to find additional information about the customer and verify whether the customer is the person they claim to be. On the one hand, this raises ethical concerns; on the other hand, using Google Street View on the address and area of the customer puts customers living in less privileged housing and areas at a higher risk of being classified as fraudsters. Phone validation is used as a final measure to decide for or against the customer when previous strategies and Google Search do not suffice. However, phone validation is also characterized by individuals' subjectivity, personal views, and judgments of the customer's reaction on the phone, which result in a final classification as genuine or fraudulent.

Keywords: online fraud, data mining, manual review, social construction

Procedia PDF Downloads 331
30187 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications.
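
A hedged sketch of the image-level augmentation stage: the paper's exact transformations and parameter ranges are not given, so the rotation, blur, noise, and illumination ranges below are assumptions chosen only to illustrate synthetic-to-real variation with PIL and NumPy:

```python
import numpy as np
from PIL import Image, ImageFilter

def augment(barcode_img, rng):
    """One random synthetic-to-real variant: rotate, blur, and add sensor noise."""
    img = barcode_img.rotate(rng.uniform(-15, 15), expand=True, fillcolor=255)
    img = img.filter(ImageFilter.GaussianBlur(radius=rng.uniform(0.0, 1.5)))
    arr = np.asarray(img, dtype=float)
    arr += rng.normal(0, 8, arr.shape)               # additive Gaussian noise
    arr *= rng.uniform(0.6, 1.1)                     # illumination change
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))

rng = np.random.default_rng(0)
base = Image.new("L", (200, 80), 255)                # stand-in for a rendered barcode
samples = [augment(base, rng) for _ in range(16)]    # training variants
```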

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 140
30186 An Analysis of Innovative Cloud Model as Bridging the Gap between Physical and Virtualized Business Environments: The Customer Perspective

Authors: Asim Majeed, Rehan Bhana, Mak Sharma, Rebecca Goode, Nizam Bolia, Mike Lloyd-Williams

Abstract:

This study aims to investigate and explore the underlying causes of the security concerns that emerged among customers when WHSmith transformed its physical system into a virtualized business model through NetSuite. NetSuite is essentially fully integrated software which helps transform a physical system into a virtualized business model. Modern organisations are moving away from traditional business models to cloud-based models, and consequently customers expect a better, more secure, and innovative environment. Security is the vital issue in this transition, and designers of interactive systems often misunderstand privacy or even ignore it, thus causing concerns for users. A content analysis approach is used to collect qualitative data from 120 online bloggers, including on Trustpilot. The results and findings provide useful new insights into the nature and form of the security concerns of online users after they have used the WHSmith services offered online through its website. The findings have theoretical as well as practical implications for the successful adoption of the cloud computing business-to-business model and similar systems.

Keywords: innovation, virtualization, cloud computing, organizational flexibility

Procedia PDF Downloads 369
30185 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for deriving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or work perfectly in some situations, such as fractured zones. Cutting-edge technologies have emerged to solve these issues and optimize the workflow. Internet of Things (IoT), Edge Computing, and Cloud Computing technologies (ECt and CCt, respectively) are among the newest and most widely used digital methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, covering quantities such as temperature, movement, pressure, or stress levels. Other benefits of IoT technologies include assessing structural integrity, especially for cap rocks within hydrocarbon systems, and rock mass behavior, supporting further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS), and improving safety risk management (SRM) and potential hazard identification (PHI). ECt can process, aggregate, and analyze data collected by IoT immediately, on a real-time scale, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. This state-of-the-art technology can therefore support autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in the mining and geotechnics industries). Moreover, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments; this feature is very important for environmental goals. More often than not, rock mechanical studies consist of different data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing large volumes of heterogeneous information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas. It is also a suitable means for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have enabled real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. By employing IoT, CCt, and ECt, underground operations have therefore experienced a significant boost, allowing timely and informed actions based on real-time data insights. The successful implementation of IoT, CCt, and ECt has led to safer, optimized operations and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 39
30184 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement in the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of the publications of top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results are complemented with a search in the references of the identified articles, and a search for the 'open data visualization' and 'visualization evaluation' keywords in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments, or provide a review of such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are examined to positively affect the visualization outcomes. Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, have been less frequent in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage and work with government data and ultimately take part in government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 90
30183 Human Capital Development: Pivotal for Sustainable Development in Developing Countries

Authors: Yusuf Ismaila

Abstract:

Developing countries are characterized by inefficient production systems and unequal distribution of wealth. They are heavily populated, yet underdeveloped. This can be attributed partly to unplanned efforts towards the development of human capital through education and training. In developed nations, great attention is accorded to indices such as life expectancy, literacy, infant mortality, education, and the efficient delivery of social services. This is why many developing countries have been scored low by the United Nations on its human development indicators. Population growth has continued to expand far beyond the rate of economic growth, a situation that has given rise to increasing poverty. This paper examines the effect of selected human development indicators on economic development; human capital development is thus one of the fundamental solutions for entering the international arena. Both quantitative and qualitative analyses were used to demonstrate the effect of selected human capital indices, and related literature was reviewed for exposition of the human capital concept. It was found that there are no conscious efforts in human capital planning, which has resulted in a continuing decline of the production system and in poverty. Recommendations made to redress the situation include that human capital development should be planned and adequately funded in line with the needs of the economy and by applying international standards. Specifically, developing countries must invest the necessary resources in developing human capital, which tends to have a great impact on sustainable development. Information about the labour market should improve, while government policy should favour labour mobility. Human capital development strategy must focus on improving the skills of the workforce, reducing the cost of doing business, and making available the resources business needs to compete and thrive in a fast-globalizing economy. There should be regular interaction between planners, employers, and builders of human capital to facilitate the process of meaningful national development.

Keywords: economic development, human capital, economic growth, developing countries

Procedia PDF Downloads 412