Search results for: elliptic curve digital signature algorithm
2551 Road Vehicle Recognition Using Magnetic Sensing Feature Extraction and Classification
Authors: Xiao Chen, Xiaoying Kong, Min Xu
Abstract:
This paper presents a road vehicle detection approach for intelligent transportation systems. The approach mainly uses a low-cost magnetic sensor and an associated data collection system to collect magnetic signals. The system measures changes in the magnetic field and can detect and count vehicles. We extend Mel Frequency Cepstral Coefficients (MFCC) to analyze vehicle magnetic signals. Vehicle type features are extracted using representations of the cepstrum, frame energy, and gap cepstrum of the magnetic signals. We design a two-dimensional map algorithm using Vector Quantization to classify vehicle magnetic features into four typical vehicle types found in Australian suburbs: sedan, van, truck, and bus. Experimental results show that our approach achieves a high level of accuracy for vehicle detection and classification.
Keywords: vehicle classification, signal processing, road traffic model, magnetic sensing
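As a rough sketch of the cepstral feature extraction described above, the following Python fragment frames a one-dimensional signal, computes per-frame energies, and takes the real cepstrum of each frame. The frame length, hop size, and synthetic test signal are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of cepstrum and frame-energy features for a 1-D magnetic
# signal, loosely following an MFCC-style pipeline. Frame length, hop size,
# and the synthetic signal are illustrative assumptions.
import numpy as np

def frame_signal(x, frame_len=64, hop=32):
    """Split a 1-D signal into overlapping frames."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop: i * hop + frame_len] for i in range(n_frames)])

def cepstral_features(x, frame_len=64, hop=32, n_coeffs=12):
    frames = frame_signal(x, frame_len, hop) * np.hamming(frame_len)
    energy = np.log(np.sum(frames ** 2, axis=1) + 1e-12)     # frame energy
    spectrum = np.abs(np.fft.rfft(frames, axis=1)) + 1e-12
    cepstrum = np.fft.irfft(np.log(spectrum), axis=1)        # real cepstrum
    return cepstrum[:, :n_coeffs], energy

# Synthetic vehicle-like magnetic disturbance: a smooth bump plus noise.
t = np.linspace(0, 1, 1024)
signal = np.exp(-((t - 0.5) / 0.1) ** 2) + 0.05 * np.random.randn(t.size)
coeffs, energy = cepstral_features(signal)
print(coeffs.shape, energy.shape)  # (n_frames, 12) and (n_frames,)
```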
Procedia PDF Downloads 320
2550 Urban Innovations: Towards a Comprehensive and Sustainable City Development
Authors: Sarang Yeola
Abstract:
A smart city can be defined as a city that uses Information and Communication Technologies (ICT) to enhance its sustainability, workability, and livability. It can be viewed as a 'system of systems'. We propose decentralization of power and centralization of the system, and we present a bird's-eye view of the system as a whole. This holistic view includes the entirety of human activity in an area, including city governments, schools, hospitals, infrastructure, resources, businesses, and people. The main objective in developing Nashik as a smart city is to identify the flaws of the existing systems, eliminate them, and come up with innovative and feasible solutions for the betterment of the masses. Make in India is a visionary proposal for FDI in India; the campaign and the industrial estates should be managed to work in synchronization so as to boost the setup of new industrial units in and around Nashik. A smart grid is a modernized electrical grid that uses analog or digital information and communications technology to gather and act on information. We have identified major domains for making Nashik a smart city by surveying the existing infrastructure, the challenges and problems faced, and the solutions proposed through innovative ideas.
Keywords: transport, bus rapid transit system (BRTS), metrorail, autos
Procedia PDF Downloads 378
2549 Building Education Leader Capacity through an Integrated Information and Communication Technology Leadership Model and Tool
Authors: Sousan Arafeh
Abstract:
Educational systems and schools worldwide are increasingly reliant on information and communication technology (ICT). Unfortunately, most educational leadership development programs do not offer formal curricular and/or field experiences that prepare students for managing ICT resources, personnel, and processes. The result is a steep learning curve for the leader and his/her staff and dissipated organizational energy that compromises desired outcomes. To address this gap in education leaders’ development, Arafeh’s Integrated Technology Leadership Model (AITLM) was created. It is a conceptual model and tool that educational leadership students can use to better understand the ICT ecology that exists within their schools. The AITL Model consists of six 'infrastructure types' where ICT activity takes place: technical infrastructure, communications infrastructure, core business infrastructure, context infrastructure, resources infrastructure, and human infrastructure. These six infrastructures are further divided into 16 key areas that need management attention. The AITL Model was created by critically analyzing existing technology/ICT leadership models and working to make something more authentic and comprehensive regarding school leaders’ purview and experience. The AITL Model then served as a tool when it was distributed to over 150 educational leadership students who were asked to review it and qualitatively share their reactions. Students said the model presented crucial areas of consideration that they had not been exposed to before and that the exercise of reviewing and discussing the AITL Model as a group was useful for identifying areas of growth that they could pursue in the leadership development program and in their professional settings. While development in all infrastructures and key areas was important for students’ understanding of ICT, they noted that they were least aware of the importance of the intangible area of the resources infrastructure. The AITL Model will be presented and session participants will have an opportunity to review and reflect on its impact and utility. Ultimately, the AITL Model is one that could have significant policy and practice implications. At the very least, it might help shape ICT content in educational leadership development programs through curricular and pedagogical updates.
Keywords: education leadership, information and communications technology, ICT, leadership capacity building, leadership development
Procedia PDF Downloads 116
2548 Research on ARQ Transmission Technique in Mars Detection Telecommunications System
Authors: Zhongfei Cai, Hui He, Changsheng Li
Abstract:
This paper studies the automatic repeat request (ARQ) transmission technique in a Mars detection telecommunications system. An ARQ method applied to the Proximity-1 space link protocol is proposed. In order to ensure efficient and reliable data transmission, this ARQ method combines the characteristics of different ARQ schemes. Considering the Mars detection communication environment, the paper analyzes the saturation throughput rate, packet dropping probability, average delay, and energy efficiency of different ARQ algorithms. Combining these results with the theory of ARQ transmission, an ARQ transmission scheme for the Mars detection telecommunications system was established. The simulation results showed that this algorithm achieves an excellent saturation throughput rate and energy efficiency with low complexity.
Keywords: ARQ, Mars, CCSDS, Proximity-1, deep space
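To make the ARQ trade-off concrete, here is a toy Monte Carlo estimate of stop-and-wait ARQ throughput under different frame error rates, with a long round-trip time (in frame units) mimicking a deep-space link. All parameters are illustrative assumptions, not mission values or the paper's combined scheme.

```python
# Toy Monte Carlo comparison of stop-and-wait ARQ throughput under several
# frame error rates. The long round-trip delay (relative to frame time)
# mimics a deep-space link; all numbers are illustrative assumptions.
import random

def stop_and_wait_throughput(frame_error_rate, rtt_frames=10.0, n_frames=10_000):
    """Return delivered frames per unit of frame-time, counting retransmissions."""
    elapsed = 0.0
    for _ in range(n_frames):
        while True:
            elapsed += 1.0 + rtt_frames        # send frame, wait for ACK
            if random.random() > frame_error_rate:
                break                          # frame (and ACK) received
    return n_frames / elapsed

random.seed(0)
for fer in (0.01, 0.1, 0.3):
    print(f"FER={fer:4.2f} -> throughput {stop_and_wait_throughput(fer):.4f}")
```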
Procedia PDF Downloads 340
2547 Density-based Denoising of Point Cloud
Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng
Abstract:
Point cloud source data for surface reconstruction are usually contaminated with noise and outliers. To overcome this, we present a novel approach that uses a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of the multivariate KDE using particle swarm optimization, which ensures robust density estimation. We then use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters, and compute the distance of each point from its cluster centroid. Points belonging to the outliers are then removed by an automatic thresholding scheme, which yields an accurate and economical point surface. The experimental results show that our approach is comparably robust and efficient.
Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation
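A minimal sketch of the density-based idea, assuming a fixed Gaussian bandwidth and a percentile threshold in place of the paper's PSO-optimised bandwidth and automatic thresholding: each point is scored by a Gaussian KDE over the cloud, and low-density points are discarded.

```python
# Density-based outlier removal for a point cloud: a Gaussian KDE scores each
# point, and points whose density falls below a percentile threshold are
# discarded. Fixed bandwidth and threshold are simplifying assumptions.
import numpy as np

def kde_density(points, bandwidth=0.1):
    """Gaussian KDE evaluated at each point of the cloud itself."""
    diff = points[:, None, :] - points[None, :, :]          # (n, n, d)
    sq_dist = np.sum(diff ** 2, axis=-1)
    return np.mean(np.exp(-sq_dist / (2 * bandwidth ** 2)), axis=1)

def remove_outliers(points, bandwidth=0.1, pct=5):
    density = kde_density(points, bandwidth)
    return points[density > np.percentile(density, pct)]

rng = np.random.default_rng(1)
surface = rng.normal(0.0, 0.05, size=(500, 3))              # dense "surface"
outliers = rng.uniform(-2.0, 2.0, size=(25, 3))             # sparse noise
cloud = np.vstack([surface, outliers])
print(cloud.shape, "->", remove_outliers(cloud).shape)
```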
Procedia PDF Downloads 344
2546 Organizational Climate being Knowledge Sharing Oriented: A Fuzzy-Set Analysis
Authors: Paulo Lopes Henriques, Carla Curado
Abstract:
According to the literature, knowledge sharing behaviors are influenced by organizational values and structures, namely organizational climate. This manuscript examines the antecedents of a knowledge sharing oriented organizational climate. In line with theoretical expectations, the study adopts the following explanatory conditions: knowledge sharing costs, knowledge sharing incentives, perceptions of knowledge sharing contributing to performance, and tenure. The study compares results across two groups of firms: non-digital (firms without an intranet) vs. digital (firms with an intranet). The paper applies the fsQCA technique, using the fsQCA 2.5 software (www.fsqca.com) to test several conditional arguments that explain the outcome variable. The main results strengthen claims about the relevance of the contribution of knowledge sharing to performance. Second, the evidence brings tenure, an explanatory condition associated with organizational memory, into the spotlight. The study provides an original contribution not previously addressed in the literature, since it identifies the sufficient condition sets for a knowledge sharing oriented organizational climate using fsQCA, which is, to our knowledge, a novel application of the technique.
Keywords: fsQCA, knowledge sharing oriented organizational climate, knowledge sharing costs, knowledge sharing incentives
Procedia PDF Downloads 328
2545 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. To assess its effectiveness, the proposed method was tested on the MNIST, CENPARMI, and CEDAR databases, as well as on newly collected data. The method was applied to more than one lakh (100,000) digit images and achieved good comparative recognition results, with a recognition rate of about 97.32%.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
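Since the abstract does not fully specify the feature set, the following sketch shows one generic distance-measure feature for a thinned binary digit image: a normalised histogram of foreground-pixel distances from the digit centroid. The synthetic 28x28 "digit" and the bin count are assumptions.

```python
# Illustrative distance-measure features for a (thinned) binary digit image:
# distances of foreground pixels from the image centroid, summarised as a
# short histogram. A generic stand-in for the paper's feature set.
import numpy as np

def distance_features(binary_img, n_bins=8):
    ys, xs = np.nonzero(binary_img)                  # foreground pixel coords
    if len(xs) == 0:
        return np.zeros(n_bins)
    cy, cx = ys.mean(), xs.mean()                    # centroid of the digit
    dist = np.hypot(ys - cy, xs - cx)
    hist, _ = np.histogram(dist, bins=n_bins, range=(0, dist.max() + 1e-9))
    return hist / hist.sum()                         # normalised radial profile

# Tiny synthetic "digit": a hollow square on a 28x28 grid (MNIST-sized).
img = np.zeros((28, 28), dtype=np.uint8)
img[6:22, 6] = img[6:22, 21] = img[6, 6:22] = img[21, 6:22] = 1
print(distance_features(img))
```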
Procedia PDF Downloads 461
2544 Forensic Imaging as an Effective Learning Tool for Teaching Forensic Pathology to Undergraduate Medical Students
Authors: Vasudeva Murthy Challakere Ramaswamy
Abstract:
Background: Conventionally, forensic pathology is learnt through autopsy demonstrations, which carry various limitations such as the unavailability of cases in the mortuary, medico-legal implications, and infection risk. Over the years, forensic pathology and science have undergone significant evolution in this digital world. Forensic imaging is a technology that can be effectively utilized to overcome the current limitations in the undergraduate forensic curriculum. Materials and methods: Forensic imaging was demonstrated using a novel autopsy technology that has recently been introduced across the globe. Three sessions were conducted at an international medical university for a total of 196 medical students. The innovative educational tool was evaluated using a quantitative questionnaire with scoring scales between 1 and 10. Results: The mean acceptance score for the new tool was 82%, and about 74% of the students recommended incorporation of forensic imaging into the regular curriculum. 82% of the students were keen on collaborative research and on taking further training courses in forensic imaging. Conclusion: Forensic imaging can be an effective tool and a suitable alternative for teaching undergraduate students. The feedback also supports the fact that students favour the use of contemporary technologies in learning medicine.
Keywords: forensic imaging, forensic pathology, medical students, learning tool
Procedia PDF Downloads 480
2543 Overview of Research Contexts about XR Technologies in Architectural Practice
Authors: Adeline Stals
Abstract:
The transformation of architectural design practice has been underway for almost forty years due to the development and democratization of computer technology. New and more efficient tools are constantly being proposed to architects, amplifying a technological wave that sometimes stimulates them and sometimes overwhelms them, depending essentially on their digital culture and the context (socio-economic, structural, organizational) in which they work on a daily basis. Our focus is on VR, AR, and MR technologies dedicated to architecture. The commercialization of affordable headsets like the Oculus Rift and the HTC Vive, or more low-tech options like the Google Cardboard, makes it easier to benefit from these technologies. In that regard, researchers report the growing interest in these tools among architects, given the new perspectives they open up in terms of workflow, representation, collaboration, and client involvement. However, studies rarely mention the influence of the sample studied on the results. Our research provides an overview of VR, AR, and MR research within a corpus of papers selected from conferences and journals. A closer look at the samples of these research projects highlights the necessity of taking the context of the studies into consideration in order to develop tools truly dedicated to the real practices of specific architect profiles. This literature review formalizes milestones for future challenges to address. The methodology applied is based on a systematic review of two sources of publications. The first is the Cumincad database, which regroups publications from conferences exclusively about the digital in architecture. The second part of the corpus is based on journal publications. Journals were selected considering their ranking on Scimago: among the journals in the predefined 'architecture' category and in Quartile 1 for 2018 (the last update when consulted), we retained the ones related to the architectural design process: Design Studies, CoDesign, Architectural Science Review, Frontiers of Architectural Research, and Archnet-IJAR. Besides those journals, IJAC, not classified in the 'architecture' category, was selected by the author for its adequacy with architecture and computing. For all requests, the search terms were 'virtual reality', 'augmented reality', and 'mixed reality' in the title and/or keywords, for papers published between 2015 and 2019 (inclusive). This time frame was defined considering the fast evolution of these technologies in the past few years. Accordingly, the systematic review covers 202 publications. The literature review on studies about XR technologies establishes the state of the art of the current situation. It highlights that studies are mostly based on experimental contexts with controlled conditions (e.g., pedagogical) or on practices established in large architectural offices of international renown. However, few studies focus on the strategies and practices developed by offices of smaller size, which represent the largest part of the market. Indeed, a European survey studying the architectural profession in Europe in 2018 reveals that 99% of offices are composed of fewer than ten people, and 71% of only one person. The study also showed that the number of medium-sized offices is continuously decreasing in favour of smaller structures. A frontier thus seems to remain between the worlds of research and practice, especially for the majority of small architectural practices that make modest use of technology. This paper constitutes a reference for the next step of the research, and for further worldwide research, by facilitating its contextualization.
Keywords: architectural design, literature review, SME, XR technologies
Procedia PDF Downloads 110
2542 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow
Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam
Abstract:
Studies on highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and its relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway, viz., straight or curved section; time of day; driveway density; presence of median; median opening; gradient; operating speed; and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models showed that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic have a significant relation with the severity of crashes, viz., both fatal and injury crashes. Further, the annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the highway's horizontal components to crash severity is also significant. Logit models can predict crashes better than negative binomial regression models. The results of the study will help transport planners to consider these aspects at the planning stage itself for highways operated under heterogeneous traffic flow conditions.
Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety
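A hedged sketch of the two model families named above, fitted to synthetic data with statsmodels (assuming its Logit and NegativeBinomial interfaces): a logit model for binary severity and a negative binomial regression for over-dispersed crash counts. The variables and the data-generating process are invented for illustration, not the study's data.

```python
# Synthetic-data sketch of the two model types named in the abstract:
# a logit model for binary crash severity (fatal vs. injury) and a negative
# binomial regression for crash counts. All variables are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.integers(0, 2, n),          # curve section (1) vs straight (0)
    rng.uniform(40, 100, n),        # operating speed, km/h
    rng.uniform(1, 20, n),          # driveway density per km
])
X = sm.add_constant(X)              # columns: const, curve, speed, density

# Binary severity: probability of a fatal crash rises with speed and curves.
logit_p = -6.0 + 1.0 * X[:, 1] + 0.05 * X[:, 2] + 0.02 * X[:, 3]
severity = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
logit_fit = sm.Logit(severity, X).fit(disp=False)

# Crash counts per segment: over-dispersed, so negative binomial is apt.
mu = np.exp(-2.0 + 0.5 * X[:, 1] + 0.02 * X[:, 2])
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))   # mean equals mu
nb_fit = sm.NegativeBinomial(counts, X).fit(disp=False)

print(logit_fit.params.round(3))
print(nb_fit.params.round(3))
```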
Procedia PDF Downloads 302
2541 Pivoting to Fortify our Digital Self: Revealing the Need for Personal Cyber Insurance
Authors: Richard McGregor, Carmen Reaiche, Stephen Boyle
Abstract:
Cyber threats are a relatively recent phenomenon and offer cyber insurers a dynamic and intelligent peril. As individuals en masse become increasingly digitally dependent, Personal Cyber Insurance (PCI) offers an attractive option to mitigate cyber risk at a personal level. This abstract proposes a literature review that conceptualises a framework for siting PCI within the context of cyberspace. The lack of empirical research within this domain demonstrates an immediate need to define the scope of PCI, to allow cyber insurers to understand personal cyber risk threats and vectors, customer awareness, capabilities, and their associated needs. Additionally, this will allow cyber insurers to conceptualise appropriate frameworks enabling effective management and distribution of PCI products and services within a landscape often incongruent with the risk attributes commonly associated with traditional personal line insurance products. Cyberspace has significantly improved the quality of social connectivity and productivity during past decades and has allowed an enormous capability uplift in information sharing and communication between people and communities. Conversely, personal digital dependency furnishes ample opportunities for adverse cyber events such as data breaches and cyber-attacks, thus introducing a continuous and insidious threat of omnipresent cyber risk, particularly since the advent of the COVID-19 pandemic and the widespread adoption of work-from-home practices. Recognition of escalating interdependencies, vulnerabilities, and inadequate personal cyber behaviours has prompted efforts by businesses and individuals alike to investigate strategies and tactics to mitigate cyber risk, of which cyber insurance is a viable, cost-effective option. It is argued that, ceteris paribus, the nature of cyberspace intrinsically provides characteristic peculiarities that pose significant and bespoke challenges to cyber insurers, often incongruent with the risk attributes commonly associated with traditional personal line insurance products. These challenges include (inter alia) a paucity of historical claim/loss data for underwriting and pricing purposes, interdependencies of cyber architecture promoting high correlation of cyber risk, difficulties in evaluating cyber risk, the intangibility of at-risk assets (such as data and reputation), a lack of standardisation across the industry, high and undetermined tail risks, and moral hazard, among others. This study proposes a thematic overview of the literature deemed necessary to conceptualise the challenges of issuing personal cyber coverage. There is an evident absence of empirical research pertaining to PCI and the design of operational business models for this business domain, especially qualitative initiatives that (1) attempt to define the scope of the peril, (2) secure an understanding of the needs of both cyber insurer and customer, and (3) identify elements pivotal to the effective management and profitable distribution of PCI. This leads to an argument proposed by the author postulating that the traditional general insurance customer journey and business model are ill-suited to the lineaments of cyberspace. The findings of the review confirm significant gaps in contemporary research within the domain of personal cyber insurance.
Keywords: cyberspace, personal cyber risk, personal cyber insurance, customer journey, business model
Procedia PDF Downloads 103
2540 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms
Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin
Abstract:
This scientific communication reports on and discusses learning models adaptable to modern business problems, as well as models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning model approach, the learning process begins by receiving a batch of training examples; this set of examples is used to acquire a hypothesis, and once learning is complete, the hypothesis is used to predict new operational examples. For complex business models, many candidate models should be introduced and evaluated to estimate the induced results, so that the totality of the results can be used to develop a predictive rule that anticipates the choice of new models. In contrast, in online learning processes there is no separation between the learning (training) phase and the prediction phase. Every time a business model is approached, a test example is considered, from the beginning until the prediction of the appearance of a model considered correct from the point of view of the business decision. After choosing a part of the business model, the label with the logical value "true" becomes known. Some of the business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
Keywords: machine learning, business models, convex analysis, online learning
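The batch-versus-online distinction can be made concrete with the classic perceptron, the textbook example of a mistake-driven online learner: it predicts each example before its label is revealed, then updates. The synthetic, linearly separable data are an assumption for illustration.

```python
# Minimal contrast between batch (PAC-style) and online learning, using the
# classic perceptron update: the online learner predicts first, then learns
# from each example, so there is no separate training phase.
import numpy as np

rng = np.random.default_rng(7)
w_true = np.array([1.5, -2.0, 0.5])
X = rng.normal(size=(1000, 3))
y = np.sign(X @ w_true)                      # labels revealed after prediction

w = np.zeros(3)
mistakes = 0
for x_t, y_t in zip(X, y):                   # one example at a time
    y_hat = np.sign(w @ x_t) or 1.0          # predict before seeing the label
    if y_hat != y_t:                         # mistake-driven update
        mistakes += 1
        w += y_t * x_t
print(f"online mistakes: {mistakes}, learned w ~ {np.round(w, 2)}")
```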
Procedia PDF Downloads 141
2539 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the substantial and fast growth in the development of and demand for multimedia products has contributed to a shortage of device bandwidth and network storage memory. Consequently, the theory of data compression becomes ever more significant for reducing data redundancy in order to save on the transfer and storage of data. In this context, this paper addresses the problem of the lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance compared with existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, giving better image compression in a short execution time.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
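In the same spirit as the paper's method, though not its exact algorithm, the sketch below approximates the top-k singular triplets by block power (orthogonal) iteration, without calling a library SVD, and uses them for a rank-k reconstruction.

```python
# Rank-k image compression via a block power (subspace) iteration that
# approximates the top-k singular triplets without calling a library SVD.
# This follows the general idea of a block SVD power method; the paper's
# exact algorithmic details may differ.
import numpy as np

def block_power_svd(A, k, n_iter=50, seed=0):
    """Approximate top-k SVD of A by orthogonal iteration on A^T A."""
    rng = np.random.default_rng(seed)
    V = np.linalg.qr(rng.normal(size=(A.shape[1], k)))[0]
    for _ in range(n_iter):
        V = np.linalg.qr(A.T @ (A @ V))[0]       # iterate and re-orthonormalise
    U = A @ V
    s = np.linalg.norm(U, axis=0)                # singular values
    return U / s, s, V

rng = np.random.default_rng(1)
image = rng.random((128, 96))                    # stand-in for a grayscale image
U, s, V = block_power_svd(image, k=20)
compressed = (U * s) @ V.T                       # rank-20 reconstruction
err = np.linalg.norm(image - compressed) / np.linalg.norm(image)
print(f"relative reconstruction error: {err:.3f}")
```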
Procedia PDF Downloads 387
2538 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand
Authors: Leila Jafari, Viliam Makis
Abstract:
In this paper, the joint optimization of the economic manufacturing quantity (EMQ), the safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with a policy assuming zero safety stock.
Keywords: condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand
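For readers unfamiliar with the algorithm being modified, here is generic policy iteration on a tiny discrete MDP; the states, transition probabilities, and rewards are toy assumptions, not the paper's semi-Markov production/maintenance model.

```python
# Generic policy iteration on a tiny discrete MDP, sketching the algorithmic
# skeleton that the paper modifies for its semi-Markov decision problem.
import numpy as np

n_states, n_actions, gamma = 4, 2, 0.95
rng = np.random.default_rng(3)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a, s']
R = rng.normal(size=(n_states, n_actions))                        # R[s, a]

policy = np.zeros(n_states, dtype=int)
while True:
    # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly.
    P_pi = P[np.arange(n_states), policy]
    R_pi = R[np.arange(n_states), policy]
    V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)
    # Policy improvement: act greedily with respect to V.
    Q = R + gamma * P @ V                                         # Q[s, a]
    new_policy = Q.argmax(axis=1)
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy
print("optimal policy:", policy, "values:", V.round(2))
```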
Procedia PDF Downloads 464
2537 Integration of Building Information Modeling Framework for 4D Constructability Review and Clash Detection Management of a Sewage Treatment Plant
Authors: Malla Vijayeta, Y. Vijaya Kumar, N. Ramakrishna Raju, K. Satyanarayana
Abstract:
The global AEC (architecture, engineering, and construction) industry has been described as one of the domains most resistant to embracing technology. Although this digital era has been inundated with software tools like CAD, STAAD, CANDY, Microsoft Project, Primavera, etc., the key stakeholders have been working in silos and processes remain fragmented. Unlike the simpler project delivery methods of yesteryear, current projects are fast-track, complex, risky, multidisciplinary, subject to stakeholder influence and statutory regulation, etc., which poses extensive bottlenecks preventing the timely completion of projects. At this juncture, a paradigm shift has surfaced in the construction industry, and Building Information Modeling, aka BIM, has been a panacea to bolster multidisciplinary teams' cooperative and collaborative work, leading to productive, sustainable, and leaner project outcomes. Building information modeling is an integrative, stakeholder-engaging, and centralized approach that provides a common platform of communication. There is a common misconception that BIM can only be used for building/high-rise projects in the Indian construction industry; this paper, by contrast, discusses the implementation of BIM processes/methodologies in the water and wastewater industry. It elucidates BIM 4D planning and constructability reviews of a sewage treatment plant in India. Conventional construction planning and logistics management involve a blend of experience coupled with imagination. Even though the excerpts, judgments, or lessons learnt from veterans might be predictive and helpful, the uncertainty factor persists. This paper delves into the case study of a real-time implementation of BIM 4D planning protocols for one of the sewage treatment plants of the Dravyavati River Rejuvenation Project in India and develops a Time Liner to identify logistics planning and clash detection. With these BIM processes, we find a significant reduction in the duplication of tasks and rework. Another benefit achieved is better visualization and workarounds during the conception stage, enabling the early involvement of stakeholders in the project life cycle of sewage treatment plant construction. Moreover, we have also taken an opinion poll on the benefits accrued from utilizing BIM processes versus traditional paper-based communication like 2D and 3D CAD tools. This paper thus concludes with a BIM framework for sewage treatment plant construction that achieves optimal construction coordination advantages like 4D construction sequencing, interference checking, clash detection, and resolution through the primary engagement of all key stakeholders, thereby identifying potential risks and subsequently creating risk response strategies. However, certain hiccups, like hesitancy in the adoption of BIM technology by naïve users and the limited availability of proficient BIM trainers in India, pose a phenomenal impediment. Hence, nurturing BIM processes from conception, through construction, commissioning, operation, and maintenance, to the deconstruction of a project's life cycle is highly essential for the Indian construction industry in this digital era.
Keywords: integrated BIM workflow, 4D planning with BIM, building information modeling, clash detection and visualization, constructability reviews, project life cycle
Procedia PDF Downloads 122
2536 Risk Analysis of Flood Physical Vulnerability in Residential Areas of Mathare Nairobi, Kenya
Authors: James Kinyua Gitonga, Toshio Fujimi
Abstract:
Vulnerability assessment and analysis are essential for determining the degree of damage and loss resulting from natural disasters. Urban flooding causes major economic losses and casualties in the Mathare residential area of Nairobi, Kenya. A high population caused by rural-urban migration, unemployment, and unplanned urban development are among the factors that increase flood vulnerability in the Mathare area. This study aims to analyse physical flood risk vulnerabilities in Mathare based on scientific data: rainfall data, Mathare River discharge rate data, water runoff data, field survey data, and a questionnaire survey conducted by sampling the study area have been used to develop risk curves. Three structural types of building were identified in the study area, and vulnerability and risk curves were produced for these three structural types by plotting the relationship between flood depth and damage for each type. The results indicate that the structural type with mud walls and a mud floor is the most vulnerable to flooding, while the structural type with stone walls and a concrete floor is the least vulnerable. The vulnerability of building contents is mainly determined by the number of floors: households with two floors are the least vulnerable, and households with one floor are the most vulnerable. Therefore, more than 80% of the residential buildings, including the property inside them, are highly vulnerable to floods and consequently exposed to high risk. When estimating potential casualties/injuries, we found that the structural type of the house was a major determinant: the mud/adobe structural type had casualties of 83.7%, while the masonry structural type had casualties of 10.71% of the people living in these houses. This research concludes that flood awareness, warnings, and observance of the building codes will help reduce damage to the structural types of building, deaths, and damage to building contents.
Keywords: flood loss, Mathare Nairobi, risk curve analysis, vulnerability
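The notion of a vulnerability (depth-damage) curve can be illustrated as follows; the curve values and the middle structural type are assumptions for demonstration, not the study's fitted data.

```python
# Illustrative depth-damage (vulnerability) curves for three structural
# types, with interpolation to estimate the damage ratio at a given flood
# depth. All curve values are invented for demonstration.
import numpy as np

depths = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # flood depth (m)
damage = {                                             # damage ratio (0..1)
    "mud wall, mud floor":        np.array([0.0, 0.35, 0.65, 0.85, 1.00]),
    "mixed construction":         np.array([0.0, 0.20, 0.45, 0.70, 0.90]),
    "stone wall, concrete floor": np.array([0.0, 0.10, 0.25, 0.45, 0.65]),
}

flood_depth = 1.2
for building, curve in damage.items():
    ratio = np.interp(flood_depth, depths, curve)      # read off the curve
    print(f"{building:28s} -> damage ratio {ratio:.2f} at {flood_depth} m")
```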
Procedia PDF Downloads 239
2535 Numerical Simulation of Rayleigh Benard Convection and Radiation Heat Transfer in Two-Dimensional Enclosure
Authors: Raoudha Chaabane, Faouzi Askri, Sassi Ben Nasrallah
Abstract:
A new numerical algorithm is developed to solve coupled convection-radiation heat transfer in a two-dimensional enclosure. Radiative heat transfer in a participating medium has been computed using the control volume finite element method (CVFEM). The radiative transfer equation (RTE) is formulated for an absorbing, emitting, and scattering medium. The density, velocity, and temperature fields are calculated using the double-population lattice Boltzmann equation (LBE). In order to test the efficiency of the developed method, Rayleigh-Benard convection with and without radiative heat transfer is analyzed. The obtained results are validated against available works in the literature, and the proposed method is found to be efficient, accurate, and numerically stable.
Keywords: participating media, LBM, CVFEM, radiation coupled with convection
Procedia PDF Downloads 407
2534 Surface Water Flow of Urban Areas and Sustainable Urban Planning
Authors: Sheetal Sharma
Abstract:
Urban planning is associated with land transformation from natural areas to modified and developed ones, which leads to modification of the natural environment. Basic knowledge of the relationship between the two should be ascertained before proceeding with the development of natural areas. Changes to the land surface due to built-up pavements, roads, and similar land cover affect surface water flow. There is a gap between urban planning and the basic knowledge of hydrological processes that planners should have. The paper aims to identify these variations in surface flow due to urbanization over a temporal scale of 40 years using the Storm Water Management Model (SWMM), and then correlates these findings with the urban planning guidelines in the study area, along with the geological background, to find suitable combinations of land cover, soil, and guidelines. For the purpose of identifying the changes in surface flows, 19 catchments were identified with different geology and 40 years of growth, facing different groundwater level fluctuations. The increasing built-up area and varying surface runoff are studied using ArcGIS, SWMM modeling, and regression analysis of runoff. The resulting runoff for various land covers and soil groups under varying built-up conditions was observed. The modeling procedure also included observations for varying precipitation with constant built-up area in all catchments. All these observations were combined for each catchment, and a single regression curve was obtained for runoff. It was observed that alluvium with suitable land cover was better for infiltration and generated the least runoff, but excess built-up area could not be sustained on alluvial soil. Similarly, basalt had the least recharge and the most runoff, demanding maximum vegetation over it. Sandstone resulted in good recharge if planned with more open spaces and natural soils with intermittent vegetation. Hence, these observations form a keystone base for planners when planning various land uses on different soils. This paper contributes and provides a solution to the basic knowledge gap that urban planners face during the development of natural surfaces.
Keywords: runoff, built up, roughness, recharge, temporal changes
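As a minimal illustration of the runoff-regression workflow, the sketch below fits a linear relation between built-up fraction and runoff coefficient, then applies the rational method Q = C*i*A. The data points are synthetic assumptions, not the study's measurements.

```python
# Illustrative regression of runoff coefficient against built-up fraction,
# mirroring the workflow of fitting runoff curves per catchment, followed by
# a rational-method peak discharge estimate. Data are synthetic assumptions.
import numpy as np

built_up = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])   # fraction
runoff_c = np.array([0.18, 0.28, 0.41, 0.52, 0.66, 0.78])   # observed C

slope, intercept = np.polyfit(built_up, runoff_c, 1)
print(f"C ~= {intercept:.2f} + {slope:.2f} * built_up")

# Peak discharge by the rational method Q = C * i * A for a 2 km^2 catchment.
C = intercept + slope * 0.6          # predicted coefficient at 60% built-up
i_mm_hr, area_km2 = 25.0, 2.0
Q = C * (i_mm_hr / 1000 / 3600) * (area_km2 * 1e6)   # m^3/s
print(f"Q ~= {Q:.1f} m^3/s")
```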
Procedia PDF Downloads 278
2533 Efficient Neural and Fuzzy Models for the Identification of Dynamical Systems
Authors: Aouiche Abdelaziz, Soudani Mouhamed Salah, Aouiche El Moundhe
Abstract:
The present paper addresses the utilization of Artificial Neural Networks (ANNs) and Fuzzy Inference Systems (FISs) for the identification and control of dynamical systems with some degree of uncertainty. Because ANNs and FISs have an inherent ability to approximate functions and to adapt to changes in inputs and parameters, they can be used to control systems too complex for linear controllers. In this work, we show how ANNs and FISs can be organized to form networks that can learn from external data. Next, we present input structures that can be used with ANNs and FISs to model non-linear systems. Four systems were used to test the identification and control of the proposed structures. The results show that the ANNs and FISs (back-propagation algorithm) used were efficient in modeling and controlling the non-linear plants.
Keywords: non-linear systems, fuzzy set models, neural network, control law
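A minimal sketch of neural-network system identification in the spirit described above: a one-hidden-layer network trained by back-propagation to reproduce an unknown non-linear plant from input-output pairs. The plant, network size, and learning rate are toy assumptions.

```python
# One-hidden-layer network trained by back-propagation to identify a
# non-linear plant from input-output data, feeding past output and input to
# the model (NARX-style regressors). Plant and sizes are toy assumptions.
import numpy as np

rng = np.random.default_rng(5)

def plant(y, u):                       # unknown non-linear system to identify
    return 0.6 * np.sin(y) + 0.4 * u

# Identification data: regressors are [y_t, u_t], target is y_{t+1}.
u = rng.uniform(-1, 1, 2000)
y = np.zeros(2001)
for t in range(2000):
    y[t + 1] = plant(y[t], u[t])
X = np.column_stack([y[:-1], u])       # (2000, 2)
T = y[1:]                              # (2000,)

# Full-batch gradient descent on mean squared error, tanh hidden units.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, 16);      b2 = 0.0
lr = 0.05
for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    pred = H @ W2 + b2
    err = pred - T
    gW2 = H.T @ err / len(T); gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)
    gW1 = X.T @ dH / len(T); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
print("final MSE:", float(np.mean(err ** 2)))
```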
Procedia PDF Downloads 212
2532 Traffic Signal Control Using Citizens’ Knowledge through the Wisdom of the Crowd
Authors: Aleksandar Jovanovic, Katarina Kukic, Ana Uzelac, Dusan Teodorovic
Abstract:
Wisdom of the Crowd (WoC) is a decentralized method that uses the collective intelligence of humans. Individual guesses may be far from the target, but when aggregated as a group, they converge on optimal solutions for a given problem. We utilize WoC to address the challenge of controlling traffic lights at intersections on the streets of Kragujevac, Serbia. The problem at hand falls within the category of NP-hard problems. We employ an algorithm that leverages the swarm intelligence of bees: Bee Colony Optimization (BCO). Data regarding traffic signal timing at a single intersection were gathered from citizens through a survey. The results obtained in that manner are compared to the BCO results for different traffic scenarios. We use the Vissim traffic simulation software as a tool to compare the performance of the bees' and the humans' collective intelligence.
Keywords: wisdom of the crowd, traffic signal control, combinatorial optimization, bee colony optimization
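A heavily simplified, bee-inspired search for the green split at a single two-phase intersection, with a toy delay function standing in for a Vissim run; this illustrates the flavour of swarm search, not the authors' BCO implementation, and all demand and timing figures are assumptions.

```python
# Simplified bee-colony-style search for green splits at a single two-phase
# intersection. The delay model, demand, and timings are toy assumptions.
import random

DEMAND = (0.4, 0.2)          # arrival rates (veh/s) on the two approaches
CYCLE, LOST = 90.0, 10.0     # cycle length and lost time (s)

def delay(green1):
    """Toy total delay: inversely related to spare capacity per approach."""
    green2 = CYCLE - LOST - green1
    total = 0.0
    for g, q in zip((green1, green2), DEMAND):
        capacity = 0.5 * g / CYCLE            # saturation flow 0.5 veh/s
        total += q / max(capacity - q * 0.5, 1e-3)
    return total

random.seed(4)
bees = [random.uniform(15, 65) for _ in range(20)]     # candidate green times
for _ in range(100):                                   # forward/backward passes
    best = min(bees, key=delay)
    # Recruitment: each bee moves toward the best solution with some noise.
    bees = [b + 0.5 * (best - b) + random.gauss(0, 1.0) for b in bees]
    bees = [min(max(b, 15.0), 65.0) for b in bees]     # keep greens feasible
best = min(bees, key=delay)
print(f"best green split: {best:.1f}s / {CYCLE - LOST - best:.1f}s")
```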
Procedia PDF Downloads 108
2531 Phenomena-Based Approach for Automated Generation of Process Options and Process Models
Authors: Parminder Kaur Heer, Alexei Lapkin
Abstract:
Due to the global challenges of increased competition and demand for more sustainable products/processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at the unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The different levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous process alternatives based on phenomena, and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. E.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena that can overcome the difficulties/drawbacks of the current process or enhance its effectiveness are added to the list. For instance, the catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena require the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction alone, or reaction together with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to produce the process model. The most promising process options are then chosen subject to a performance criterion, for example the purity of the product, or via multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical/biochemical process because of its generic nature.
Keywords: phenomena, process intensification, process models, process options
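The combine-then-screen step can be sketched with itertools: enumerate phenomena combinations, discard infeasible ones with simple rules, and keep those that realise at least one function. The phenomena names and screening rules are illustrative assumptions, not the paper's knowledge base.

```python
# Minimal sketch of the combine-then-screen step: enumerate phenomena
# combinations, drop infeasible ones, and map survivors to the functions
# they can perform. Names and rules are illustrative assumptions.
from itertools import combinations

PHENOMENA = ["mixing", "reaction", "vapour-liquid", "liquid-liquid",
             "phase_change", "energy_transfer"]

def feasible(combo):
    # Rule: a phase change requires co-present energy transfer.
    if "phase_change" in combo and "energy_transfer" not in combo:
        return False
    # Rule: two competing equilibria in one compartment is meaningless here.
    if "vapour-liquid" in combo and "liquid-liquid" in combo:
        return False
    return True

def functions_of(combo):
    funcs = set()
    if "reaction" in combo:
        funcs.add("reaction")
    if {"vapour-liquid", "liquid-liquid"} & set(combo):
        funcs.add("separation")
    return funcs

options = []
for r in range(1, len(PHENOMENA) + 1):
    for combo in combinations(PHENOMENA, r):
        if feasible(combo) and functions_of(combo):
            options.append((combo, sorted(functions_of(combo))))
print(f"{len(options)} feasible options; e.g. {options[0]}")
```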
Procedia PDF Downloads 232
2530 Tapered Double Cantilever Beam: Evaluation of the Test Set-up for Self-Healing Polymers
Authors: Eleni Tsangouri, Xander Hillewaere, David Garoz Gómez, Dimitrios Aggelis, Filip Du Prez, Danny Van Hemelrijck
Abstract:
The Tapered Double Cantilever Beam (TDCB) is the most commonly used test set-up to evaluate the self-healing feature of thermoset polymers autonomously activated in the presence of a crack. The TDCB is a modification of the established Double Cantilever Beam fracture mechanics set-up and is designed to provide a constant strain energy release rate with crack length under stable load evolution (mode I). In this study, the damage of virgin and autonomously healed TDCB polymer samples is evaluated considering the load-crack opening diagram, the strain maps provided by the Digital Image Correlation technique, and the fractography maps given by optical microscopy. It is shown that the pre-crack introduced prior to testing (razor blade tapping), the loading rate, and the length of the side groove are the features that dominate crack propagation and lead to an inconstant fracture energy release rate.
Keywords: polymers, autonomous healing, fracture, tapered double cantilever beam
Procedia PDF Downloads 351
2529 Reinforcement Learning the Born Rule from Photon Detection
Authors: Rodrigo S. Piera, Jailson Sales Ara´ujo, Gabriela B. Lemos, Matthew B. Weiss, John B. DeBrota, Gabriel H. Aguilar, Jacques L. Pienaar
Abstract:
The Born rule was historically viewed as an independent axiom of quantum mechanics until Gleason derived it in 1957 by assuming the Hilbert space structure of quantum measurements [1]. In subsequent decades, there have been diverse proposals to derive the Born rule starting from even more basic assumptions [2]. In this work, we demonstrate that a simple reinforcement-learning algorithm, having no pre-programmed assumptions about quantum theory, will nevertheless converge to a behaviour pattern that accords with the Born rule when tasked with predicting the output of a quantum optical implementation of a symmetric informationally complete (SIC) measurement. Our findings support a hypothesis due to QBism (the subjective Bayesian approach to quantum theory), which states that the Born rule can be thought of as a normative rule for making decisions in a quantum world [3].
Keywords: quantum Bayesianism, quantum theory, quantum information, quantum measurement
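As a toy analogue of the experiment, the following sketch simulates a qubit SIC measurement and lets a learner with no quantum assumptions estimate outcome probabilities by incremental updates; the estimates converge to the Born-rule values. The state, update rule, and sample size are illustrative assumptions, far simpler than the paper's RL agent.

```python
# A learner with no quantum assumptions estimates outcome frequencies of a
# simulated qubit SIC measurement; the estimates converge to the Born-rule
# probabilities p_i = (1 + n_i . r) / 4. State and learning rule are toys.
import numpy as np

rng = np.random.default_rng(0)

# Qubit SIC outcomes built from tetrahedral Bloch vectors n_i (sum to zero).
vecs = np.array([[0.0, 0.0, 1.0],
                 [2 * np.sqrt(2) / 3, 0.0, -1 / 3],
                 [-np.sqrt(2) / 3,  np.sqrt(2 / 3), -1 / 3],
                 [-np.sqrt(2) / 3, -np.sqrt(2 / 3), -1 / 3]])
state = np.array([0.3, -0.2, 0.8])               # Bloch vector, |r| <= 1
born = (1 + vecs @ state) / 4                    # Born probabilities

# Simulate detections; learn by a decaying-step incremental (running) average.
est = np.full(4, 0.25)
for t in range(1, 100_001):
    outcome = rng.choice(4, p=born)
    target = np.eye(4)[outcome]
    est += (target - est) / t
print(np.round(born, 4), np.round(est, 4))
```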
Procedia PDF Downloads 109
2528 Application of Seismic Refraction Method in Geotechnical Study
Authors: Abdalla Mohamed M. Musbahi
Abstract:
The study area lies in the Al-Falah area, on the airport road of Tripoli, in Zone (16), where a multi-floor residential and commercial complex is planned; this zone was divided into seven subzones. In each subzone, orthogonal profiles were collected using the seismic refraction method. The overall aim of this project is to investigate the applicability of the seismic refraction method, a commonly used traditional geophysical technique for determining the depth to bedrock, the competence of bedrock, the depth to the water table, or the depth to other seismic velocity boundaries. The purpose of the work is to make engineers and decision makers recognize the importance of planning and executing a pre-investigation program that includes geophysics, and in particular the seismic refraction method. The overall aim is achieved by evaluating the seismic refraction method at different scales, determining the depth and velocity of the base layer (bedrock), and calculating the elastic properties of each layer in the region. Orthogonal profiles were carried out in every subzone of Zone (16). In the layout of the seismic refraction set-up, the geophones are placed along a straight imaginary line with 5 m spacing, and three shot points (at the beginning, middle, and end of the layout) are used in order to generate the P and S waves. The first and last shot points are placed about 5 meters from the geophones, and the middle shot point is placed between the 12th and 13th geophones. From the time-distance curves, the P- and S-wave velocities were calculated, and the layer thicknesses were estimated for up to three layers. Any change in the physical properties of a medium (density ρ, bulk modulus κ, shear modulus μ) leads to a change in the velocity of the waves passing through it; the velocity of waves travelling in rocks is therefore closely related to these parameters, and we can estimate them by knowing the primary and secondary velocities (P-wave, S-wave).
Keywords: application of seismic, geotechnical study, physical properties, seismic refraction
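The elastic parameters follow from the standard isotropic relations between velocities and moduli; the sketch below computes them, with sample velocities and density that are illustrative, not survey values.

```python
# Standard elastic-moduli relations for an isotropic medium, computed from
# P- and S-wave velocities obtained from time-distance curves. The sample
# velocities and density are illustrative, not values from the survey.
def elastic_moduli(vp, vs, rho):
    """vp, vs in m/s; rho in kg/m^3. Returns moduli in Pa plus Poisson's ratio."""
    mu = rho * vs ** 2                                       # shear modulus
    k = rho * (vp ** 2 - (4.0 / 3.0) * vs ** 2)              # bulk modulus
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2)) # Poisson's ratio
    e = 2 * mu * (1 + nu)                                    # Young's modulus
    return mu, k, nu, e

mu, k, nu, e = elastic_moduli(vp=1800.0, vs=900.0, rho=2000.0)
print(f"mu={mu:.3e} Pa  K={k:.3e} Pa  nu={nu:.3f}  E={e:.3e} Pa")
```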
Procedia PDF Downloads 491
2527 Nonlinear Free Surface Flow Simulations Using Smoothed Particle Hydrodynamics
Authors: Abdelraheem M. Aly, Minh Tuan Nguyen, Sang-Wook Lee
Abstract:
The incompressible smoothed particle hydrodynamics (ISPH) method is used to simulate impact free surface flows. In ISPH, the pressure is evaluated by solving the pressure Poisson equation using a semi-implicit algorithm based on the projection method. The current ISPH method is applied to simulate dam-break flow over an inclined plane with different inclination angles. The effect of the inclination angle on the velocity of the wave front and the pressure distribution is discussed. The impact of a circular cylinder on water in a tank has also been simulated using the ISPH method. The computed pressures on the solid boundaries are studied and compared with experimental results.
Keywords: incompressible smoothed particle hydrodynamics, free surface flow, inclined plane, water entry impact
Procedia PDF Downloads 403
2526 Fuzzy Logic Based Sliding Mode Controller for a New Soft Switching Boost Converter
Authors: Azam Salimi, Majid Delshad
Abstract:
This paper presents a modified design of a sliding mode controller based on fuzzy logic for a new ZVT high step-up DC-DC converter. Here, a proportional-integral (PI)-type current mode control is employed, and a sliding mode controller is designed utilizing a fuzzy algorithm. The sliding mode controller guarantees robustness against all variations, and the fuzzy logic helps to reduce the chattering phenomenon caused by the sliding controller; in that way, efficiency increases and the error as well as the voltage and current ripples decrease. The proposed system is simulated using MATLAB/SIMULINK. The model is tested under variations of the input and reference voltages, and it was found that, in comparison with conventional sliding mode controllers, it performs better.
Keywords: switching mode power supplies, DC-DC converters, sliding mode control, robustness, fuzzy control, current mode control, non-linear behavior
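A generic sliding mode control loop for a second-order plant, using a boundary-layer saturation in place of the sign function, which plays the same chattering-reduction role the paper assigns to fuzzy logic; the plant, gains, and disturbance are toy assumptions, not the converter model.

```python
# Sliding mode control of a double integrator with a bounded disturbance.
# The saturation (boundary layer) smooths the switching term to limit
# chattering. Plant, gains, and disturbance are toy assumptions.
import numpy as np

def sat(x):
    return np.clip(x, -1.0, 1.0)

lam, k, phi, dt = 5.0, 10.0, 0.05, 1e-3
x, v = 1.0, 0.0                               # initial state; target is origin
for step in range(int(2.0 / dt)):             # 2 seconds of simulated time
    d = 0.5 * np.sin(2 * np.pi * step * dt)   # bounded matched disturbance
    s = v + lam * x                           # sliding surface s = e_dot + lam*e
    u = -lam * v - k * sat(s / phi)           # equivalent control + switching term
    a = u + d                                 # double integrator: x'' = u + d
    v += a * dt
    x += v * dt
print(f"final |x|={abs(x):.4f}, |s|={abs(v + lam * x):.4f}")
```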
Procedia PDF Downloads 539
2525 Learning Language through Story: Development of Storytelling Website Project for Amazighe Language Learning
Authors: Siham Boulaknadel
Abstract:
Every culture has its share of a rich history of storytelling in oral, visual, and textual form. The Amazigh language, like many languages, has its own, which has entertained and informed across centuries and cultures, and its instructional potential continues to serve teachers. According to many researchers, listening to stories draws attention to the sounds of language and helps children develop sensitivity to the way language works. Stories including repetitive phrases, unique words, and enticing descriptions encourage students to join in actively to repeat, chant, sing, or even retell the story. This kind of practice is important to language learners' oral language development, which is believed to correlate strongly with students' academic success. Today, with the advent of multimedia, digital storytelling can be a practical and powerful learning tool. It has the potential to transform traditional learning into a world of unlimited imaginary environments. This paper reports on a research project on the development of a multimedia storytelling website using traditional Amazigh oral narratives, called "Tell me a story". It is a didactic tool created for the learning of good moral values in an interactive multimedia environment combining on-screen text, graphics, and audio in an enticing setting, enabling the positive values of the stories to be projected. The website developed in this study is based on various pedagogical approaches and learning theories deemed suitable for children aged 8 to 9 years. The design and development of the website were based on a well-researched conceptual framework enabling users to: (1) replay and share the stories in schools or at home, and (2) access the website anytime and anywhere. Furthermore, the system stores the students' work and activities, allowing parents or teachers to monitor students' work and provide online feedback. The website contains the following main feature modules. Storytelling incorporates a variety of media such as audio, text, and graphics in presenting the stories. It introduces the children to various kinds of traditional Amazigh oral narratives. The focus of this module is to project the positive values and images of stories using the digital storytelling technique. Besides developing a good moral sense in children through projected positive images and moral values, it also allows children to practice their comprehension and listening skills. The Reading module is developed based on a multimedia material approach, which offers the potential for addressing the challenges of reading instruction. This module is able to stimulate children and develop reading practice indirectly through the tutoring strategies of scaffolding, self-explanation, and hyperlinks offered in the module. Word Enhancement assists the children in understanding the story and appreciating the good moral values more efficiently. The difficult words or vocabulary items are annotated with explanations, which helps the children understand the vocabulary better. In conclusion, we believe that interactive multimedia storytelling is an interesting and exciting tool for learning Amazigh. We plan to address some learning issues, in particular the use of activities to test and evaluate the children on their overall understanding of the stories and words presented in the learning modules.
Keywords: Amazigh language, e-learning, storytelling, language teaching
Procedia PDF Downloads 404
2524 Internal Leakage Analysis from Pd to Pc Port Direction in ECV Body Used in External Variable Type A/C Compressor
Authors: M. Iqbal Mahmud, Haeng Muk Cho, Seo Hyun Sang, Wang Wen Hai, Chang Heon Yi, Man Ik Hwang, Dae Hoon Kang
Abstract:
The solenoid-operated electromagnetic control valve (ECV) plays an important role in a car's air conditioning control system. The ECV is used in external variable displacement swash plate type compressors and controls the entire air conditioning system by means of a pulse width modulation (PWM) input signal supplied from an external source (controller). The complete ECV contains a number of internal features such as the valve body, core, valve guide, plunger, guide pin, plunger spring, bellows, etc. While designing the ECV, the dimensions of the different internal items must meet the standard requirements, which is quite challenging. This research paper considers, in particular, the dimensioning of the ECV body and its three pressure ports through which the air/refrigerant passes. An internal leakage test analysis of the ECV body is carried out from its discharge port (Pd) to the crankcase port (Pc) when the guide valve is placed inside it. The experiments were made in both ordinary and digital systems using different assumptions, and the results were thereafter compared.
Keywords: electromagnetic control valve (ECV), leakage, pressure port, valve body, valve guide
Procedia PDF Downloads 408
2523 Constructing Orthogonal De Bruijn and Kautz Sequences and Applications
Authors: Yaw-Ling Lin
Abstract:
A de Bruijn graph of order k is a graph whose vertices represent all length-k sequences, with edges joining pairs of vertices whose sequences have the maximum possible overlap (length k−1). Every Hamiltonian cycle of this graph defines a distinct, minimum-length de Bruijn sequence containing every k-mer exactly once. A Kautz sequence is the minimal generating sequence, i.e., the sequence of minimal length that produces all possible length-k sequences with the restriction that every two consecutive symbols in the sequence must be different. A collection of de Bruijn/Kautz sequences is orthogonal if any two sequences differ maximally in sequence composition; that is, the maximum length of their common substring is k. In this paper, we discuss how such a collection of (maximal) orthogonal de Bruijn/Kautz sequences can be constructed, and we use the algorithm to build a web application service for synthesized DNA and other related biomolecular sequences.
Keywords: biomolecular sequence synthesis, de Bruijn sequences, Eulerian cycle, Hamiltonian cycle, Kautz sequences, orthogonal sequences
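For background, the standard recursive construction of a de Bruijn sequence (by concatenating Lyndon words) is shown below; this generates a single B(k, n) sequence and is not the authors' algorithm for orthogonal collections.

```python
# Standard recursive construction of a de Bruijn sequence B(k, n): the
# concatenation-of-Lyndon-words method. Generates one sequence containing
# every length-n word over {0..k-1} exactly once (cyclically).
def de_bruijn(k, n):
    a = [0] * (k * n)
    seq = []
    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)
    db(1, 1)
    return seq

s = de_bruijn(k=2, n=4)
print("".join(map(str, s)), "length:", len(s))   # 16 symbols, each 4-mer once
```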
Procedia PDF Downloads 167
2522 A Visual Inspection System for Automotive Sheet Metal Chassis Parts Produced with the Cold-Forming Method
Authors: İmren Öztürk Yılmaz, Abdullah Yasin Bilici, Yasin Atalay Candemir
Abstract:
The system consists of four main elements: a motion system, an image acquisition system, image processing software, and a control interface. Parts coming off the production line enter the image processing system via the conveyor belt at the end of the line. 3D scanning of the produced part is performed with a laser scanning system integrated at the system entry. With the 3D scanning method, the position and angle at which a part enters the system are determined, and according to the data obtained, parameters such as the part origin and conveyor speed are calculated by the designed software, and the robot is informed of the position where it will pick up the part. The robot, having received this information, takes the produced part from the belt conveyor and presents it to high-resolution cameras for quality control. Measurement processes are carried out with a maximum error of 20 microns, as determined by experiments.
Keywords: quality control, industry 4.0, image processing, automated fault detection, digital visual inspection
Procedia PDF Downloads 113