Search results for: time delay neural network
19028 Relay Mining: Verifiable Multi-Tenant Distributed Rate Limiting
Authors: Daniel Olshansky, Ramiro Rodríguez Colmeiro
Abstract:
Relay Mining presents a scalable solution employing probabilistic mechanisms and crypto-economic incentives to estimate RPC volume usage, facilitating decentralized multitenant rate limiting. Network traffic from individual applications can be concurrently serviced by multiple RPC service providers, with costs, rewards, and rate limiting governed by a native cryptocurrency on a distributed ledger. Building upon established research in token bucket algorithms and distributed rate-limiting penalty models, our approach harnesses a feedback loop control mechanism to adjust the difficulty of mining relay rewards, dynamically scaling with network usage growth. By leveraging crypto-economic incentives, we reduce coordination overhead costs and introduce a mechanism for providing RPC services that are both geopolitically and geographically distributed.
Keywords: remote procedure call, crypto-economic, commit-reveal, decentralization, scalability, blockchain, rate limiting, token bucket
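The token bucket algorithm that Relay Mining builds on can be shown in a few lines. Below is a minimal per-tenant sketch in Python, with hypothetical rate and capacity values and no relation to the protocol's crypto-economic layer:

```python
import time

class TokenBucket:
    """Minimal per-tenant token bucket: tokens refill at a fixed rate
    up to a capacity; each relay request consumes one token."""

    def __init__(self, rate_per_sec: float, capacity: float):
        self.rate = rate_per_sec      # refill rate (hypothetical value)
        self.capacity = capacity      # burst ceiling
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# One bucket per application (tenant); values are illustrative only.
buckets = {"app-1": TokenBucket(rate_per_sec=10, capacity=50)}
served = sum(buckets["app-1"].allow() for _ in range(100))
print(f"relays served in burst: {served}")  # roughly the bucket capacity
```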
Procedia PDF Downloads 54
19027 Reactivity Study on South African Calcium Based Material Using a pH-Stat and Citric Acid: A Statistical Approach
Authors: Hilary Rutto, Mbali Chiliza, Tumisang Seodigeng
Abstract:
The study of the reactivity of calcined calcium-based material is very important in the dry flue gas desulphurisation (FGD) process, so as to produce an absorbent with high sulphur dioxide capture capacity during the hydration process. The effects of calcining temperature and time on the reactivity of the calcined limestone material were investigated. In this study, the reactivity was measured using a pH-stat apparatus, and the result was confirmed by performing a citric acid reactivity test. The reactivity was calculated using the shrinking core model. Based on the experiments, a mathematical model was developed to correlate the effect of time and temperature with the reactivity of the absorbent. The calcination process variables were temperature (700–1000°C) and time (1–6 hrs). It was found that reactivity increases with an increase in time and temperature.
Keywords: reactivity, citric acid, calcination, time
Procedia PDF Downloads 220
19026 Large Time Asymptotic Behavior to Solutions of a Forced Burgers Equation
Authors: Satyanarayana Engu, Ahmed Mohd, V. Murugan
Abstract:
We study the large time asymptotics of solutions to the Cauchy problem for a forced Burgers equation (FBE) with initial data that is continuous and summable on R. To this end, we first derive explicit solutions of the FBE for a particular class of initial data in terms of Hermite polynomials. Then, relaxing this assumption, we prove the existence of a solution to the considered Cauchy problem. Finally, we give an asymptotic approximate solution and establish that the error is of order O(t^(-1/2)) with respect to the L^p-norm, where 1≤p≤∞, for large time.
Keywords: Burgers equation, Cole-Hopf transformation, Hermite polynomials, large time asymptotics
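As background for the keywords, the Cole-Hopf transformation is the standard device for this class of problems: substituting u in terms of a potential φ reduces the (unforced) viscous Burgers equation to the heat equation, whose solutions expand naturally in Hermite polynomials. A sketch of the standard unforced form, not the authors' forced variant:

```latex
u_t + u\,u_x = \nu\,u_{xx},
\qquad
u(x,t) = -2\nu\,\frac{\phi_x}{\phi}
\;\;\Longrightarrow\;\;
\phi_t = \nu\,\phi_{xx}.
```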
Procedia PDF Downloads 334
19025 A Fundamental Study for Real-Time Safety Evaluation System of Landing Pier Using FBG Sensor
Authors: Heungsu Lee, Youngseok Kim, Jonghwa Yi, Chul Park
Abstract:
A landing pier is subjected to safety assessment by visual inspection and design data, but it is difficult to check for damage in real time. In this study, real-time damage detection and safety evaluation methods were studied. Structural analysis of an arbitrary landing pier structure showed that the inflection points of deformation and moment occurred at 10%, 50%, and 90% of the pile length. The critical value of the Fiber Bragg Grating (FBG) sensor was set according to the safety factor, and an FBG sensor application method for real-time safety evaluation was derived.
Keywords: FBG sensor, harbor structure, maintenance, safety evaluation system
Procedia PDF Downloads 218
19024 Crime Prevention with Artificial Intelligence
Authors: Mehrnoosh Abouzari, Shahrokh Sahraei
Abstract:
Today, with the increase in the quantity, quality, and variety of crimes, crime prevention faces a serious challenge that human resources alone, using traditional methods, cannot meet. One of the developments of the modern world is the presence of artificial intelligence in various fields, including criminal law; in fact, the use of artificial intelligence in criminal investigations and crime fighting is a necessity in today's world. The application of artificial intelligence goes far beyond other technologies in the struggle against crime, and its application in criminal science extends from prevention to the prediction of crime. Crime prevention addresses three factors, the offence, the offender, and the victim: by changing the conditions of these factors, and on the assumption that the offender acts rationally, it increases the cost and risk of crime so that the offender desists from delinquency, or it makes the victim aware of self-care and of possible exposure to danger, or it makes crimes more difficult to commit. Artificial intelligence in the field of combating crime and social harm acts like an all-seeing eye that, regardless of time and place, looks into the future and predicts the occurrence of a possible crime, thereby preventing crimes from occurring. The purpose of this article is to collect and analyze the studies conducted on the use of artificial intelligence in predicting and preventing crime, and to ask how capable this technology is of doing so. The results show that the artificial intelligence technologies in use are capable of predicting and preventing crime and can find patterns in large data sets much more efficiently than humans. In crime prediction and prevention, the term artificial intelligence refers to the increasing use of technologies that apply algorithms to large sets of data to assist or replace police work: predicting the time and place of future criminal activities, identifying patterns effectively, and accurately predicting future behavior through data mining, machine learning, deep learning, data analysis, and neural networks. Because criminologists' knowledge can provide insight into risk factors for criminal behavior, among other issues, computer scientists can match this knowledge with the datasets that artificial intelligence uses.
Keywords: artificial intelligence, criminology, crime, prevention, prediction
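The data-driven prediction pipeline this abstract describes (applying learning algorithms to large incident datasets) can be illustrated minimally as follows. This is a generic sketch with synthetic data and hypothetical features (grid cell coordinates, hour of day, prior incident count), not any of the tooling surveyed in the article:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical incident records: [grid x, grid y, hour of day, prior count];
# label = whether a crime occurred in that cell in the following period.
X = rng.random((2000, 4))
y = (X[:, 3] + 0.3 * X[:, 2] + 0.1 * rng.standard_normal(2000) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Rank locations by predicted risk so resources can be prioritized.
risk = model.predict_proba(X_te)[:, 1]
print("AUROC:", round(roc_auc_score(y_te, risk), 3))
```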
Procedia PDF Downloads 75
19023 Considering Partially Developed Artifacts in Change Impact Analysis Implementation
Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim
Abstract:
It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes delays completion and incurs additional cost. One type of information that helps in making this decision is change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed, and the class artifact is used as the source of analysis. However, these assumptions are impractical for impact analysis during the software development phase, since some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.
Keywords: software development, impact analysis, traceability, static analysis
Procedia PDF Downloads 608
19022 Urinary Volatile Organic Compound Testing in Fast-Track Patients with Suspected Colorectal Cancer
Authors: Godwin Dennison, C. E. Boulind, O. Gould, B. de Lacy Costello, J. Allison, P. White, P. Ewings, A. Wicaksono, N. J. Curtis, A. Pullyblank, D. Jayne, J. A. Covington, N. Ratcliffe, N. K. Francis
Abstract:
Background: Colorectal symptoms are common but only infrequently represent serious pathology, including colorectal cancer (CRC). A large number of invasive tests are presently performed for reassurance. We investigated the feasibility of urinary volatile organic compound (VOC) testing as a potential triage tool in patients fast-tracked for assessment for possible CRC. Methods: A prospective, multi-centre, observational feasibility study was performed across three sites. Patients referred on NHS fast-track pathways for potential CRC provided a urine sample, which underwent Gas Chromatography Mass Spectrometry (GC-MS), Field Asymmetric Ion Mobility Spectrometry (FAIMS), and Selected Ion Flow Tube Mass Spectrometry (SIFT-MS) analysis. Patients underwent colonoscopy and/or CT colonography and were grouped as either CRC, adenomatous polyp(s), or controls to explore the diagnostic accuracy of VOC output data, supported by an artificial neural network (ANN) model. Results: 558 patients participated, with 23 (4.1%) diagnosed with CRC. 59% of colonoscopies and 86% of CT colonographies showed no abnormalities. Urinary VOC testing was feasible, acceptable to patients, and applicable within the clinical fast-track pathway. GC-MS showed the highest clinical utility for CRC and polyp detection vs. controls (sensitivity=0.878, specificity=0.882, AUROC=0.884). Conclusion: Urinary VOC testing and analysis are feasible within NHS fast-track CRC pathways. Clinically meaningful differences between patients with cancer, polyps, or no pathology were identified, suggesting that VOC analysis may have future utility as a triage tool. Acknowledgment: Funding: NIHR Research for Patient Benefit grant (ref: PB-PG-0416-20022).
Keywords: colorectal cancer, volatile organic compound, gas chromatography mass spectrometry, field asymmetric ion mobility spectrometry, selected ion flow tube mass spectrometry
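The diagnostic accuracy metrics reported above (sensitivity, specificity, AUROC) are computed from classifier scores in the standard way; a short sketch with synthetic stand-in labels and scores, not the study's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Synthetic stand-ins: 1 = CRC/polyp, 0 = control; scores from a classifier.
y_true = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])
y_score = np.array([0.91, 0.85, 0.40, 0.20, 0.10, 0.35, 0.05, 0.77, 0.55, 0.15])
y_pred = (y_score >= 0.5).astype(int)   # hypothetical decision threshold

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))   # true positive rate
print("specificity:", tn / (tn + fp))   # true negative rate
print("AUROC:", roc_auc_score(y_true, y_score))
```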
Procedia PDF Downloads 91
19021 Optimal Design of the Power Generation Network in California: Moving towards 100% Renewable Electricity by 2045
Authors: Wennan Long, Yuhao Nie, Yunan Li, Adam Brandt
Abstract:
To fight climate change, the California government issued Senate Bill No. 100 (SB-100) in September 2018, which aims at achieving a target of 100% renewable electricity by the end of 2045. A capacity expansion problem is solved in this case study using a binary quadratic programming model. The optimal locations and capacities of the potential renewable power plants (i.e., solar, wind, biomass, geothermal, and hydropower), the phase-out schedule of existing fossil-based (natural gas) power plants, and the transmission of electricity across the entire network are determined with the minimal total annualized cost measured by net present value (NPV). The results show that the renewable electricity contribution could increase to 85.9% by 2030 and reach 100% by 2035. Fossil-based power plants will be totally phased out around 2035, and solar and wind will become the dominant renewable energy resources in the California electricity mix.
Keywords: 100% renewable electricity, California, capacity expansion, mixed integer non-linear programming
Procedia PDF Downloads 171
19020 Application of Combined Cluster and Discriminant Analysis to Make the Operation of Monitoring Networks More Economical
Authors: Norbert Magyar, Jozsef Kovacs, Peter Tanos, Balazs Trasy, Tamas Garamhegyi, Istvan Gabor Hatvani
Abstract:
Water is one of the most important common resources, and as a result of urbanization, agriculture, and industry it is becoming more and more exposed to potential pollutants. Preventing the deterioration of water quality is a crucial task for environmental scientists. To achieve this aim, the operation of monitoring networks is necessary. In general, these networks have to meet many important requirements, such as representativeness and cost efficiency. However, existing monitoring networks often include sampling sites which are unnecessary. With the elimination of these sites the monitoring network can be optimized, and it can operate more economically. The aim of this study is to illustrate the applicability of CCDA (Combined Cluster and Discriminant Analysis) to the field of water quality monitoring and to optimize the monitoring networks of a river (the Danube), a wetland-lake system (Kis-Balaton & Lake Balaton), and two surface-subsurface water systems, on the watershed of Lake Neusiedl/Lake Fertő and in the Szigetköz area, over a period of approximately two decades. CCDA combines two multivariate data analysis methods: hierarchical cluster analysis and linear discriminant analysis. Its goal is to determine homogeneous groups of observations, in our case sampling sites, by comparing the goodness of preconceived classifications obtained from hierarchical cluster analysis with random classifications. The main idea behind CCDA is that if the ratio of correctly classified cases for a grouping is higher than at least 95% of the ratios for the random classifications, then at the level of significance (α=0.05) the given sampling sites do not form a homogeneous group. Because sampling on Lake Neusiedl/Lake Fertő was conducted at the same time at all sampling sites, it was possible to visualize the differences between the sampling sites belonging to the same or different groups on scatterplots. Based on the results, the monitoring network of the Danube yields redundant information over certain sections, so that of 12 sampling sites, 3 could be eliminated without loss of information. In the case of the wetland (Kis-Balaton), one pair of sampling sites out of 12, and in the case of Lake Balaton, 5 out of 10 could be discarded. For the groundwater system of the catchment area of Lake Neusiedl/Lake Fertő, all 50 monitoring wells are necessary; there is no redundant information in the system. The number of sampling sites on Lake Neusiedl/Lake Fertő can decrease to approximately half of the original number. Furthermore, neighbouring sampling sites were compared pairwise using CCDA, and the results were plotted on diagrams or isoline maps showing the locations of the greatest differences. These results can help researchers decide where to place new sampling sites. The application of CCDA proved to be a useful tool in the optimization of monitoring networks for different types of water bodies. Based on the results obtained, the monitoring networks can be operated more economically.
Keywords: combined cluster and discriminant analysis, cost efficiency, monitoring network optimization, water quality
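The core CCDA decision rule — a grouping of sites is taken to be inhomogeneous only if its correct-classification ratio beats at least 95% of random groupings — can be sketched as follows. This is a simplified illustration on synthetic data, not the authors' implementation:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def correct_ratio(X, labels):
    """Cross-validated correct-classification ratio of an LDA for a grouping."""
    return cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean()

# Synthetic water-quality observations (rows = samples, columns = parameters).
X = rng.normal(size=(200, 6))
candidate = rng.integers(0, 2, size=200)    # preconceived 2-group split of sites

observed = correct_ratio(X, candidate)
random_ratios = [correct_ratio(X, rng.permutation(candidate)) for _ in range(99)]

# If the candidate grouping beats >= 95% of the random groupings, the groups
# are distinguishable, i.e., the sites do not form one homogeneous group.
print("observed:", round(observed, 3),
      "| 95th pct of random:", round(np.quantile(random_ratios, 0.95), 3))
```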
Procedia PDF Downloads 348
19019 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays
Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal
Abstract:
Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on electronics used in space are much greater than on Earth. Thus, developing fault-tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault-tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance, and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly required for finding a compromise between the overhead introduced by fault-tolerance techniques and system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, in our proposed method a threshold margin is considered depending on the use-case application. Given the proposed threshold margin in our model, a failure occurs only when the difference between the erroneous output value and the expected output value is more than this margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating ACMVF is implemented on a Zynq-7000 FPGA platform. This system makes use of the Single Event Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is considered acceptable, the counted number of failures is reduced by 41% to 59% compared with the number counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
Keywords: fault tolerance, FPGA, single event upset, approximate computing
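The ACMVF defined above reduces to a short computation once injection results are available: count a failure only when the deviation from the golden output exceeds the chosen margin. A sketch with synthetic injection outputs:

```python
def acmvf(golden, faulty_outputs, margin_frac):
    """Approximate-based Configuration Memory Vulnerability Factor:
    fraction of SEU injections whose output error exceeds the margin."""
    tol = abs(golden) * margin_frac
    failures = sum(1 for out in faulty_outputs if abs(out - golden) > tol)
    return failures / len(faulty_outputs)

# Synthetic 32-bit adder outputs observed after SEU injections.
golden = 100_000
observed = [100_000, 100_001, 99_500, 164_268, 35_732, 100_000, 108_000, 100_000]

print("conventional VF (zero margin):", acmvf(golden, observed, 0.0))
print("ACMVF at 10% margin:          ", acmvf(golden, observed, 0.10))
```

With the 10% margin, small bit flips in low-order adder bits no longer count as failures, which is exactly the effect behind the 41%-59% reduction reported above.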
Procedia PDF Downloads 198
19018 Spatial Analysis of Park and Ride Users’ Dynamic Accessibility to Train Station: A Case Study in Perth
Authors: Ting (Grace) Lin, Jianhong (Cecilia) Xia, Todd Robinson
Abstract:
Accessibility analysis, which examines people’s ability to access facilities and destinations, is a fundamental assessment for transport planning, policy making, and social exclusion research. Dynamic accessibility, which measures accessibility in a real-time traffic environment, has become an advanced accessibility indicator in transport research. It is also a useful indicator for helping travelers understand the daily variability of travel times and for assisting traffic engineers in monitoring traffic congestion and, ultimately, developing effective mitigation strategies. This research used real-time traffic information, collecting travel time data at 15-minute intervals via the TomTom® API. A framework for measuring dynamic accessibility was then developed based on gravity theory and accessibility dichotomy theory through space and time interpolation. Finally, dynamic accessibility can be derived at any given time and location under the proposed spatial analysis framework.
Keywords: dynamic accessibility, hot spot, transport research, TomTom® API
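A minimal sketch of a gravity-type accessibility measure evaluated at different times of day, assuming travel times sampled at 15-minute intervals as from the TomTom® API; the decay parameter, opportunity counts, and travel times are hypothetical:

```python
import math

def gravity_accessibility(opportunities, travel_times_min, beta=0.1):
    """Gravity-type accessibility: sum of destination opportunities weighted
    by an exponential decay of the (time-dependent) travel time."""
    return sum(o * math.exp(-beta * t)
               for o, t in zip(opportunities, travel_times_min))

# Hypothetical park-and-ride site: opportunities at 3 train stations, with
# travel times (minutes) observed at two 15-minute time slices.
opportunities = [500, 300, 800]
tt_0800 = [12.0, 25.0, 18.0]   # peak-hour travel times
tt_1400 = [8.0, 17.0, 11.0]    # off-peak travel times

print("accessibility 08:00:", round(gravity_accessibility(opportunities, tt_0800), 1))
print("accessibility 14:00:", round(gravity_accessibility(opportunities, tt_1400), 1))
```

Evaluating the same formula on each 15-minute slice is what makes the indicator "dynamic": congestion raises travel times and visibly depresses the score.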
Procedia PDF Downloads 389
19017 Historical Hashtags: An Investigation of the #CometLanding Tweets
Authors: Noor Farizah Ibrahim, Christopher Durugbo
Abstract:
This study aims to investigate how the Twittersphere reacted during the recent historical event of a robotic landing on a comet. The event concerned Philae, a robotic lander from the European Space Agency (ESA), which successfully made the first-ever rendezvous and touchdown of its kind on a comet nucleus on November 12, 2014. In order to understand how Twitter is used in practice to spread messages about historical events, we conducted an analysis of one week of tweet feeds containing the #CometLanding hashtag. We studied the trends of tweets, the diffusion of the information, and the characteristics of the social network created. The results indicated that the use of Twitter as a platform enables online communities to engage with and spread the historical event through the social media network (e.g., tweets, retweets, mentions, and replies). In addition, it was found that comprehensible and understandable hashtags could influence users to follow the same tweet stream, compared to more laborious hashtags which were difficult for users in online communities to understand.
Keywords: diffusion of information, hashtag, social media, Twitter
Procedia PDF Downloads 325
19016 Investigating the Effect of VR, Time Study and Ergonomics on the Design of Industrial Workstations
Authors: Aydin Azizi, Poorya Ghafoorpoor Yazdi
Abstract:
This paper presents a review of studies on ergonomics, virtual reality, and work measurement (time study) at industrial workstations, because each of these three individual techniques can be used to improve the design of workstations and task positions. The objective of this paper is to give an overall literature review examining whether there is any relation between these three different techniques; it is therefore important to review the scientific studies to find a better and more effective way of improving workstation design. Moreover, manufacturers have found that, instead of using one of these approaches alone, utilizing a combination of the individual techniques is more effective in reducing cost and production time.
Keywords: ergonomics, time study, virtual reality, workplace
Procedia PDF Downloads 119
19015 Time Bound Parallel Processing of a Disaster Management Alert System Using Random Selection of Target Audience: Bangladesh Context
Authors: Hasan Al Bashar Abul Ulayee, AKM Saifun Nabi, MD Mesbah-Ul-Awal
Abstract:
Alert systems for disaster management are common nowadays and can play a vital role in reducing devastation and saving lives and costs. An alert at the right time can save thousands of human lives, help people take shelter, and help manage other assets, including livestock; above all, a timely alert helps people prepare to face the situation and recover from it early. In a country like Bangladesh, where the population is more than 170 million and different types of natural calamities and disasters strike constantly, an early, timely alert is very effective, and implementing an alert system is challenging. The challenge comes from the time constraint of alerting such a huge population. Existing disaster-management pre-alert methods are traditional, sequential, and non-selective, so their efficiency is not good enough. This paper describes a way in which alerts can be provided to the maximum number of people within a short time bound, using parallel processing together with random selection of a selective target audience.
Keywords: alert system, Bangladesh, disaster management, parallel processing, SMS
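The proposed strategy — randomly select targets and dispatch alerts in parallel until the time bound is hit — can be sketched as below. The SMS call is mocked, and the batch size, deadline, and recipient numbers are hypothetical:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def send_sms(number: str) -> bool:
    """Mock SMS gateway call; a real system would call a provider API here."""
    time.sleep(0.01)                     # simulated network latency
    return True

def alert(population, deadline_s=1.0, batch=64):
    """Alert as many randomly selected recipients as possible before the
    deadline; random selection spreads coverage instead of walking a fixed
    sequential list region by region."""
    random.shuffle(population)           # random selection of target audience
    start, sent, i = time.monotonic(), 0, 0
    with ThreadPoolExecutor(max_workers=batch) as pool:
        while i < len(population) and time.monotonic() - start < deadline_s:
            chunk = population[i:i + batch]
            sent += sum(pool.map(send_sms, chunk))   # parallel batch dispatch
            i += batch
    return sent

numbers = [f"+880{i:010d}" for i in range(5000)]     # hypothetical recipients
print("alerted before deadline:", alert(numbers))
```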
Procedia PDF Downloads 470
19014 Traumatic Brain Injury Neurosurgical Care Continuum Delays in Mulago Hospital in Kampala Uganda
Authors: Silvia D. Vaca, Benjamin J. Kuo, Joao Ricardo Nickenig Vissoci, Catherine A. Staton, Linda W. Xu, Michael Muhumuza, Hussein Ssenyonjo, John Mukasa, Joel Kiryabwire, Henry E. Rice, Gerald A. Grant, Michael M. Haglund
Abstract:
Background: Patients with traumatic brain injury (TBI) can develop rapid neurological deterioration from swelling and intracranial hematomas, which can result in focal tissue ischemia, brain compression, and herniation. Moreover, delays in management increase the risk of secondary brain injury from hypoxemia and hypotension. Therefore, in TBI patients with subdural hematomas (SDHs) and epidural hematomas (EDHs), surgical intervention is both necessary and time-sensitive. Significant delays are seen along the care continuum in low- and middle-income countries (LMICs), largely due to limited healthcare capacity to address the disproportionate rates of TBI in Sub-Saharan Africa (SSA). While many LMICs have subsidized systems to offset surgical costs, the burden on patients of securing funds for medications, supplies, and CT diagnostics poses a significant challenge to timely surgical interventions. In Kampala, Uganda, the challenge of obtaining timely CT scans is twofold: logistical and financial barriers. These bottlenecks contribute significantly to care continuum delays and are associated with poor TBI outcomes. Objective: The objectives of this study are to 1) describe the temporal delays through a modified three delays model that fits the context of neurosurgical interventions for TBI patients in Kampala and 2) investigate the association between delays and mortality. Methods: Prospective data were collected for 563 TBI patients presenting to a tertiary hospital in Kampala from 1 June – 30 November 2016. Four time intervals were constructed along five time points: injury, hospital arrival, neurosurgical evaluation, CT results, and definitive surgery. Time interval differences among mild, moderate, and severe TBI and their association with mortality were analyzed. Results: The mortality rate of all TBI patients presenting to MNRH was 9.6%, which ranged from 4.7% for mild and moderate TBI patients receiving surgery to 81.8% for severe TBI patients who failed to receive surgery. The duration from injury to surgery varied considerably across TBI severity, with the largest gap seen between mild TBI (174 hours) and severe TBI (69 hours) patients. Further analysis revealed care continuum differences for interval 3 (neurosurgical evaluation to CT result) and interval 4 (CT result to surgery) between severe TBI patients (7 hours for interval 3 and 24 hours for interval 4) and mild TBI patients (19 hours for interval 3 and 96 hours for interval 4). These post-arrival delays were associated with mortality for mild (p=0.05) and moderate TBI (p=0.03) patients. Conclusions: To our knowledge, this is the first analysis using a modified 'three delays' framework to analyze the care continuum of TBI patients in Uganda from injury to surgery. We found significant associations between delays and mortality for mild and moderate TBI patients. As it currently stands, poorer outcomes were observed for mild and moderate TBI patients who were managed non-operatively or failed to receive surgery while surgical services were shunted to more severely ill patients. Although this triage is well intentioned, high mortality rates were still observed for the severe TBI patients managed surgically. These results suggest the need for future research to optimize triage practices, understand delay contributors, and improve pre-hospital logistical referral systems.
Keywords: care continuum, global neurosurgery, Kampala Uganda, LMIC, Mulago, traumatic brain injury
Procedia PDF Downloads 220
19013 H2 Production and Treatment of Cake Wastewater Industry via Up-Flow Anaerobic Staged Reactor
Authors: Manal A. Mohsen, Ahmed Tawfik
Abstract:
Hydrogen production from cake wastewater by anaerobic dark fermentation via an upflow anaerobic staged reactor (UASR) was investigated in this study. The reactor was continuously operated for four months at a constant hydraulic retention time (HRT) of 21.57 hr, a pH value of 6 ± 0.6, a temperature of 21.1°C, and an organic loading rate of 2.43 gCOD/l.d. The hydrogen production was 5.7 l H2/d, and the hydrogen yield was 134.8 ml H2/g COD removed. The system showed overall removal efficiencies for TCOD, TBOD, TSS, TKN, and carbohydrates of 40 ± 13%, 59 ± 18%, 84 ± 17%, 28 ± 27%, and 85 ± 15%, respectively, during the long-term operation period. Based on the available results, the system is not sufficient for the effective treatment of cake wastewater, and the effluent quality of the UASR does not comply with the requirements for discharge into the sewerage network; therefore, post-treatment is needed (not covered in this study).
Keywords: cake wastewater industry, chemical oxygen demand (COD), hydrogen production, up-flow anaerobic staged reactor (UASR)
Procedia PDF Downloads 380
19012 Game-Theory-Based Downlink Spectrum Allocation in Two-Tier Networks
Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang
Abstract:
The capacity of conventional cellular networks has reached its upper bound, and this can be handled well by introducing femtocells, which are low-cost and easy to deploy. The spectrum interference issue becomes more critical as value-added multimedia services grow in two-tier cellular networks. Spectrum allocation is one of the effective methods in interference mitigation technology. This paper proposes a game-theory-based OFDMA downlink spectrum allocation aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are players and the available frequency channels are strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects interference from the standpoint of channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function on a distance weight ratio, aiming at suppressing co-channel interference within the same network layer. This scenario is more suitable for actual network deployment, and the system possesses high robustness. According to the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal-to-interference-plus-noise ratio can be obviously improved through the spectrum allocation scheme and that users' downlink quality of service can be satisfied. Besides, the average spectrum efficiency in the cellular network can be significantly promoted, as the simulation results show.
Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation
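A sketch of the kind of negative-logarithm interference utility and distributed best-response channel selection described above; the exact functional form and parameters are not given in the abstract, so these are illustrative stand-ins:

```python
import math
import random

random.seed(0)
AREA = 100.0    # hypothetical deployment area side length (m)
D_REF = 150.0   # normalization distance, larger than any pairwise distance

def utility(channel, me, positions, assignment):
    """Illustrative utility: only co-channel stations interfere, accumulated
    via a negative-log function of a distance ratio (closer -> larger penalty)."""
    x0, y0 = positions[me]
    interference = 0.0
    for j, (x, y) in positions.items():
        if j != me and assignment[j] == channel:
            d = math.hypot(x - x0, y - y0)
            interference += -math.log(d / D_REF)
    return -interference

positions = {i: (random.uniform(0, AREA), random.uniform(0, AREA)) for i in range(8)}
channels = [0, 1, 2]
assignment = {i: random.choice(channels) for i in positions}

# Distributed best-response dynamics: each femto base station repeatedly
# switches to its best channel given the others' current choices.
for _ in range(20):
    for i in positions:
        assignment[i] = max(channels,
                            key=lambda c: utility(c, i, positions, assignment))
print("channel assignment:", assignment)
```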
Procedia PDF Downloads 156
19011 Continuous-Time and Discrete-Time Singular Value Decomposition of an Impulse Response Function
Authors: Rogelio Luck, Yucheng Liu
Abstract:
This paper proposes the continuous-time singular value decomposition (SVD) for the impulse response function, a special kind of Green's function e^(−(t−τ)), in order to find a set of singular functions and singular values such that the convolutions of this function with the set of singular functions on a specified domain are the solutions to the inhomogeneous differential equations for those singular functions. A numerical example is given to verify the proposed method. Besides the continuous-time SVD, a discrete-time SVD is also presented for the impulse response function, which is modeled using a Toeplitz matrix in the discrete system. The proposed method has broad applications in signal processing, dynamic system analysis, acoustic analysis, and thermal analysis, as well as macroeconomic modeling.
Keywords: singular value decomposition, impulse response function, Green’s function, Toeplitz matrix, Hankel matrix
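The discrete-time construction is easy to reproduce: model convolution with the causal kernel e^(−(t−τ)) as a lower-triangular Toeplitz matrix and take its SVD. A minimal sketch with an arbitrary grid, not the paper's numerical example:

```python
import numpy as np
from scipy.linalg import toeplitz

# Discretize the causal kernel h(t - tau) = exp(-(t - tau)) on [0, T].
n, T = 200, 5.0
t = np.linspace(0.0, T, n)
dt = t[1] - t[0]

# Convolution with a causal kernel is a lower-triangular Toeplitz matrix:
# first column = h(t), first row = [h(0), 0, ..., 0]; dt approximates the integral.
H = toeplitz(np.exp(-t), np.r_[1.0, np.zeros(n - 1)]) * dt

U, s, Vt = np.linalg.svd(H)
# Rows of Vt are the discrete singular functions; convolving the kernel with
# the k-th one reproduces s[k] times the k-th left singular function.
k = 0
print(np.allclose(H @ Vt[k], s[k] * U[:, k]))   # True, by the SVD identity
```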
Procedia PDF Downloads 156
19010 Maximizing Profit Using Optimal Control by Exploiting the Flexibility in Thermal Power Plants
Authors: Daud Mustafa Minhas, Raja Rehan Khalid, Georg Frey
Abstract:
Next-generation power systems are equipped with abundantly available, free renewable energy resources (RES). During their low-cost operation, the price of electricity drops significantly and sometimes even becomes negative. In such periods it would seem advisable not to operate traditional power plants (e.g., coal power plants) in order to reduce losses. In fact, this is not a cost-effective solution, because these power plants incur shutdown and startup costs. Moreover, they require a certain time to shut down and also need a sufficient pause before starting up again, increasing inefficiency in the whole power network. Hence, there is always a trade-off between avoiding negative electricity prices and the startup costs of power plants. To exploit this trade-off and increase the profit of a power plant, two main contributions are made: 1) introducing retrofit technology for a state-of-the-art coal power plant; 2) proposing an optimal control strategy for a power plant that exploits different flexibility features. These flexibility features include improving the ramp rate of the power plant, reducing the startup time, and lowering the minimum load. The control strategy is solved as a mixed integer linear program (MILP), ensuring an optimal solution to the profit maximization problem. Extensive comparisons are made between pre- and post-retrofit coal power plants with the same efficiencies under different electricity price scenarios. The study concludes that if the power plant must remain in the market (providing services), more flexibility translates into a direct economic advantage for the plant operator.
Keywords: discrete optimization, power plant flexibility, profit maximization, unit commitment model
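The trade-off between riding through negative prices at minimum load and paying shutdown/startup costs can be illustrated with a toy commitment model; the prices and plant parameters are hypothetical, and exhaustive enumeration stands in for the authors' MILP:

```python
from itertools import product

prices = [40, 35, -5, -10, -8, 30, 55, 60]   # EUR/MWh over 8 hours (hypothetical)
P_MAX, P_MIN = 500, 150                      # MW output range
VAR_COST = 25                                # EUR/MWh fuel cost (hypothetical)
STARTUP = 20_000                             # EUR per cold start (hypothetical)

def profit(schedule):
    """Profit of an on/off schedule: run at P_MAX when the price covers fuel
    cost, otherwise at P_MIN (the flexibility floor); pay for each startup."""
    total, prev_on = 0.0, True               # plant assumed on at t = 0
    for on, p in zip(schedule, prices):
        if on:
            mw = P_MAX if p >= VAR_COST else P_MIN
            total += (p - VAR_COST) * mw
            if not prev_on:
                total -= STARTUP
        prev_on = on
    return total

best = max(product([0, 1], repeat=len(prices)), key=profit)
print("best schedule:", best, "profit:", profit(best))
```

With these numbers the plant stays on through the negative-price hours at minimum load, because the accumulated losses are smaller than one startup cost; a lower minimum load (one of the retrofit flexibility features) makes that choice even cheaper.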
Procedia PDF Downloads 143
19009 A Learning Automata Based Clustering Approach for Underwater Sensor Networks to Reduce Energy Consumption
Authors: Motahareh Fadaei
Abstract:
Wireless sensor networks that are used to monitor a special environment are formed from a large number of sensor nodes. The role of these sensors is to sense specific parameters from the ambient environment and to communicate them. In these networks, the most important challenge is the management of energy usage. Clustering is one of the methods broadly used to face this challenge. In this paper, a distributed clustering protocol based on learning automata is proposed for underwater wireless sensor networks. The proposed algorithm, called LA-Clustering, forms clusters at the same energy level, based on the energy level of nodes and the connection radius, regardless of the size and structure of the sensor network. The proposed approach is simulated and compared with some other protocols with respect to metrics such as network lifetime, number of alive nodes, and amount of transmitted data. The simulation results demonstrate the efficiency of the proposed approach.
Keywords: clustering, energy consumption, learning automata, underwater sensor networks
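The learning-automaton element behind a protocol like LA-Clustering can be sketched as a linear reward-inaction update over cluster-head choices. The automaton mechanics below are the standard L_RI scheme, while the reward condition and node names are hypothetical:

```python
import random

class LearningAutomaton:
    """Linear reward-inaction (L_RI) automaton: reinforce an action's
    probability when the environment rewards it, leave it unchanged otherwise."""

    def __init__(self, actions, a=0.1):
        self.actions = actions
        self.p = [1.0 / len(actions)] * len(actions)
        self.a = a                               # learning rate (hypothetical)

    def choose(self):
        return random.choices(range(len(self.actions)), weights=self.p)[0]

    def reward(self, i):
        # Move probability mass toward the rewarded action; rows still sum to 1.
        self.p = [pj + self.a * (1 - pj) if j == i else pj * (1 - self.a)
                  for j, pj in enumerate(self.p)]

random.seed(0)
# Each sensor node's automaton picks a candidate cluster head; reward the
# choice when the head's energy level matches the node's (hypothetical rule).
heads = ["node-3", "node-7", "node-9"]
la = LearningAutomaton(heads)
for _ in range(200):
    i = la.choose()
    if heads[i] == "node-7":                     # hypothetical best head
        la.reward(i)
print({h: round(p, 2) for h, p in zip(heads, la.p)})
```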
Procedia PDF Downloads 314
19008 Numerical Simulation of Plasma Actuator Using OpenFOAM
Authors: H. Yazdani, K. Ghorbanian
Abstract:
This paper deals with the modeling and simulation of a plasma actuator with OpenFOAM. The plasma actuator is one of the newest devices in flow control techniques; it can delay separation by inducing external momentum into the boundary layer of the flow. The effects of plasma actuators on the external flow are incorporated into the Navier-Stokes computations as a body force vector, obtained as the product of the net charge density and the electric field. In order to compute this body force vector, the model solves two equations: one for the electric field due to the applied AC voltage at the electrodes, and one for the charge density representing the ionized air. The simulation results are compared to experimental and typical values, which confirms the validity of the modeling.
Keywords: active flow control, flow-field, OpenFOAM, plasma actuator
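The two-equation body-force model described here is commonly written in the following (Suzen-Huang-type) form, stated for clarity rather than quoted from the paper: φ is the electric potential from the applied AC voltage, ρc the net charge density, λd the Debye length, and the body force f = ρc E enters the Navier-Stokes momentum equation.

```latex
\nabla \cdot (\varepsilon \nabla \varphi) = 0,
\qquad
\nabla \cdot (\varepsilon \nabla \rho_c) = \frac{\rho_c}{\lambda_d^2},
\qquad
\vec{f} = \rho_c \vec{E} = -\rho_c \nabla \varphi.
```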
Procedia PDF Downloads 306
19007 Inferential Reasoning for Heterogeneous Multi-Agent Mission
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
We describe issues bedeviling the coordination of heterogeneous multi-agent missions (agents carrying different sensors), such as belief conflict and situation reasoning. We applied Bayesian inference and inferential reasoning over agents' presumptions to address the belief variation and situation-based reasoning of heterogeneous multi-agent systems. A Bayesian Belief Network (BBN) was used to model the agents' belief conflicts due to sensor variations. Simulation experiments were designed, and cases from agents' missions were used to train the BBN using gradient descent and expectation-maximization algorithms. The output network is a well-trained BBN for making inferences for both agents and human experts. We claim that the prediction capacity of the Bayesian learning algorithm improves with the amount of training data, and we argue that it enhances multi-agent robustness and resolves agents' sensor conflicts.
Keywords: distributed constraint optimization problem, multi-agent system, multi-robot coordination, autonomous system, swarm intelligence
Procedia PDF Downloads 154
19006 The Modelling of Real Time Series Data
Authors: Valeria Bondarenko
Abstract:
We propose algorithms for the estimation of the parameters of fractional Brownian motion (fBm) (volatility and Hurst exponent) and for the approximation of random time series by functionals of fBm. We prove the consistency of the estimators that constitute these algorithms and prove the optimality of the forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is demonstrated by numerical experiment. In the process of creating the software, a system with a hierarchical structure was built. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological, and other processes.
Keywords: mathematical model, random process, Wiener process, fractional Brownian motion
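One standard estimator for the Hurst exponent mentioned above is the aggregated-variance method, which fits the scaling Var(block means of size m) ∝ m^(2H−2). This sketch is a generic illustration on synthetic data, not the authors' estimator:

```python
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance Hurst estimate: for self-similar increments,
    Var(mean of blocks of size m) ~ m^(2H-2); fit the log-log slope."""
    x = np.asarray(x, dtype=float)
    logs_m, logs_v = [], []
    for m in scales:
        blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(blocks.var()))
    slope = np.polyfit(logs_m, logs_v, 1)[0]   # slope = 2H - 2
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
white = rng.standard_normal(100_000)           # iid increments -> H ~ 0.5
print("H estimate:", round(hurst_aggvar(white), 2))
```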
Procedia PDF Downloads 358
19005 Real-Time Aerial Marine Surveillance System for Safe Navigation
Authors: Vinesh Thiruchelvam, Umar Mumtaz Chowdry, Sathish Kumar Selvaperumal
Abstract:
The prime purpose of the project is to provide a sophisticated surveillance system specialized for port authorities in the maritime industry. Current aerial surveillance does not offer a wide field of view, and the communication channels are shared rather than exclusive, allowing for communication errors or disturbance, mainly due to traffic. The scope is to analyze the various aspects of real-time aerial and marine surveillance, which is one of the most important methods for ensuring the domain security of sailors. The system will improve the real-time data obtained at the controller base station. The key implementation will be based on camera speed, angle, and adherence to a sustainable power utilization module.
Keywords: SMS, real time, GUI, maritime industry
Procedia PDF Downloads 498
19004 A Study for the Effect of Fire Initiated Location on Evacuation Success Rate
Authors: Jin A Ryu, Hee Sun Kim
Abstract:
As the number of fire accidents gradually rises, many studies on evacuation have been reported. Previous studies have mostly focused on evaluating the safety of evacuation and the risk of fire in particular buildings. However, studies on the effects of various parameters on evacuation have rarely been done. Therefore, this paper aims at observing evacuation time under the effect of the fire initiation location. In this study, evacuation simulations are performed on a 5-floor building located in Seoul, South Korea, using the commercial program Fire Dynamics Simulator with Evacuation (FDS+EVAC). Only the fourth and fifth floors are modeled, with the assumption that the fire starts in a room located on the fourth floor. The parameter varied in the evacuation simulations is the location of fire initiation, in order to observe evacuation time and safety. Results show that the closer the fire initiation location is to the exit, the more time is taken to evacuate. The case with the fire initiation location nearest to the exit has the lowest ratio of successfully evacuated occupants to total occupants. In addition, for safety evaluation, the evacuation time calculated from the computer simulation model is compared with the tolerable evacuation time according to the code in Japan. As a result, all cases are completed within the tolerable evacuation time. This study allows evacuation time to be predicted under various fire conditions and can be used to evaluate evacuation appropriateness and the fire safety of buildings.
Keywords: fire simulation, evacuation simulation, temperature, evacuation safety
Procedia PDF Downloads 349
19003 The Benefits of Security Culture for Improving Physical Protection Systems at Detection and Radiation Measurement Laboratory
Authors: Ari S. Prabowo, Nia Febriyanti, Haryono B. Santosa
Abstract:
The security function known as the Physical Protection System (PPS) serves to detect, delay, and respond. The Physical Protection System of the Detection and Radiation Measurement Laboratory needs to be improved continually using internal resources, and the nuclear security culture offers some potential to support this. The study starts by identifying the weaknesses of the security functions and the strengths of the security culture. Secondly, the strengths of the security culture are implemented in the laboratory management. Finally, a simulation was done to measure its effectiveness. Some changes occurred in laboratory personnel behaviors and procedures; all became more prudent. The results showed a positive influence of nuclear security culture on laboratory security functions.
Keywords: laboratory, physical protection system, security culture, security function
Procedia PDF Downloads 185
19002 Performance Evaluation of a Very High-Resolution Satellite Telescope
Authors: Walid A. Attia, Taher M. Bazan, Fawzy Eltohamy, Mahmoud Fathy
Abstract:
System performance evaluation is an essential stage in the design of high-resolution satellite telescopes prior to the development process. In this paper, a system performance evaluation of a very high-resolution satellite telescope is investigated. The evaluated system has a Korsch optical design, which has been discussed in another paper with respect to the three-mirror anastigmat (TMA) design; the former configuration showed better results. The investigated system is based on the Korsch optical design integrated with a time-delay and integration charge-coupled device (TDI-CCD) sensor to achieve a ground sampling distance (GSD) of 25 cm. The key performance metrics considered are the spatial resolution, the signal-to-noise ratio (SNR), and the total modulation transfer function (MTF) of the system. In addition, the National Image Interpretability Rating Scale (NIIRS) metric is assessed to predict the image quality according to the modified general image quality equation (GIQE). Based on the orbital, optical, and detector parameters, the estimated GSD is found to be 25 cm. The SNR has been analyzed at different illumination conditions of target albedos, sun angles, and sensor angles. The system MTF has been computed including diffraction, aberration, optical manufacturing, smear, and detector sampling as the main contributors to the evaluated MTF. Finally, the system performance evaluation results show that the computed MTF value is around 0.08 at the Nyquist frequency, the SNR is 130 at an albedo of 0.2 with nadir viewing angles, and the predicted NIIRS is on the order of 6.5, which implies a very good system image quality.
Keywords: modulation transfer function, national image interpretability rating scale, signal to noise ratio, satellite telescope performance evaluation
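For reference, the 25 cm GSD follows from the standard projection relation between detector pixel pitch p, orbital altitude H, and effective focal length f; the numerical values below are illustrative, since the paper's actual orbital and detector parameters are not restated here:

```latex
\mathrm{GSD} \;=\; \frac{p\,H}{f},
\qquad\text{e.g.}\qquad
\frac{(7\,\mu\mathrm{m})(500\,\mathrm{km})}{14\,\mathrm{m}} \;=\; 0.25\ \mathrm{m}.
```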
Procedia PDF Downloads 384
19001 Uncertain Time-Cost Trade off Problems of Construction Projects Using Fuzzy Set Theory
Authors: V. S. S. Kumar, B. Vikram
Abstract:
The development of effective decision support tools that can be adopted in the construction industry is vital in the world we live in today, since it can lead to substantial cost reduction and efficient resource consumption. Solving time-cost trade-off problems and their related variants is at the heart of scientific research on optimizing construction planning problems. In general, classical optimization techniques have difficulties in dealing with TCT problems; one of the main reasons for their failure is that they can easily become trapped in local minima. This paper presents an investigation of the application of meta-heuristic techniques to two particular variants of the time-cost trade-off analysis: the time-cost trade-off problem (TCT) and the time-cost trade-off optimization problem (TCO). In the first problem, the total project cost should be minimized; in the second problem, the total project cost and the total project duration should be minimized simultaneously. It is expected that the optimization models developed in this paper will contribute significantly to the efficient planning and management of construction projects.
Keywords: fuzzy sets, uncertainty, optimization, time cost trade off problems
Procedia PDF Downloads 356
19000 Cellular Architecture of Future Wireless Communication Networks
Authors: Mohammad Yahaghifar
Abstract:
Nowadays, wireless system designers face a continuously increasing demand for the high data rates and mobility required by new wireless applications. Future-generation cellular wireless networks are envisioned to overcome the fundamental challenges of existing cellular networks, for example, by offering higher data rates, excellent end-to-end performance, and user coverage in hot spots and crowded areas, with lower latency, energy consumption, and cost per information transfer. In this paper we propose a potential cellular architecture that separates indoor and outdoor scenarios, discuss various promising technologies for future wireless communication systems, such as massive MIMO, energy-efficient communications, cognitive radio networks, and visible light communications, and discuss 5G, the next generation of wireless networks.
Keywords: future challenges in networks, cellular architecture, visible light communication, 5G wireless technologies, spatial modulation, massive MIMO, cognitive radio network, green communications
Procedia PDF Downloads 488
18999 A Survey on the Supervision Experience of Full-Time Intern Counseling Psychologist
Authors: Szu-Fan Chen, Cheng-Tseng Lin, Ting-Chia Lien
Abstract:
This study focuses mainly on understanding the current supervision experience of full-time intern counseling psychologists in Taiwan. The study took 197 full-time intern counseling psychologists as research subjects, including 146 women (74%) and 51 men (26%). In terms of internship sites, the largest number of internships are at school sites (59%), followed by community sites (30%), with fewer in medical or corporate settings (only 11%). In addition, the subjects were surveyed on whether they had held full-time jobs before the full-time internship: 42% had not worked full-time and 48% had; among those with full-time work experience, 28% had been engaged in work related to psychological counseling and 20% in work unrelated to it. In this sample, each person interviewed at 2.68 internship institutions on average, and the current internship unit was, on average, the 2.29th institution interviewed at. All (100%) full-time intern psychologists had entered into individual internship contracts with their internship institutions. As for professional supervisors, 178 (90%) were appointed from personnel internal to the institution, and 19 (10%) were hired from outside the institution. Regarding the form of supervision, it is mostly conducted through individual supervision (98%), and up to 60% is conducted through discussion of written/oral case reports. In terms of supervision satisfaction, 47% were very satisfied, 28% were satisfied, 18% were neutral, and 6% were dissatisfied. It is worth noting that 30% of full-time intern counseling psychologists reported being under pressure in receiving supervision. It is recommended that the internship system standardize the qualification review and evaluation of internship institutions to facilitate institutional oversight. Furthermore, the personal difficulties of full-time intern psychologists need to be discussed with the internship institution and supervisor from time to time, to jointly assist them in completing their professional training stably and successfully. Finally, it is recommended that future researchers use the interview method described here to deepen understanding of the supervision experience of full-time intern counseling psychologists, so that such studies can provide concrete and feasible suggestions for counseling practitioners and future researchers.
Keywords: full-time intern counseling psychologist, supervision experience, full-time internship, supervision
Procedia PDF Downloads 21