Search results for: Just in Time
16952 Automating 2D CAD to 3D Model Generation Process: Wall pop-ups
Authors: Mohit Gupta, Chialing Wei, Thomas Czerniawski
Abstract:
In this paper, we build a neural network that detects walls on 2D sheets and subsequently creates a 3D model in Revit using Dynamo. The training set comprises 3500 labeled images, and the detection algorithm used is YOLO. Typically, engineers and designers expend considerable time and effort converting 2D CAD drawings into 3D models. This paper contributes to automating the task of 3D wall modeling by: 1. detecting walls in 2D CAD and generating 3D pop-ups in Revit; 2. saving designers the modeling time spent drafting elements like walls from 2D CAD into a 3D representation. The YOLO object detection algorithm is used for wall detection and localization, with the network trained on 3500 labeled images of size 256x256x3. Dynamo is then interfaced with the output of the neural network to pop up 3D walls in Revit. The research applies deep learning to automate the generation of 3D walls without requiring humans to model them manually, thereby saving time, human effort, and money.
Keywords: neural networks, YOLO, 2D to 3D transformation, CAD object detection
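The hand-off step of the pipeline can be sketched as a post-processing function: detected wall bounding boxes in pixel space are converted into 3D wall parameters for a downstream modeller such as Dynamo/Revit. This is an illustrative sketch, not the authors' code; the box format, pixel scale, and wall height are assumed values.

```python
# Sketch (not the authors' code): convert YOLO-style wall detections on a
# 2D CAD sheet into 3D wall parameters for a downstream modeller.
# Box format (x, y, w, h), px_per_metre and wall_height_m are assumptions.

def boxes_to_walls(boxes, px_per_metre=100.0, wall_height_m=3.0):
    """Map pixel-space bounding boxes (x, y, w, h) to 3D wall segments.

    A wall detection is treated as a thin elongated box; the long side
    gives the wall axis, the short side its thickness.
    """
    walls = []
    for (x, y, w, h) in boxes:
        if w >= h:  # horizontal wall
            start = (x / px_per_metre, (y + h / 2) / px_per_metre)
            end = ((x + w) / px_per_metre, (y + h / 2) / px_per_metre)
            thickness = h / px_per_metre
        else:       # vertical wall
            start = ((x + w / 2) / px_per_metre, y / px_per_metre)
            end = ((x + w / 2) / px_per_metre, (y + h) / px_per_metre)
            thickness = w / px_per_metre
        walls.append({"start": start, "end": end,
                      "thickness": thickness, "height": wall_height_m})
    return walls

walls = boxes_to_walls([(0, 0, 400, 20), (0, 0, 20, 300)])
```

In the real pipeline, a Dynamo script would consume such a list and call Revit's wall-creation API with each segment.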
Procedia PDF Downloads 144
16951 Identification of Classes of Bilinear Time Series Models
Authors: Anthony Usoro
Abstract:
In this paper, two classes of bilinear time series models are obtained under certain conditions from the general bilinear autoregressive moving average model: the Bilinear Autoregressive (BAR) and Bilinear Moving Average (BMA) models. From the general bilinear model, the BAR and BMA models are proved to exist for q = Q = 0 (hence j = 0) and p = P = 0 (hence i = 0), respectively. These models are found useful in modelling much economic and financial data.
Keywords: autoregressive model, bilinear autoregressive model, bilinear moving average model, moving average model
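A member of the BAR class can be made concrete with a short simulation. The sketch below (coefficients are illustrative, not from the paper) iterates the simple bilinear recursion X_t = a·X_{t-1} + b·X_{t-1}·e_{t-1} + e_t, i.e. the p = P = 1 case obtained when the moving-average orders vanish (q = Q = 0).

```python
import random

# Illustrative simulation (not from the paper) of a simple bilinear model
# X_t = a*X_{t-1} + b*X_{t-1}*e_{t-1} + e_t, a BAR-class member obtained
# when the moving-average orders vanish (q = Q = 0).

def simulate_bar(n, a=0.4, b=0.1, seed=42):
    rng = random.Random(seed)
    x, e_prev = 0.0, 0.0
    series = []
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)          # white-noise innovation
        x = a * x + b * x * e_prev + e   # bilinear term couples x and e
        series.append(x)
        e_prev = e
    return series

series = simulate_bar(200)
```

The bilinear product term x·e is what distinguishes this from a plain AR(1); with small |a| and |b| the process stays stable while exhibiting the nonlinearity such models capture in financial data.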
Procedia PDF Downloads 407
16950 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed, even at a small angle. Once a document has been digitized through the scanning system and binarized, skew correction is required before further image analysis. Considerable research effort has gone into this area, and algorithms have been developed to eliminate document skew. Skew correction algorithms can be compared on several performance criteria, the most important being accuracy of skew angle detection, range of detectable skew angles, processing speed, computational complexity, and consequently memory usage. The standard Hough Transform has been successfully applied to text document skew angle estimation. However, its accuracy depends largely on how fine the angular step size is; higher accuracy therefore costs more time and memory, especially when the number of pixels is considerably large. Whenever the Hough transform is used, there is a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes both space and time to robustly detect document skew is presented. The modified algorithm resolves the contradiction between memory space, running time, and accuracy. It starts by estimating the angle to zero decimal places using the standard Hough Transform, achieving minimal running time and space but limited accuracy.
Then, to increase accuracy, if the angle estimated by the basic Hough algorithm is x degrees, the basic algorithm is run again over a range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The skew estimation and correction algorithm for text images is implemented in MATLAB. Memory usage and processing time are tabulated under the assumption that the skew angle lies between 0° and 45°. The MATLAB simulation results show the high performance of our algorithm, with less computational time and memory used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: Hough transform, skew detection, skew angle, skew correction, text document
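The coarse-to-fine refinement described above can be sketched generically. This is not the paper's MATLAB code: the Hough accumulator is replaced by a stand-in score function peaked at a hypothetical true skew angle; in practice, score() would run a Hough transform at the given angular resolution and return the peak accumulator response.

```python
# Sketch of the coarse-to-fine angle search described above. The Hough
# accumulator is replaced by a stand-in score function; true_angle is an
# assumed ground truth for demonstration only.

def coarse_to_fine(score, lo=0.0, hi=45.0, decimals=3):
    step = 1.0
    # Pass 1: whole-degree estimate over the full 0-45 degree range
    best = max((lo + i * step for i in range(int((hi - lo) / step) + 1)),
               key=score)
    for _ in range(decimals):
        step /= 10.0                            # refine one decimal place
        lo2 = best - 10 * step                  # search +/- one old step
        cands = [lo2 + i * step for i in range(21)]
        best = max(cands, key=score)
    return round(best, decimals)

true_angle = 12.347                             # assumed ground truth
estimate = coarse_to_fine(lambda a: -abs(a - true_angle))
```

Each refinement pass evaluates only 21 candidate angles instead of a full fine-grained sweep, which is the source of the time and memory savings the paper claims.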
Procedia PDF Downloads 159
16949 Optimization of Reaction Parameters' Influences on Production of Bio-Oil from Fast Pyrolysis of Oil Palm Empty Fruit Bunch Biomass in a Fluidized Bed Reactor
Authors: Chayanoot Sangwichien, Taweesak Reungpeerakul, Kyaw Thu
Abstract:
Oil palm mills in Southern Thailand produce a large amount of solid biomass waste. Lignocellulosic biomass is a main feedstock for the production of biofuel, which can be blended with or used as an alternative to fossil fuels. Biomass is composed of three main constituents: cellulose, hemicellulose, and lignin. Thermochemical conversion processes are applied to produce biofuel from biomass, and pyrolysis is the principal thermochemical route for converting biomass into pyrolytic products (bio-oil, gas, and char). Operating parameters play an important role in optimizing the product yields from fast pyrolysis of biomass. The present work concerns the modeling of reaction kinetics for fast pyrolysis of empty fruit bunch (EFB) in a fluidized bed reactor. A global kinetic model is used to predict the product yields. Product yields of EFB pyrolysis are mainly affected by the reaction temperature and the vapor residence time, considered here in the range of 450-500 °C and around 2 s, respectively. The optimum simulated bio-oil yield of 53 wt.% was obtained at a reaction temperature and vapor residence time of 450 °C and 2 s, and of 500 °C and 1 s, respectively. The simulated data are in good agreement with reported experimental data and can be applied to guide experimental work on the fast pyrolysis of biomass.
Keywords: kinetics, empty fruit bunch, fast pyrolysis, modeling
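A global kinetic model of the kind mentioned above can be sketched as a set of competing first-order Arrhenius reactions integrated over the residence time. The rate constants and activation energies below are assumed illustrative values, not the paper's fitted parameters.

```python
import math

# Illustrative global kinetic scheme (rate constants are assumed, not the
# paper's fitted values): biomass decomposes via three competing
# first-order reactions into bio-oil, gas and char, with
# k_i = A_i * exp(-E_i / (R*T)), integrated by explicit Euler.

R = 8.314  # gas constant, J/(mol K)

def pyrolysis_yields(T_kelvin, t_end=2.0, dt=1e-3,
                     A=(1.0e5, 2.0e4, 5.0e3),      # pre-exponentials, 1/s
                     E=(8.0e4, 8.0e4, 8.0e4)):     # activation energies, J/mol
    k = [a * math.exp(-e / (R * T_kelvin)) for a, e in zip(A, E)]
    biomass, oil, gas, char = 1.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        d = biomass * dt
        oil += k[0] * d
        gas += k[1] * d
        char += k[2] * d
        biomass -= (k[0] + k[1] + k[2]) * d        # mass is conserved
    return {"oil": oil, "gas": gas, "char": char, "biomass": biomass}

y = pyrolysis_yields(450 + 273.15)  # 450 degC, 2 s residence time
```

With fitted constants, sweeping T_kelvin and t_end over the 450-500 °C and 1-2 s ranges would reproduce the kind of yield-optimization study reported in the abstract.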
Procedia PDF Downloads 214
16948 Coupling Time-Domain Analysis for Dynamic Positioning during S-Lay Installation
Authors: Sun Li-Ping, Zhu Jian-Xun, Liu Sheng-Nan
Abstract:
In order to study the performance of a dynamic positioning system during S-lay operations, the system is simulated with the hull-stinger-pipe coupling effect. The stinger rollers are simulated by generalized elastic contact theory, the stinger is composed of Morison members, and the force on the pipe is calculated by the lumped mass method. A time-domain analysis of the fully coupled barge model is performed, combining a PID controller, a Kalman filter, and thrust allocation by Sequential Quadratic Programming. The effect of hull wave-frequency motion on the pipe-stinger coupling force and on the dynamic positioning system is analyzed, as is the effect of S-lay operations on dynamic positioning accuracy. The simulation results are shown to be valid by checking the pipe stress against the API criterion. The effect of heave and yaw motion on the hull-stinger-pipe coupling force and the dynamic positioning system cannot be ignored. It is important to reduce the barge's pitch motion and to lay pipe in head seas in order to improve the safety of the S-lay installation and of dynamic positioning.
Keywords: S-lay operation, dynamic positioning, coupling motion, time domain, allocation of thrust
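The PID station-keeping loop at the heart of such a simulation can be sketched in one degree of freedom. This is a heavily simplified illustration, not the paper's coupled model: the barge is a point mass in surge, the environment a constant drift force, and the mass and gains are assumed values.

```python
# Minimal 1-DOF sketch (illustrative mass, gains and drift force) of the
# PID station-keeping loop used in dynamic positioning: the controller
# drives a point-mass "barge" back to the setpoint against a constant
# environmental drift force.

def simulate_dp(setpoint=0.0, drift=-5.0e3, mass=1.0e6,
                kp=2.0e5, ki=1.0e4, kd=1.0e6, dt=0.5, steps=2000):
    x, v, integ, prev_err = 10.0, 0.0, 0.0, None   # 10 m initial offset
    for _ in range(steps):
        err = setpoint - x
        integ += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        thrust = kp * err + ki * integ + kd * deriv  # PID control law
        prev_err = err
        a = (thrust + drift) / mass                  # Newton's second law
        v += a * dt                                  # semi-implicit Euler
        x += v * dt
    return x

final_pos = simulate_dp()
```

In the paper's setting this loop sits behind a Kalman filter (to strip wave-frequency motion from the feedback) and an SQP thrust allocator that maps the commanded force onto individual thrusters.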
Procedia PDF Downloads 465
16947 Supply Air Pressure Control of HVAC System Using MPC Controller
Authors: P. Javid, A. Aeenmehr, J. Taghavifar
Abstract:
In this paper, the supply air pressure of an HVAC system is modeled with a second-order transfer function plus dead time. In an HVAC system, the desired input undergoes step changes, and the output of the proposed control system should be able to follow the input reference; model-based predictive control is therefore pursued and designed in this paper. The closed-loop control system is implemented in MATLAB, and simulation results are provided. They show that the model-based predictive controller is able to control the plant properly.
Keywords: air conditioning system, GPC, dead time, air supply control
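The plant side of such a design can be sketched as a discrete simulation of a second-order-plus-dead-time model: two cascaded first-order lags plus a transport-delay buffer. The gain, time constants, and delay below are assumed values, not identified from a real duct; a GPC/MPC controller would be designed against exactly this model.

```python
from collections import deque

# Sketch of the plant model only (assumed parameters): a second-order-
# plus-dead-time response simulated as two cascaded first-order lags fed
# through a transport-delay buffer, excited by a unit step.

def sopdt_step(K=1.0, tau1=5.0, tau2=2.0, delay=1.0, dt=0.1, t_end=60.0):
    n_delay = int(delay / dt)
    buf = deque([0.0] * n_delay, maxlen=n_delay) if n_delay else None
    x1 = x2 = 0.0
    out = []
    for _ in range(int(t_end / dt)):
        u = 1.0                               # unit step input
        ud = buf[0] if buf else u             # input delayed by n_delay steps
        if buf:
            buf.append(u)                     # maxlen drops the oldest sample
        x1 += dt * (K * ud - x1) / tau1       # first lag
        x2 += dt * (x1 - x2) / tau2           # second lag
        out.append(x2)
    return out

resp = sopdt_step()
```

The predictive controller uses this same recursion to roll the model forward over its prediction horizon and pick the control move minimizing future tracking error.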
Procedia PDF Downloads 527
16946 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems
Authors: Jalil Boudjadar
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption, and flexibility in scheduling system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Besides, energy consumption represents a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that achieve high performance and consume less energy while guaranteeing schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization, and memory bottlenecks, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the Parametrized Timed Automata of UPPAAL to analyze the mutual impact of performance, energy consumption, and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
Keywords: time-critical systems, multicore systems, schedulability analysis, energy consumption, performance analysis
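The core of the trade-off can be illustrated with a back-of-the-envelope model (coefficients are illustrative, not from the paper): dynamic power scales roughly as C·V²·f and execution time as cycles/f, so a lower operating point saves energy per task while lengthening response times, which is what squeezes schedulability.

```python
# Back-of-the-envelope DVFS trade-off sketch (all coefficients assumed):
# dynamic power = C * V^2 * f, execution time = cycles / f, so each
# frequency/voltage operating point yields a (time, energy) pair.

def dvfs_point(freq_ghz, volt, cycles=2.0e9, cap=1.0e-9):
    exec_time = cycles / (freq_ghz * 1.0e9)        # seconds
    power = cap * volt ** 2 * freq_ghz * 1.0e9     # watts (dynamic only)
    return exec_time, power * exec_time            # (time s, energy J)

# Assumed frequency/voltage operating points of a hypothetical core
points = {f: dvfs_point(f, v)
          for f, v in [(1.0, 0.8), (2.0, 1.0), (3.0, 1.2)]}
```

Since voltage must rise with frequency, energy grows faster than linearly with speed: the high point finishes 3x sooner but spends more than twice the energy, which is exactly the tension the UPPAAL models explore together with memory interference.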
Procedia PDF Downloads 107
16945 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrodinger Equation
Authors: Lawrence A. Farinola
Abstract:
Four-point implicit schemes are constructed for the approximation of the first and pure second order derivatives of the solution of the Dirichlet problem for the one-dimensional Schrodinger equation with respect to the time variable t. Also, special four-point implicit difference boundary value problems are proposed for the first and pure second derivatives of the solution with respect to the spatial variable x. The grid method is also applied to the mixed second derivative of the solution of the linear time-dependent Schrodinger equation. It is assumed that the initial function belongs to the Holder space C⁸⁺ᵃ, 0 < α < 1, the Schrodinger wave function given in the Schrodinger equation is from the Holder space Cₓ,ₜ⁶⁺ᵃ, ³⁺ᵃ/², the boundary functions are from C⁴⁺ᵃ, and between the initial and the boundary functions the conjugation conditions of orders q = 0,1,2,3,4 are satisfied. It is proven that the solution of the proposed difference schemes converges uniformly on the grids at the rate O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error
Procedia PDF Downloads 121
16944 Analyzing the Market Growth in Application Programming Interface Economy Using Time-Evolving Model
Authors: Hiroki Yoshikai, Shin’ichi Arakawa, Tetsuya Takine, Masayuki Murata
Abstract:
The API (Application Programming Interface) economy is expected to create new value by converting corporate services such as information processing and data provision into APIs and using these APIs to connect services. Understanding the dynamics of an API economy market under the strategies of its participants is crucial to fully maximizing the value of the API economy. To capture the behavior of a market in which the number of participants changes over time, we present a time-evolving market model for a platform in which API providers, who provide APIs to service providers, participate in addition to service providers and consumers. We then use the market model to clarify the role API providers play in expanding market participation and forming ecosystems. The results show that the platform with API providers increased the number of market participants by 67% and decreased the cost of developing services by 25% compared to the platform without API providers. Furthermore, during the expansion phase of the market, the profits of participants are found to be roughly equal when 70% of the revenue from consumers is distributed to service providers and API providers. It is also found that when the market matures, the profits of the service providers and API providers decrease significantly due to competition between them, while the profit of the platform increases.
Keywords: API economy, ecosystem, platform, API providers
Procedia PDF Downloads 91
16943 Does Pakistan Stock Exchange Offer Diversification Benefits to Regional and International Investors: A Time-Frequency (Wavelets) Analysis
Authors: Syed Jawad Hussain Shahzad, Muhammad Zakaria, Mobeen Ur Rehman, Saniya Khaild
Abstract:
This study examines the co-movement between the Pakistani, Indian, S&P 500, and Nikkei 225 stock markets using weekly data from 1998 to 2013. The time-frequency relationship between the selected stock markets is analyzed using the continuous wavelet power spectrum, the cross-wavelet transform, and (squared) cross-wavelet coherency. The empirical evidence suggests strong dependence between the Pakistani and Indian stock markets. The co-movement of the Pakistani index with the developed U.S. and Japanese markets varies over time and frequency, with the long-run relationship dominant. The results of the cross-wavelet and wavelet coherence analyses indicate moderate covariance and correlation between the stock indexes, and the markets are in phase (i.e., cyclical in nature) over varying durations. The Pakistani stock market lagged the Indian stock market during the entire period, corresponding to the 8-32 and then 64-256 week scales. Similar findings are evident for the S&P 500 and Nikkei 225 indexes; however, the relationship occurs during the later period of the study. All three wavelet indicators provide strong evidence of higher co-movement during the 2008-09 global financial crisis. The empirical analysis reveals strong evidence that portfolio diversification benefits vary across frequencies and time. This analysis is unique and has several practical implications for regional and international investors when assigning optimal weights to different assets in portfolio formation.
Keywords: co-movement, Pakistan stock exchange, S&P 500, Nikkei 225, wavelet analysis
Procedia PDF Downloads 357
16942 General Network with Four Nodes and Four Activities with Triangular Fuzzy Number as Activity Times
Authors: Rashmi Tamhankar, Madhav Bapat
Abstract:
In many projects, we have to use human judgment to determine the durations of activities, and these judgments may vary from person to person. Hence, there is vagueness about activity durations in network planning. Fuzzy sets can handle such vague or imprecise concepts and have an application to such networks. The vague activity times can be represented by triangular fuzzy numbers. In this paper, a general network with fuzzy activity times is considered, conditions for the critical path are obtained, and the total float of each activity is computed. Several numerical examples are discussed.
Keywords: PERT, CPM, triangular fuzzy numbers, fuzzy activity times
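A toy version of this setting can be sketched directly (the activity times below are made up): each activity time is a triangular fuzzy number (a, m, b), fuzzy addition is component-wise, and the fuzzy maximum of merging paths is approximated component-wise, a common simplification in fuzzy CPM.

```python
# Toy fuzzy-CPM sketch on a 4-node network (activity times are made up).
# A triangular fuzzy number is a tuple (a, m, b): lower bound, mode,
# upper bound of an activity duration.

def tfn_add(x, y):
    return tuple(p + q for p, q in zip(x, y))

def tfn_max(x, y):
    # component-wise max: a common approximation of the fuzzy maximum
    return tuple(max(p, q) for p, q in zip(x, y))

# Network with 4 nodes and 4 activities: 1->2 (A), 2->4 (B), 1->3 (C), 3->4 (D)
A, B = (2, 3, 4), (4, 5, 7)
C, D = (1, 2, 3), (6, 7, 9)

path1 = tfn_add(A, B)                 # path 1-2-4
path2 = tfn_add(C, D)                 # path 1-3-4
duration = tfn_max(path1, path2)      # fuzzy project completion time

def defuzzify(t):                     # centroid of a triangular number
    return sum(t) / 3.0

crisp = defuzzify(duration)
```

Here path 1-3-4 dominates in every component, so it is the (fuzzy) critical path, and activities A and B carry nonzero float in each component.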
Procedia PDF Downloads 473
16941 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for that application. The quality assurance team must execute many manual user interface test cases during development to confirm there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize execution time. The web UI automation test environment is based on Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient flow when introducing scalability into a traditional test automation environment; in order to introduce scalability efficiently, a scripting language was adopted. The scalability implementation relies mainly on AWS serverless technology, namely the Elastic Container Service. Scalability here means the ability to automatically provision the computers that run the automated tests and to increase or decrease their number, so that test cases can run in parallel and test execution time decreases dramatically. Introducing scalable test automation offers more than reduced execution time: because test cases can execute at the same time, challenging bugs such as race conditions may also be detected.
If API and unit tests are implemented, test strategies can exploit this scalability more efficiently. In practice, however, API and unit testing cannot cover 100% of the functional testing of a web application, since they do not reach the front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The paper first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging-issue detection.
Keywords: AWS, Elastic Container Service, scalability, serverless, UI automation test
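The fan-out idea can be shown with a local stand-in: instead of AWS ECS tasks, a thread pool runs dummy test cases in parallel. In the real setup each worker would be a container driving its own Selenium session; run_test_case here is a placeholder, not HHAexchange's test code.

```python
from concurrent.futures import ThreadPoolExecutor

# Local stand-in for the scalable architecture described above: a thread
# pool fans (dummy) UI test cases out so they run in parallel; in the
# real setup each worker is an ECS container driving a Selenium session.

def run_test_case(case_id):
    # placeholder for "open browser, execute steps, assert" logic
    return case_id, "passed"

def run_suite(case_ids, workers=8):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = dict(pool.map(run_test_case, case_ids))
    return results

results = run_suite(range(20))
```

With n independent workers, a suite's wall-clock time approaches its longest single test plus orchestration overhead, which is how a 17-hour serial run collapses; running cases concurrently is also what surfaces timing-dependent bugs such as race conditions.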
Procedia PDF Downloads 106
16940 Cyclic Heating Effect on Hardness of Copper
Authors: Tahany W. Sadak
Abstract:
The presented work discusses research results concerning the effect of the heat treatment process. Thermal fatigue, i.e., repeated heating and cooling, affects the ductility or brittleness of a material. In this research, 70 copper specimens (1.5 mm thickness, 85 mm length, 32 mm width) were subjected to thermal fatigue under different conditions: heating temperatures Th of 100, 300, and 500 °C; number of repeated cycles N from 1 to 100; heating time th = 600 s; and cooling time tc = 900 s. The results are evaluated and then compared to each other and to those of specimens not subjected to thermal fatigue.
Keywords: copper, thermal analysis, heat treatment, hardness, thermal fatigue
Procedia PDF Downloads 434
16939 Using Gaussian Process in Wind Power Forecasting
Authors: Hacene Benkhoula, Mohamed Badreddine Benabdella, Hamid Bouzeboudja, Abderrahmane Asraoui
Abstract:
Wind is a random variable that is difficult to master; for this reason, we developed mathematical and statistical methods to model and forecast wind power. Gaussian Processes (GPs) are one of the most widely used families of stochastic processes for modeling dependent data observed over time, over space, or over time and space. A GP is specified entirely by its mean and covariance (kernel) functions. The purpose of this paper is to present how to forecast wind power using a GP. The Gaussian process forecasting method is presented, and to validate the approach, a simulation in the MATLAB environment is given.
Keywords: wind power, Gaussian process, modelling, forecasting
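The prediction step of GP regression can be sketched in a few lines: build an RBF Gram matrix over the training inputs, add a small noise jitter, solve for the weights, and take a kernel-weighted sum at the test point. The data below are toy values, not the paper's wind series, and the kernel hyperparameters are assumed.

```python
import math

# Minimal GP regression sketch (toy data, assumed hyperparameters): an
# RBF kernel, a noise-jittered Gram matrix, and a naive linear solve
# give the GP posterior mean at a test point.

def rbf(a, b, length=1.0):
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)                          # alpha = K^{-1} y
    k_star = [rbf(x, x_star) for x in xs]
    return sum(w * k for w, k in zip(alpha, k_star))  # posterior mean

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 0.8, 0.9, 0.1]  # toy "wind power"
pred = gp_predict(xs, ys, 1.0)
```

In a forecasting setting, xs would be past time indices (or lagged wind features), and the same formula evaluated at a future x_star gives the point forecast; the posterior variance, omitted here, supplies prediction intervals.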
Procedia PDF Downloads 418
16938 Secure Automatic Key SMS Encryption Scheme Using Hybrid Cryptosystem: An Approach for One Time Password Security Enhancement
Authors: Pratama R. Yunia, Firmansyah, I., Ariani, Ulfa R. Maharani, Fikri M. Al
Abstract:
Nowadays, although the role of SMS as a means of communication has been largely replaced by online applications such as WhatsApp and Telegram, SMS is still indisputably used for certain important communication needs. Among them is sending one-time passwords (OTPs) as an authentication medium for various online applications, ranging from chat and shopping to online banking. However, SMS usage does not guarantee the security of transmitted messages: messages transmitted between base stations are still in plaintext, making them extremely vulnerable to eavesdropping, especially when the message is confidential, as an OTP is. One solution to this problem is an SMS application that provides security services for each transmitted message. Responding to this problem, this study designs an automatic-key SMS encryption scheme to secure SMS communication. The proposed scheme allows SMS sending that is automatically encrypted with constantly changing keys (automatic key update), automatic key exchange, and automatic key generation. As its security method, the scheme applies cryptographic techniques with a hybrid cryptosystem mechanism. As a proof of concept, a client-to-client SMS encryption application was developed on the Java platform with AES-256 as the encryption algorithm, RSA-768 as the public and private key generator, and SHA-256 as the message hashing function. The result of this study is a secure automatic-key SMS encryption scheme using a hybrid cryptosystem that can guarantee the security of every transmitted message and thus serve as a reliable solution for sending confidential messages via SMS, although it still has weaknesses in terms of processing time.
Keywords: encryption scheme, hybrid cryptosystem, one time password, SMS security
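The hybrid structure can be sketched as a flow. This shows the shape of the scheme only, not production cryptography: Python's standard library has SHA-256 but no AES or RSA, so a keyed hash-derived XOR stream stands in for AES-256, and the RSA key-wrapping step is an explicitly labeled placeholder.

```python
import hashlib, os

# Flow sketch only (NOT production crypto): a hash-derived XOR stream
# stands in for AES-256, and RSA key wrapping is a placeholder. A fresh
# random session key per message models the "automatic key update".

def keystream(key, n):
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def sym_encrypt(key, data):               # stand-in for AES-256
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

sym_decrypt = sym_encrypt                 # XOR stream is its own inverse

def send_sms(message):
    session_key = os.urandom(32)          # fresh key per message
    body = message.encode()
    digest = hashlib.sha256(body).digest()         # integrity (SHA-256)
    ciphertext = sym_encrypt(session_key, body)
    wrapped_key = session_key             # placeholder: would be RSA-encrypted
    return wrapped_key, ciphertext, digest

def receive_sms(wrapped_key, ciphertext, digest):
    session_key = wrapped_key             # placeholder: would be RSA-decrypted
    body = sym_decrypt(session_key, ciphertext)
    assert hashlib.sha256(body).digest() == digest, "message tampered"
    return body.decode()

otp = receive_sms(*send_sms("OTP: 483920"))
```

The hybrid idea is visible in the return triple: a fast symmetric cipher carries the message body, an asymmetric cipher (here elided) carries only the short session key, and a hash binds the two for integrity.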
Procedia PDF Downloads 128
16937 A Comparative Assessment of Membrane Bioscrubber and Classical Bioscrubber for Biogas Purification
Authors: Ebrahim Tilahun, Erkan Sahinkaya, Bariş Calli̇
Abstract:
Raw biogas is a valuable renewable energy source, but it usually requires removal of impurities. The presence of hydrogen sulfide (H2S) in biogas has detrimental corrosion effects on cogeneration units, so removing H2S from biogas can significantly improve its quality. In this work, a conventional bioscrubber (CBS) and a dense membrane bioscrubber (DMBS) were comparatively evaluated in terms of H2S removal efficiency (RE), CH4 enrichment, and alkaline consumption at gas residence times ranging from 5 to 20 min. Both bioscrubbers were fed a synthetic biogas containing H2S (1%), CO2 (39%), and CH4 (60%). The results show that a high H2S RE (98%) was obtained in the DMBS at a gas residence time of 20 min, whereas a slightly lower CO2 RE was observed. In the CBS system, the outlet H2S concentration was always below 250 ppmv and the H2S RE remained above 98% regardless of the gas residence time, although high alkaline consumption and frequent absorbent replacement limited its cost-effectiveness. The results also indicate that in the DMBS, when the gas residence time was increased to 20 min, the CH4 content in the treated biogas was enriched up to 80%. In the CBS unit, by contrast, the CH4 content of the raw biogas (60%) decreased threefold, probably because of extreme dilution of the biogas with air (N2 and O2). According to the results obtained here, the DMBS system is a robust and effective biotechnology compared with the CBS and has better potential for full-scale applications.
Keywords: biogas, bioscrubber, desulfurization, PDMS membrane
Procedia PDF Downloads 226
16936 Radio Frequency Identification (RFID) Cost-Effective, Location-Based System for Managing Construction Materials
Authors: Mourad Bakouka, Abdelaziz Rabehi
Abstract:
Companies need logistics and transportation that can adapt to the changing nature of construction sites, so that they can react quickly when needed. A study was conducted to develop a way to locate and track materials on construction sites, based on an RFID/GPS integration. The study also reports how the platform has been used in construction, where it showed many advantages, including reductions in both time and cost as well as improved management of material orders. For example, the time needed to start up a project was shortened from two weeks to three days with just a single digital order. For now, widespread adoption of the technology is still limited, largely due to an overall lack of awareness and difficulty connecting to it. However, as more companies embrace it in construction, the technology is expected to become ubiquitous. The developed platform provides contractors and construction managers with real-time information about the status of materials and work, allowing them to better manage the workflow of a project. The study sheds new light on this subject and raises awareness of the use of smart tools in building construction.
Keywords: materials management, internet of things (IoT), radio frequency identification (RFID), construction site, supply chain management
Procedia PDF Downloads 81
16935 Design and Evaluation of a Fully-Automated Fluidized Bed Dryer for Complete Drying of Paddy
Authors: R. J. Pontawe, R. C. Martinez, N. T. Asuncion, R. V. Villacorte
Abstract:
Drying of high-moisture paddy remains a major problem in the Philippines, especially during inclement weather. To alleviate the problem, mechanical dryers such as flat-bed and recirculating batch-type dryers are used. However, drying to a final moisture content of 14% (wet basis) takes 10-12 hours, which is long and tedious and not ideal for handling high-moisture paddy. A fully automated pilot-scale fluidized bed drying system with a capacity of 500 kilograms per hour was evaluated using high-moisture paddy. The developed fluidized bed dryer was evaluated at four drying temperatures and two fluidization times at constant airflow, static pressure, and tempering period. Complete drying of paddy with an initial moisture content of ≥28% (w.b.) was attained after two passes of fluidized-bed drying at 2 min exposure to a 70 °C drying temperature and 4.9 m/s superficial air velocity, followed by a 60 min ambient-air tempering period (30 min without ventilation and 30 min with air ventilation), for a total drying time of 2.07 h. Around 82% of the normal mechanical drying time was saved at the 70 °C drying temperature. The drying cost was calculated at P0.63 per kilogram of wet paddy, and the specific heat energy consumption was only 2.84 MJ/kg of water removed. The head rice yield recovery of the dried paddy passed the Philippine Agricultural Engineering Standards, and sensory evaluation showed that the color and taste of samples dried in the fluidized bed dryer were comparable to air-dried paddy. The optimum drying parameters for the fluidized bed dryer are a 70 °C drying temperature, 2 min fluidization time, 4.9 m/s superficial air velocity, 10.16 cm grain depth, and 60 min ambient-air tempering period.
Keywords: drying, fluidized bed dryer, head rice yield, paddy
Procedia PDF Downloads 325
16934 Ultrasound Assisted Cooling Crystallization of Lactose Monohydrate
Authors: Sanjaykumar R. Patel, Parth R. Kayastha
Abstract:
α-Lactose monohydrate is widely used in the pharmaceutical industry as an inactive substance that acts as a vehicle or medium for a drug or other active substance. It is a byproduct of the dairy industry, and the recovery of lactose from whey not only improves the economics of whey utilization but also reduces pollution, since lactose recovery can reduce the BOD of whey by more than 80%. In the present study, the process parameter levels for lactose recovery by ultrasound-assisted cooling crystallization were initial lactose concentration (30-50% w/w), sonication amplitude (20-40%), sonication time (2-6 hours), and crystallization temperature (10-20 °C). Compared with cooling crystallization alone, the use of ultrasound enhanced lactose recovery by 39.17% (w/w). The parameters were optimized for lactose recovery using the Taguchi method. The optimum conditions were initial lactose concentration at level 3 (50% w/w), sonication amplitude at level 2 (40%), sonication time at level 3 (6 hours), and crystallization temperature at level 1 (10 °C). The maximum recovery at the optimum conditions was found to be 85.85%. Sonication time and initial lactose concentration were found to be significant parameters for lactose recovery.
Keywords: crystallization, lactose, Taguchi method, ultrasound
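The main-effects step of a Taguchi analysis can be sketched directly (the response values below are made up, not the study's measurements): average the recovery over the runs in which each factor sits at each level, then pick the level with the highest mean ("larger-the-better").

```python
# Taguchi main-effects sketch with made-up data (3 factors at 3 levels,
# an L9-style layout): for each factor, average the response over runs
# at each level and keep the best level.

# Each run: (concentration level, amplitude level, time level, recovery %)
runs = [
    (1, 1, 1, 62.0), (1, 2, 2, 70.5), (1, 3, 3, 74.0),
    (2, 1, 2, 71.0), (2, 2, 3, 80.0), (2, 3, 1, 69.0),
    (3, 1, 3, 82.0), (3, 2, 1, 74.0), (3, 3, 2, 80.0),
]

def best_levels(runs, n_factors=3):
    best = []
    for f in range(n_factors):
        by_level = {}
        for run in runs:
            by_level.setdefault(run[f], []).append(run[-1])
        avg = {lvl: sum(v) / len(v) for lvl, v in by_level.items()}
        best.append(max(avg, key=avg.get))   # larger-the-better response
    return best

optimum = best_levels(runs)
```

The orthogonal array lets nine runs estimate the effect of every factor level, rather than the 27 runs a full factorial would need; a signal-to-noise transform of the response (omitted here) is the usual Taguchi refinement of this averaging.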
Procedia PDF Downloads 212
16933 A Quinary Coding and Matrix Structure Based Channel Hopping Algorithm for Blind Rendezvous in Cognitive Radio Networks
Authors: Qinglin Liu, Zhiyong Lin, Zongheng Wei, Jianfeng Wen, Congming Yi, Hai Liu
Abstract:
The multi-channel blind rendezvous problem in distributed cognitive radio networks (DCRNs) concerns how users in the network can hop to the same channel at the same time slot without any prior knowledge (i.e., each user is unaware of the other users' information). The channel hopping (CH) technique is a typical solution to this blind rendezvous problem. In this paper, we propose a quinary coding and matrix structure-based CH algorithm called QCMS-CH. The QCMS-CH algorithm can guarantee the rendezvous of users using only one cognitive radio under an asynchronous clock (i.e., arbitrary time drift between the users), heterogeneous channels (i.e., distinct available channel sets per user), and symmetric roles (i.e., all users play the same role). The QCMS-CH algorithm first represents a randomly selected channel (denoted by R) as a fixed-length quaternary number. It then encodes the quaternary number into a quinary bootstrapping sequence according to a carefully designed quaternary-quinary coding table with the prefix "R00". Finally, it builds a CH matrix column by column according to the bootstrapping sequence and six different types of elaborately generated subsequences. A user accesses the CH matrix row by row and accordingly performs its channel hopping to attempt rendezvous with other users. We prove the correctness of QCMS-CH and derive an upper bound on its Maximum Time-to-Rendezvous (MTTR). Simulation results show that the QCMS-CH algorithm outperforms the state of the art in terms of the MTTR and the Expected Time-to-Rendezvous (ETTR).
Keywords: channel hopping, blind rendezvous, cognitive radio networks, quaternary-quinary coding
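The problem setting and its metric can be illustrated with a toy experiment. This is not the QCMS-CH construction: the two hopping sequences below are arbitrary, but the drift parameter models the asynchronous clock, the distinct channel sets model heterogeneity, and the returned slot index is exactly the time-to-rendezvous (TTR) that the paper's MTTR/ETTR results bound.

```python
# Toy rendezvous experiment (NOT the QCMS-CH algorithm): two users hop
# over their own sequences; user B's clock is offset by `drift` slots.
# Rendezvous happens when both sit on the same commonly-available channel.

def time_to_rendezvous(seq_a, seq_b, drift, common, limit=1000):
    for t in range(limit):
        ca = seq_a[t % len(seq_a)]
        cb = seq_b[(t + drift) % len(seq_b)]   # asynchronous clock offset
        if ca == cb and ca in common:
            return t                            # TTR in slots
    return None                                 # no rendezvous within limit

# Heterogeneous channel sets: A sees {0,1,2}, B sees {1,2,3}; common = {1,2}
seq_a = [0, 1, 2, 1, 0, 2, 1, 2, 0]
seq_b = [3, 2, 1, 2, 3, 1, 2, 1, 3]
ttr = time_to_rendezvous(seq_a, seq_b, drift=2, common={1, 2})
```

Arbitrary sequences like these give no worst-case guarantee; the point of constructions such as QCMS-CH is to design the sequences so that a bounded TTR is provable for every drift and every pair of overlapping channel sets.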
Procedia PDF Downloads 91
16932 Crater Pattern on the Moon and Origin of the Moon
Authors: Xuguang Leng
Abstract:
The crater pattern on the Moon indicates that the Moon was captured by Earth relatively recently, contradicting the theory that the Moon was born as a satellite of the Earth. The Moon has been tidally locked since it became Earth's satellite. The Moon's near side is shielded by Earth from asteroid and comet collisions, with the center of the near side most protected. Yet the crater pattern on the Moon is fairly random, with no distinguishable empty spot or strip and no distinguishable difference between the near side and the far side. Had the Moon been born as Earth's satellite, there would be a clear crater-free spot on the near side (or a strip, should the tidal lock shift over time) and far more craters on the far side. The absence of even a vague crater-free spot on the near side of the Moon indicates the capture was a more recent event. Given Earth's much larger mass and size compared with the Moon, Earth should have collided with asteroids and comets at a much higher frequency, resulting in significant mass gain over its lifespan. Earth's larger mass and magnetic field are also better at retaining water and gas against the stripping effect of the solar wind, further accelerating the mass gain. A dwarf-planet Moon could have been pulled closer and closer to the Earth over time as Earth's gravity grew stronger, eventually being captured as a satellite. Given enough time, it is possible that Earth's mass would become large enough to cause the Moon to collide with Earth.
Keywords: moon, origin, crater, pattern
Procedia PDF Downloads 97
16931 Ultrasound Markers in Evaluation of Hernias
Authors: Aniruddha Kulkarni
Abstract:
Imaging is required in only a few cases of external hernias, as clinical examination is usually sufficient. Ultrasound helps in chronic abdominal or groin pain, equivocal clinical findings, and complicated hernias. Ultrasound is also useful in assessing the cause of raised intra-abdominal pressure, and in certain cases it can comment on the etiology, complications, and chronicity of the lesion. Screening of the remaining abdominal organs is a further important advantage of this real-time modality. Its cost-effectiveness and lack of radiation allow it to be used repeatedly in indicated cases, and sonography is well accepted by patients. Advanced tissue-harmonic equipment and growing expertise are making it increasingly popular. Ultrasound can define the surgical anatomy, rent size, contents, and etiological/recurrence factors in great detail and with authority; hence, accidental findings during a planned surgical procedure can be easily avoided. The clinical dynamic Valsalva and reducibility tests can be better documented by a real-time ultrasound study. In cases of recurrence, sonography helps in assessing the hernia in greater detail as a dynamic, real-time investigation. Ultrasound signs in cases of internal hernias compare well with CT findings.
Keywords: laparoscopic repair, hernia, CT findings, chronic pain
Procedia PDF Downloads 497
16930 Turning Parameters Affect Time up and Go Test Performance in Pre-Frail Community-Dwelling Elderly
Authors: Kuei-Yu Chien, Hsiu-Yu Chiu, Chia-Nan Chen, Shu-Chen Chen
Abstract:
Background: Frailty is associated with decreased physical performance that affects the mobility of the elderly. The Timed Up and Go (TUG) test is a common method of evaluating mobility in the community. The purpose of this study was to compare the parameters in different stages of the TUG test and physical performance between pre-frail elderly (PFE) and non-frail elderly (NFE). We also investigated the relationship between TUG parameters and physical performance. Methods: Ninety-two community-dwelling older adults participated in this study. Based on the Canadian Study of Health and Aging Clinical Frailty Scale, 22 older adults were classified as PFE (71.77 ± 6.05 yrs.) and 70 as NFE (71.2 ± 5.02 yrs.). We measured body composition and physical performance, including balance, muscular strength/endurance, mobility, cardiorespiratory endurance, and flexibility. Results: Pre-frail elderly took significantly longer than NFE in the TUG test (p = .004). Pre-frail elderly had lower turning average angular velocity (p = .017), turning peak angular velocity (p = .041), and turning-stand-to-sit peak angular velocity (p = .037) than NFE. The turning-related parameters were related to the open-eye stand on right foot, 30-second chair stand, back scratch, and 2-min step tests. Conclusions: Turning average angular velocity, turning peak angular velocity, and turning-stand-to-sit peak angular velocity mainly affected TUG performance. We suggest that static/dynamic balance, agility, flexibility, and lower-limb muscle strengthening exercises are important for PFE.
Keywords: mobility, agility, active ageing, functional fitness
Procedia PDF Downloads 186
16929 A Constitutive Model for Time-Dependent Behavior of Clay
Authors: T. N. Mac, B. Shahbodaghkhan, N. Khalili
Abstract:
A new elastic-viscoplastic (EVP) constitutive model is proposed for the analysis of the time-dependent behavior of clay. The proposed model is based on bounding surface plasticity and the viscoplastic consistency framework to establish a continuous transition from plasticity to rate-dependent viscoplasticity. Unlike overstress-based models, this model satisfies the consistency condition in the formulation of the EVP constitutive equation. The procedure for deriving the constitutive relationship is also presented. Simulation results and comparisons with experimental data are then presented to demonstrate the performance of the model.
Keywords: bounding surface, consistency theory, constitutive model, viscosity
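The consistency approach that the abstract contrasts with overstress models can be written schematically. The abstract does not give the authors' actual yield function or hardening variables, so the following is only the generic form of a rate-dependent yield surface with its consistency condition:

```latex
f(\boldsymbol{\sigma}, \kappa, \dot{\kappa}) = 0, \qquad
\dot{f} = \frac{\partial f}{\partial \boldsymbol{\sigma}} : \dot{\boldsymbol{\sigma}}
        + \frac{\partial f}{\partial \kappa}\,\dot{\kappa}
        + \frac{\partial f}{\partial \dot{\kappa}}\,\ddot{\kappa} = 0
```

Enforcing ḟ = 0 keeps the stress state on the rate-dependent yield surface during viscoplastic flow, instead of allowing it to overshoot the surface as in Perzyna-type overstress models.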
Procedia PDF Downloads 492
16928 Effectiveness of ATMS (Advanced Transport Management Systems) in Asuncion, Paraguay
Authors: Sung Ho Oh
Abstract:
Advanced traffic lights, a traffic information collection and provision system, CCTVs for traffic control, and a traffic information center were installed in Asuncion, the capital of Paraguay. A pre-post comparison of the installation revealed significant changes: even though traffic volumes increased, travel speed was higher, so travel time from origin to destination decreased. The savings in travel time, fuel cost, and environmental cost amount to about 47 million US dollars per year. Satisfaction survey results for the installation are presented together with an analysis of statistical significance.
Keywords: advanced transport management systems, effectiveness, Paraguay, traffic lights
Procedia PDF Downloads 352
16927 Undernutrition Among Children Below Five Years of Age in Uganda: A Deep Dive into Space and Time
Authors: Vallence Ngabo Maniragaba
Abstract:
This study aimed to examine the variations in undernutrition among children below 5 years of age in Uganda. A spatial and spatiotemporal analysis approach helped identify cluster patterns, hot spots, and emerging hot spots. Data from the six Uganda Demographic and Health Surveys spanning 1990 to 2016 were used, with the main outcome variable being undernutrition among children under 5 years of age. All data relevant to this study were retrieved from the survey datasets and combined with the 214 shape files for the districts of Uganda to enable spatial and spatiotemporal analysis. Spatial maps of the prevalence of undernutrition, both in space and over time, were generated using ArcGIS Pro version 2.8. Moran's I, an index of spatial autocorrelation, was used to rule out spatial randomness and identify spatially clustered patterns of hot- or cold-spot areas. Furthermore, space-time cubes were generated to establish the trend in undernutrition and mirror its variations over time and across Uganda. Moreover, emerging hot spot analysis was done to help identify the patterns of undernutrition over time. The results indicate a heterogeneous distribution of undernutrition across Uganda, and the same variations were also evident over time. Moran's I index confirmed spatially clustered patterns as opposed to a random distribution of undernutrition prevalence. Four hot spot areas, namely the Karamoja, Sebei, West Nile, and Toro regions, were significantly evident; most of the central parts of Uganda were identified as cold-spot clusters, while most of Western Uganda and the Acholi and Lango regions had no statistically significant spatial patterns by the year 2016. The spatiotemporal analysis identified the Karamoja and Sebei regions as clusters of persistent, consecutive, and intensifying hot spots; the West Nile region was identified as a sporadic hot spot area, while the Toro region showed both sporadic and emerging hot spots. In conclusion, undernutrition is a silent pandemic that must be confronted decisively. At 31.2 percent, the prevalence is still unacceptably high. The distribution across the country is nonuniform, with areas such as the Karamoja, West Nile, Sebei, and Toro regions being epicenters of undernutrition in Uganda. Over time, the same areas have exhibited persistently high undernutrition prevalence. Policymakers and implementers should bear in mind the spatial variations across the country and prioritize hot spot areas in order to deliver efficient, timely, and region-specific interventions.
Keywords: undernutrition, spatial autocorrelation, hotspot analysis, geographically weighted regressions, emerging hotspot analysis, under-fives, Uganda
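The global Moran's I statistic mentioned in the abstract (computed there by ArcGIS Pro) has a compact closed form, I = (n/S0) * Σij wij zi zj / Σi zi², where z are mean-centered values and S0 is the sum of the weights. A minimal sketch in Python, assuming a simple binary contiguity weights matrix rather than the district weights used in the study:

```python
import numpy as np

def morans_i(values: np.ndarray, weights: np.ndarray) -> float:
    """Global Moran's I: (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2,
    with z the mean-centered values and S0 the sum of all weights."""
    z = values - values.mean()
    s0 = weights.sum()
    n = len(values)
    return (n / s0) * (z @ weights @ z) / (z @ z)

# Four areas in a line with rook (shared-border) contiguity, hypothetical values.
w = np.zeros((4, 4))
for i in range(3):
    w[i, i + 1] = w[i + 1, i] = 1.0
i_stat = morans_i(np.array([1.0, 2.0, 3.0, 4.0]), w)  # positive: neighbors alike
```

A positive value (here 1/3) indicates that similar prevalence values cluster in neighboring areas, which is the pattern behind the hot- and cold-spot interpretation in the abstract.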
Procedia PDF Downloads 86
16926 A Dynamical Approach for Relating Energy Consumption to Hybrid Inventory Level in the Supply Chain
Authors: Benga Ebouele, Thomas Tengen
Abstract:
Due to long lead times, work-in-process (WIP) inventory can build up within the supply chain of most manufacturing systems. This implies that there are fewer finished goods on hand and more in process, because the work remains in the factory too long and cannot be sold to customers. The supply chain of such a manufacturing system is then considered inefficient, since it takes so much time to produce the finished goods. The time consumed in each operation of the supply chain has an associated energy cost. This phenomenon can be harmful for a hybrid inventory system, because considerable space may be needed to store these semi-finished goods, and the final energy cost of producing, holding, and delivering the goods to customers becomes uncertain. A principle that reduces the waste of energy within the supply chain of a manufacturing firm should therefore be available to all inventory managers in pursuit of profitability. Decision making by inventory managers in this situation is a modeling process, whereby a dynamical approach is used to depict, examine, specify, and even operationalize the relationship between energy consumption and hybrid inventory level. The relationship between energy consumption and inventory level is established, indicating a poor level of control and hence a potential for energy savings.
Keywords: dynamic modelling, energy used, hybrid inventory, supply chain
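The abstract does not give the authors' equations, but the qualitative mechanism it describes (WIP that lingers for the full lead time accrues energy cost every period it is held) can be sketched with a toy discrete-time model. The queue mechanics, cost parameters, and function name here are all hypothetical, not the paper's model:

```python
def energy_cost_profile(arrivals, lead_time, e_hold, e_process):
    """Toy discrete-time sketch: each arriving job stays in WIP for
    `lead_time` periods; each WIP unit-period costs e_hold energy and
    each completion costs e_process. Returns (wip_levels, cumulative_energy)."""
    wip, energy, levels, cum = 0, 0.0, [], []
    queue = []  # scheduled completion times of jobs currently in WIP
    for t, a in enumerate(arrivals):
        wip += a
        queue += [t + lead_time] * a
        done = sum(1 for c in queue if c == t)
        queue = [c for c in queue if c != t]
        wip -= done
        energy += e_hold * wip + e_process * done
        levels.append(wip)
        cum.append(energy)
    return levels, cum
```

Running such a sketch with a longer lead time shows the cumulative energy curve steepening with the WIP level, which is the kind of relationship the dynamical approach in the abstract aims to operationalize.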
Procedia PDF Downloads 268
16925 Optimal Allocation of Multiple Emergency Resources for a Single Potential Accident Node: A Mixed Integer Linear Program
Authors: Yongjian Du, Jinhua Sun, Kim M. Liew, Huahua Xiao
Abstract:
Optimal allocation of emergency resources before a disaster is of great importance for emergency response. In practice, pre-protection for a single critical node where accidents may occur is common. In this study, a model is developed to determine the location and inventory decisions for multiple emergency resources among a set of candidate stations so as to minimize the total cost, subject to budget and capacity constraints. The total cost includes the economic accident loss, which follows a probability distribution over time, and the warehousing cost of resources, which increases over time. A ratio is defined to measure the degree to which a storage station serves only the target node; it grows as the distance between them decreases. To keep the formulation linear, it is assumed that the travel time of emergency resources to the accident scene has a linear relationship with the economic accident loss. A computational experiment is conducted to illustrate how the proposed model works, and the results indicate its effectiveness and practicability.
Keywords: emergency response, integer linear program, multiple emergency resources, pre-allocation decisions, single potential accident node
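The paper's MILP itself is not reproduced in the abstract; a brute-force miniature of the same pre-allocation idea (integer stock levels at candidate stations, a budget cap on warehousing cost, and loss taken as linear in travel time per the stated assumption) can be sketched as follows. The station data and cost structure are hypothetical:

```python
from itertools import product

def allocate(stations, demand, budget):
    """Enumerate integer stock levels at each candidate station to cover
    `demand` units, minimising warehousing cost plus a travel-time-linear
    accident loss. `stations` is a list of (capacity, unit_hold_cost,
    travel_time) tuples. Returns (best_cost, best_stock_vector)."""
    best = (float("inf"), None)
    ranges = [range(cap + 1) for cap, _, _ in stations]
    for stock in product(*ranges):
        if sum(stock) < demand:
            continue  # must cover the single potential accident node
        hold = sum(s * c for s, (_, c, _) in zip(stock, stations))
        if hold > budget:
            continue  # budget constraint on warehousing cost
        loss = sum(s * t for s, (_, _, t) in zip(stock, stations))
        cost = hold + loss
        if cost < best[0]:
            best = (cost, stock)
    return best
```

For a realistic number of stations and stock levels this enumeration is replaced by the MILP solver; the sketch only makes the objective and constraints concrete.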
Procedia PDF Downloads 154
16924 One-Step Time Series Predictions with Recurrent Neural Networks
Authors: Vaidehi Iyer, Konstantin Borozdin
Abstract:
Time series prediction problems have many important practical applications but are notoriously difficult for statistical modeling. Recently, machine learning methods have attracted significant interest as practical tools applied to a variety of problems, even though developments in this field tend to be semi-empirical. This paper explores the application of Long Short-Term Memory-based Recurrent Neural Networks to the one-step prediction of time series for both trend and stochastic components. Two types of data are analyzed: daily stock prices, often considered a typical example of a random walk, and weather patterns dominated by seasonal variations. Results from both analyses are compared, and a reinforcement learning framework is used to select the more efficient of Recurrent Neural Networks and more traditional autoregression methods. It is shown that both methods are able to follow long-term trends and seasonal variations closely but have difficulties reproducing day-to-day variability. Future research directions and potential real-world applications are briefly discussed.
Keywords: long short term memory, prediction methods, recurrent neural networks, reinforcement learning
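The "traditional autoregression" baseline the abstract compares against can be illustrated with a one-step AR(1) predictor fitted by least squares. This is a generic sketch, not the authors' configuration:

```python
def ar1_one_step(series):
    """Fit x_t ~ a * x_{t-1} + b by ordinary least squares over the
    observed pairs, then return the one-step-ahead prediction for the
    next value after the end of `series`."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return a * series[-1] + b
```

On a series that exactly follows x_t = 2 x_{t-1}, such as [1, 2, 4, 8, 16], the fit recovers a = 2, b = 0 and predicts 32; on a random walk, a fitted a near 1 makes the predictor degenerate toward "tomorrow equals today", which is why day-to-day variability is hard for both methods in the abstract.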
Procedia PDF Downloads 229
16923 Earthquake Forecasting Procedure Due to Diurnal Stress Transfer by the Core to the Crust
Authors: Hassan Gholibeigian, Kazem Gholibeigian
Abstract:
In this paper, our goal is the determination of loading versus time in the crust. To this end, we present a computational procedure that produces a cumulative strain-energy time profile, which can be used to predict the approximate location and time of the next major earthquake (M > 4.5) along a specific fault and which we believe is more accurate than many of the methods presently in use. After a short review of current research in earthquake analysis and prediction, earthquake mechanisms are discussed in both the jerk and sequence-earthquake directions. Our computational procedure is then presented using the differential equations of equilibrium that govern the nonlinear dynamic response of a system of finite elements, modified with an extra term to account for the jerk produced during the quake. We employ the von Mises model for the stress-strain relationship in our calculations, modified with an additional term to account for thermal effects. For the calculation of the strain energy, the idea of the Pulsating Mantle Hypothesis (PMH) is used. This hypothesis, in brief, states that the mantle is under diurnal cyclic pulsating loads due to the unbalanced gravitational attraction of the sun and the moon. A brief discussion of the Denali fault is given as a case study. The cumulative strain energy is then represented graphically versus time. Finally, based on some hypothetical earthquake data, the results are verified.
Keywords: pulsating mantle hypothesis, inner core's dislocation, outer core's bulge, constitutive model, transient hydro-magneto-thermo-mechanical load, diurnal stress, jerk, fault behaviour
Procedia PDF Downloads 276