Search results for: accumulated survey error

6819 Cellular Traffic Prediction through Multi-Layer Hybrid Network

Authors: Supriya H. S., Chandrakala B. M.

Abstract:

Deep learning based models have recently been adopted with success for network traffic prediction. However, training a deep learning model for diverse prediction tasks remains one of the critical challenges, for several reasons. This research work develops a Multi-Layer Hybrid Network (MLHN) for network traffic prediction and analysis; MLHN comprises three distinct networks that handle different inputs for custom feature extraction. Furthermore, an optimized and efficient parameter-tuning algorithm is introduced to enhance parameter learning. MLHN is evaluated on the “Big Data Challenge” dataset using Mean Absolute Error, Root Mean Square Error, and R² as metrics; furthermore, the efficiency of MLHN is demonstrated through comparison with a state-of-the-art approach.
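As a minimal illustration (not taken from the paper), the three evaluation metrics named above can be computed from vectors of observed and predicted traffic volumes as follows; the helper function and dummy values are hypothetical.

```python
import numpy as np

def evaluate_forecast(y_true, y_pred):
    """Compute the MAE, RMSE and R^2 metrics cited in the abstract.

    Hypothetical helper; the paper's own evaluation pipeline is not shown here.
    """
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    mae = np.mean(np.abs(y_true - y_pred))
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return mae, rmse, r2

# Example usage with dummy traffic volumes
print(evaluate_forecast([10, 12, 15, 11], [9.5, 12.4, 14.2, 11.3]))
```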

Keywords: MLHN, network traffic prediction

Procedia PDF Downloads 59
6818 Whole Body Vibration and Low Back Disorder among Saskatchewan Farmers: A Prospective Cohort Study

Authors: Samuel Kwaku Essien, Catherine Trask, Niels Koehncke, Brenna Bath

Abstract:

Background: Low back disorder (LBD) is the most common musculoskeletal problem among farmers, with higher prevalence than in other occupations. Operators of tractors and other farm machinery such as combines or all-terrain vehicles (ATVs) can have considerable cumulative exposure to whole body vibration (WBV). Although there appears to be an association between LBD and WBV, the lack of prospective studies makes the relationship between LBD and WBV unclear. Purpose: This study investigates the association between WBV and LBD among Saskatchewan farmers using a prospective cohort design. Methods: The Saskatchewan Farm Injury Cohort Study Phase I (2007) and II (2013) data were used. Baseline data were collected via postal questionnaire on accumulated yearly tractor, combine, and ATV use as well as several covariates to support a biopsychosocial model of LBD. Follow-up data on musculoskeletal symptoms were collected over the 6-year period, with a sample size of 1,149. Questions on ‘low back trouble’ (ache, pain, discomfort) experienced in the last 12 months were answered by farmer participants as ‘yes’ or ‘no’. A GEE-modified Poisson approach was performed using SPSS 22 and SAS 9.4. Results: The twelve-month prevalence of LBD was 59.8%. In multivariate analysis of the 6-year follow-up, LBD was associated with ATV operation and tractor operation, with a dose-response relationship for annual accumulated tractor operation. Although combine operation ≥ 61 hrs/year was related to LBD in bivariate analysis, this difference did not persist after adjustment for confounders. Age was found to be a confounder in the relationship between WBV and LBD, and no interactions were found. Conclusion: Longer annual tractor operation and older age are important predictors of LBD symptoms in farmers. Future research involving direct measurement can help identify appropriate prevention strategies.
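For readers unfamiliar with the modelling approach, the sketch below shows what a GEE-modified Poisson model (a Poisson working model with robust variance, yielding prevalence ratios for a binary outcome) looks like in Python's statsmodels; the variable names and toy data are hypothetical and are not the study's dataset.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis dataset; variable names are illustrative, not from the study.
df = pd.DataFrame({
    "farmer_id":     list(range(1, 11)),
    "lbd":           [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],   # low back disorder in last 12 months
    "tractor_hours": [120, 30, 400, 250, 10, 600, 80, 300, 50, 500],
    "atv_use":       [1, 0, 1, 0, 0, 1, 1, 1, 0, 1],
    "age":           [45, 52, 38, 60, 41, 55, 49, 63, 36, 58],
})

# GEE-modified Poisson: Poisson working model with robust (sandwich) variance,
# giving prevalence ratios for the binary LBD outcome.
model = sm.GEE.from_formula(
    "lbd ~ tractor_hours + atv_use + age",
    groups="farmer_id",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Independence(),
)
result = model.fit()
print(result.summary())
```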

Keywords: agriculture, low back disorder, low back pain, occupational health

Procedia PDF Downloads 302
6817 Improved Performance Scheme for Joint Transmission in Downlink Coordinated Multi-Point Transmission

Authors: Young-Su Ryu, Su-Hyun Jung, Myoung-Jin Kim, Hyoung-Kyu Song

Abstract:

In this paper, an improved performance scheme for joint transmission is proposed for downlink (DL) coordinated multi-point (CoMP) transmission under constrained transmission power. In this scheme, the serving transmission point (TP) requests joint transmission from the cooperating TP and selects a pre-coding technique according to the channel state information (CSI) reported by the user equipment (UE). The simulation results show that the bit error rate (BER) and throughput of the proposed scheme provide high spectral efficiency and reliable data transmission at the cell edge.
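As background for the pre-coding choice mentioned above, the snippet below sketches the simplest of the listed techniques, a zero-forcing precoder for a two-UE joint transmission; the channel values and power normalisation are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-UE joint-transmission setup; rows of H are per-UE channels
# aggregated over the cooperating TPs (values are illustrative).
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Zero-forcing precoder: pseudo-inverse of the channel, normalised so the
# total transmit power is constrained.
W = np.linalg.pinv(H)
W /= np.linalg.norm(W)

s = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)   # QPSK symbols for the two UEs
y = H @ (W @ s)                                # received signals (noise-free)
print(np.round(y, 3))                          # inter-user interference is cancelled
```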

Keywords: CoMP, joint transmission, minimum mean square error, zero-forcing, zero-forcing dirty paper coding

Procedia PDF Downloads 530
6816 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan

Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu

Abstract:

This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data on cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking old Paiwan as an example, an aerial survey of a settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The obtained orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D building scan provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial surveying technology is suited to constructing the topography of settlements together with building and landscape spatial information, while 3D scanning is suited to small-scale, detailed records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of survey and mapping work for aboriginal settlements and is very helpful for further preservation planning and rejuvenation of aboriginal cultural heritage.

Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model

Procedia PDF Downloads 131
6815 Identification of Architectural Design Error Risk Factors in Construction Projects Using IDEF0 Technique

Authors: Sahar Tabarroki, Ahad Nazari

Abstract:

The design process is one of the key project processes in the construction industry. Although architects have the responsibility to produce complete, accurate, and coordinated documents, architectural design is accompanied by many errors. A design error occurs when the constraints and requirements of the design are not satisfied. Errors are potentially costly and time-consuming to correct if not caught early during the design phase, and they become expensive once they reach the construction documents or the construction phase. The aim of this research is to identify the risk factors of architectural design errors. First, a literature review of the design process was conducted, and then a questionnaire was designed to identify the risks and risk factors. The questionnaire items were based on the “similar service description of study and supervision of architectural works” published by the “Vice Presidency of Strategic Planning & Supervision of I.R. Iran” as the basis of architects’ tasks. Second, the top 10 risks of architectural activities were identified. To determine the positions of possible causes of risks with respect to architectural activities, these activities were located in a design process modeled with the IDEF0 technique. The research was carried out by choosing a case study, checking the design drawings, interviewing its architect and client, and providing a checklist in order to identify concrete examples of architectural design errors. The results revealed that activities such as “defining the current and future requirements of the project”, “studies and space planning,” and “time and cost estimation of the suggested solution” have a higher error risk than others. Moreover, the most important causes include “unclear goals of the client”, “time pressure from the client”, and “lack of knowledge of architects about the requirements of end-users”. In detecting errors in the case study, the lack of standards and design criteria, and the lack of coordination among them, was a barrier; nevertheless, “lack of coordination between architectural design and electrical and mechanical services”, “violation of standard dimensions and sizes in space design”, and “design omissions” were identified as the most important design errors.

Keywords: architectural design, design error, risk management, risk factor

Procedia PDF Downloads 107
6814 Developing a Culturally Acceptable End of Life Survey (the VOICES-ESRD/Thai Questionnaire) for Evaluating Health Services Provision for Older Persons with End-Stage Renal Disease (ESRD) in Thailand

Authors: W. Pungchompoo, A. Richardson, L. Brindle

Abstract:

Background: The development of a culturally acceptable end-of-life survey (the VOICES-ESRD/Thai questionnaire) provides an essential instrument for evaluating health services provision for older persons with ESRD in Thailand. The focus of the questionnaire was on symptoms, symptom control, and the health care needs of older people with ESRD who are managed without dialysis. Objective: The objective of this study was to develop and adapt VOICES to make it suitable for use in a population survey in Thailand. Methods: A mixed-methods exploratory sequential design was used, focused on modifying an existing instrument. Data collection: A cognitive interviewing technique was implemented, using two cycles of data collection with a sample of 10 bereaved carers and a prototype of the Thai VOICES questionnaire. A qualitative study was used to modify the questionnaire. Data analysis: The data were analysed using content analysis. Results: Revisions to the prototype questionnaire were made. The results were used to adapt the VOICES questionnaire for use in a population-based survey of older ESRD patients in Thailand. Conclusions: A culturally specific questionnaire was generated during this second phase, and issues with questionnaire design were rectified.

Keywords: VOICES-ESRD/Thai questionnaire, cognitive interviewing, end of life survey, health services provision, older persons with ESRD

Procedia PDF Downloads 264
6813 Feature Location Restoration for Under-Sampled Photoplethysmogram Using Spline Interpolation

Authors: Hangsik Shin

Abstract:

The purpose of this research is to restore the feature locations of an under-sampled photoplethysmogram using spline interpolation and to investigate the feasibility of feature shape restoration. We obtained a 10 kHz-sampled photoplethysmogram and decimated it to generate under-sampled datasets with sampling frequencies of 5 kHz, 2.5 kHz, 1 kHz, 500 Hz, 250 Hz, 25 Hz, and 10 Hz. To investigate the restoration performance, we interpolated the under-sampled signals back to 10 kHz and then compared their feature locations with those of the original 10 kHz-sampled photoplethysmogram. The features were the upper and lower peaks of the photoplethysmogram waveform. Results showed that the time differences were dramatically decreased by interpolation. The location error was less than 1 ms for both feature types. In the 10 Hz-sampled case, the location error also decreased considerably; however, it was still over 10 ms.
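The restoration idea can be illustrated with a short sketch (synthetic waveform and sampling rates chosen for illustration, not the study's data): decimate a finely sampled pulse, interpolate it back with a cubic spline, and compare the peak locations.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# 10 kHz "reference" pulse, a decimated copy, and spline interpolation back to 10 kHz.
fs_ref, fs_low = 10_000, 250
t_ref = np.arange(0, 1.0, 1 / fs_ref)
ppg_ref = np.sin(2 * np.pi * 1.2 * t_ref) + 0.3 * np.sin(2 * np.pi * 2.4 * t_ref)

t_low = t_ref[:: fs_ref // fs_low]            # decimate to 250 Hz
ppg_low = ppg_ref[:: fs_ref // fs_low]

ppg_rec = CubicSpline(t_low, ppg_low)(t_ref)  # spline-restored 10 kHz signal

# Compare the upper-peak location before and after restoration
peak_ref = t_ref[np.argmax(ppg_ref)]
peak_low = t_low[np.argmax(ppg_low)]
peak_rec = t_ref[np.argmax(ppg_rec)]
print(f"peak error raw: {abs(peak_low - peak_ref)*1e3:.2f} ms, "
      f"restored: {abs(peak_rec - peak_ref)*1e3:.2f} ms")
```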

Keywords: peak detection, photoplethysmography, sampling, signal reconstruction

Procedia PDF Downloads 342
6812 Maximum Initial Input Allowed to Iterative Learning Control Set-up Using Singular Values

Authors: Naser Alajmi, Ali Alobaidly, Mubarak Alhajri, Salem Salamah, Muhammad Alsubaie

Abstract:

Iterative Learning Control (ILC) is known as a control tool for overcoming periodic disturbances in repetitive systems. The technique requires the error signal to tend to zero as the number of operations increases. The learning process in this context is strongly dependent on the initial input, which, if selected properly, makes learning more effective compared to the case where the system starts blind. ILC uses previously recorded execution data to update the following execution/trial input such that a reference trajectory is followed to high accuracy. Error convergence in ILC is generally highly dependent on the input applied to the plant for trial 1; thus, a good choice of initial input signal makes learning faster and, as a consequence, the error tends to zero faster as well. In the work presented here, an upper limit based on the singular value (SV) principle is derived for the initial input signal applied at trial 1, such that the system follows the reference in fewer trials without responding aggressively or exceeding the working envelope within which the system is required to move (in a robot arm, for example). Simulation results illustrate the theory introduced in this paper.
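A minimal sketch of the kind of bound involved, under the assumption of a lifted (super-vector) plant description y = Gu over one trial (the paper's exact derivation is not reproduced): since ||y|| ≤ σmax(G)·||u||, limiting the norm of the trial-1 input to y_limit/σmax(G) keeps the first-trial response inside a prescribed envelope. The impulse response and envelope value below are hypothetical.

```python
import numpy as np

N = 50                                   # samples per trial
g = np.exp(-0.1 * np.arange(N))          # illustrative impulse response
# Lower-triangular lifted-plant matrix G so that y = G @ u over one trial
G = np.array([[g[i - j] if i >= j else 0.0 for j in range(N)] for i in range(N)])

sigma_max = np.linalg.svd(G, compute_uv=False)[0]   # largest singular value
y_limit = 2.0                                       # hypothetical envelope on ||y||
u1_max_norm = y_limit / sigma_max                   # allowed norm of the trial-1 input
print(f"largest singular value: {sigma_max:.3f}, "
      f"allowed norm of initial input: {u1_max_norm:.3f}")
```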

Keywords: initial input, iterative learning control, maximum input, singular values

Procedia PDF Downloads 215
6811 The Non-Existence of Perfect 2-Error Correcting Lee Codes of Word Length 7 over Z

Authors: Catarina Cruz, Ana Breda

Abstract:

Tiling problems have been capturing the attention of many mathematicians due to their real-life applications. In this study, we deal with tilings of Zⁿ by Lee spheres, where n is a positive integer, these tilings being related to error correcting codes for the transmission of information over a noisy channel. We focus our attention on the question ‘for what values of n and r does the n-dimensional Lee sphere of radius r tile Zⁿ?’. It seems that the n-dimensional Lee sphere of radius r does not tile Zⁿ for n ≥ 3 and r ≥ 2. Here, we prove that it is not possible to tile Z⁷ with Lee spheres of radius 2, presenting a proof based on a combinatorial method and faithful to the geometric idea of the problem. The non-existence of such tilings has been studied by several authors, with the most difficult cases considered to be those in which the radius of the Lee spheres is equal to 2. The relation between these tilings and error correcting codes is established by considering the center of a Lee sphere as a codeword and the other elements of the sphere as words which are decoded to the central codeword. When the Lee spheres of radius r centered at the elements of a set M ⊂ Zⁿ tile Zⁿ, M is a perfect r-error correcting Lee code of word length n over Z, denoted by PL(n, r). Our strategy to prove the non-existence of PL(7, 2) codes is based on the assumption that such a code M exists. Without loss of generality, we suppose that O ∈ M, where O = (0, ..., 0). In this sense, and taking into account that we are dealing with Lee spheres of radius 2, O covers all words which are distant two or fewer units from it. By the definition of a PL(7, 2) code, each word which is distant three units from O must be covered by a unique codeword of M. These words have to be covered by codewords which are distant five units from O. We prove the non-existence of PL(7, 2) codes by showing that it is not possible to cover all the referred words without superposition of Lee spheres whose centers are distant five units from O, contradicting the definition of a PL(7, 2) code. We achieve this contradiction by combining the cardinalities of particular subsets of codewords which are distant five units from O. There exists an extensive literature on codes in the Lee metric. Here, we present a new approach to prove the non-existence of PL(7, 2) codes.
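For context, a standard counting identity (not part of the authors' proof) gives the size of the radius-2 Lee sphere in Zⁿ; a perfect PL(7, 2) code would have to partition Z⁷ into disjoint translates of a sphere of this size:

```latex
|B_n(2)| \;=\; \sum_{k=0}^{2} 2^{k}\binom{n}{k}\binom{2}{k}
        \;=\; 1 + 4n + 2n(n-1) \;=\; 2n^{2} + 2n + 1,
\qquad
|B_7(2)| \;=\; 2\cdot 7^{2} + 2\cdot 7 + 1 \;=\; 113 .
```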

Keywords: Golomb-Welch conjecture, Lee metric, perfect Lee codes, tilings

Procedia PDF Downloads 134
6810 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The possibility of human errors may be low, but the risk they carry can be unimaginably enormous. Thus, for accident prevention, it is quite indispensable to analyze the influence of any factors which may raise the possibility of human errors. Over the past decades, many research results have shown that the performance of human operators may vary over time due to a variety of factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. To date, quite a few assessment tools have been developed to assess the stress level of human workers. However, it is still questionable whether they can be utilized to anticipate human performance as it relates to human error possibility, because they were mainly developed from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time. In that sense, if these tools are to be applicable to safety, they should at least be able to assess the variation resulting from work time. Therefore, this study aimed to compare their applicability for safety purposes. More than 10 kinds of work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be closely related to work stress. The results showed that most tools mainly placed their weight on some common organizational factors such as demands, supports, and relationships, in that sequence. Their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall countermeasures within a PDCA cycle or risk management activities, which are far from practical human error prevention. Thus, it was concluded that the application of stress assessment tools mainly developed for mental health seems impractical for safety purposes with respect to human performance anticipation, and that development of a new assessment tool is inevitable if anyone wants to assess stress level in terms of human performance variation and accident prevention. As a consequence, as a practical countermeasure, this study proposes a new scheme for assessment of the work stress level of a human operator that may vary over work time, which is closely related to the possibility of human errors.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 647
6809 Banking Sector Development and Economic Growth: Evidence from the State of Qatar

Authors: Fekri Shawtari

Abstract:

The banking sector plays a crucial role in the economic development of a country. As a financial intermediary, it is assigned a great role in economic growth and stability. This paper aims to examine empirically the relationship between the banking industry and economic growth in the State of Qatar. We adopt the vector error correction model (VECM) along with Granger causality tests to address the long-run and short-run relationships between the banking sector and economic growth. It is expected that the results will give policymakers directions for strategies that are conducive to boosting development and achieving the targeted economic growth in the current situation.
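As an illustration of the modelling framework named above, the sketch below fits a VECM with one cointegrating relation and runs a Granger causality test in Python's statsmodels; the two series are synthetic random walks standing in for (log) GDP and a banking-depth indicator, not Qatari data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# Hypothetical quarterly series sharing a stochastic trend (illustrative only).
rng = np.random.default_rng(42)
n = 80
common = np.cumsum(rng.normal(size=n))
df = pd.DataFrame({
    "log_gdp":    common + rng.normal(scale=0.3, size=n),
    "log_credit": 0.8 * common + rng.normal(scale=0.3, size=n),
})

# VECM with one cointegrating relation; alpha gives the speed-of-adjustment
# coefficients back to the long-run relation, beta the cointegrating vector.
model = VECM(df, k_ar_diff=1, coint_rank=1, deterministic="co")
res = model.fit()
print(res.alpha)
print(res.beta)
print(res.test_granger_causality(caused="log_gdp", causing="log_credit").summary())
```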

Keywords: economic growth, banking sector, Qatar, vector error correction model, VECM

Procedia PDF Downloads 147
6808 Virtual Assessment of Measurement Error in the Fractional Flow Reserve

Authors: Keltoum Chahour, Mickael Binois

Abstract:

Due to a lack of standardization during the invasive fractional flow reserve (FFR) procedure, the index is subject to many sources of uncertainty. In this paper, we investigate, through simulation, the effect of the FFR device position and configuration on the obtained value of the FFR fraction. For this purpose, we use computational fluid dynamics (CFD) in a 3D domain corresponding to a diseased arterial portion. The FFR pressure sensor is introduced inside it with a given length and coefficient of bending to capture the FFR value. To get around the computational limitations (the simulation time is about 2 h 15 min for one FFR value), we generate a Gaussian process (GP) surrogate model for FFR prediction. The GP model shows good accuracy and quantifies the effective measurement error created by the random configuration of the pressure sensor.
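A surrogate of this kind can be sketched in a few lines; the sensor configurations, FFR values, and kernel choice below are hypothetical placeholders for the expensive CFD runs described in the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Each row is a sensor configuration (insertion length, bending coefficient);
# y is the FFR value that one expensive CFD run would return (made-up values).
X = np.array([[5.0, 0.1], [7.5, 0.3], [10.0, 0.2], [12.5, 0.5],
              [15.0, 0.4], [17.5, 0.6], [20.0, 0.7], [22.5, 0.9]])
y = np.array([0.86, 0.84, 0.83, 0.80, 0.81, 0.78, 0.77, 0.75])

kernel = ConstantKernel(1.0) * RBF(length_scale=[5.0, 0.3])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict FFR (with uncertainty) for an unseen configuration in milliseconds
# instead of hours of CFD.
mean, std = gp.predict(np.array([[11.0, 0.35]]), return_std=True)
print(f"predicted FFR: {mean[0]:.3f} +/- {std[0]:.3f}")
```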

Keywords: fractional flow reserve, Gaussian processes, computational fluid dynamics, drift

Procedia PDF Downloads 102
6807 Electronic Patient Record (EPR) System in South Africa: Results of a Pilot Study

Authors: Temitope O. Tokosi, Visvanathan Naicker

Abstract:

Patient health records contain sensitive information which an electronic patient record (EPR) system can securely store and transmit amongst clinicians to improve health delivery. Clinicians’ usage behaviour with these systems is under scrutiny to assess their attitudes towards health technology. South African (SA) clinicians responded to a pilot survey to assess their understanding of EPR and which attributes are important to technology use, and, more importantly, to streamline the survey for a larger study. Descriptive statistics using mean scores were used because of the small sample size of 11 clinicians who completed the survey. Nine (9) constructs comprising 62 items were used, and a Cronbach alpha score of 0.883 was obtained. Limitations and discussion conclude the study.
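For reference, the internal-consistency statistic reported above can be computed with a few lines of Python; the helper follows the standard Cronbach's alpha formula, and the 11 × 5 response matrix is hypothetical (the study's 62-item instrument is not reproduced).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    Standard formula: alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert responses (11 respondents x 5 items), just to show usage.
rng = np.random.default_rng(7)
base = rng.integers(2, 6, size=(11, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(11, 5)), 1, 5)
print(round(cronbach_alpha(scores.astype(float)), 3))
```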

Keywords: EPR, clinicians, pilot study, South Africa

Procedia PDF Downloads 243
6806 Inter-Departmental Survey to Check the Impact of Bio-Safety Training Sessions among Lab Employees

Authors: Noorulaine Maqsood, Saeed Khan

Abstract:

Background: Concern regarding incident reporting and bio-safety training in clinical laboratories in Pakistan has increased remarkably in the last few years due to rapid increase in diagnosis and research on infectious organisms. In order to ensure the safety of employees, this issue needs to be addressed immediately. Bio-safety training sessions and lectures are necessary for the protection of laboratory workers in order to ensure safe practices and minimize the count of incident reporting in the lab. Objective: To carry out an inter-departmental survey in lab regarding the awareness of bio-safety practices among lab employees before and after conducting bio-safety training sessions. Methodology: We conducted a 30 questions survey of laboratory workers in June 2013 (before training session) to gather information related to bio-safety awareness. Afterwards, we conducted another survey after training sessions and workshops related to bio-safety. Result: The survey regarding bio-safety level showed that before the training session 32% of the participants were aware of bio-safety level being used in their lab whereas after the session this percentage increased to 72%. 48% of the participants had information about the proper usage of PPE which increased to 76%. Awareness regarding proper management of hazardous waste increased from 32% to 64%. The incident reporting practice, sample handling and hand hygiene awareness was previously reported to be 40%, 65%, and 52% that increased to 80%, 85% and 88% respectively after the training session was completed. Conclusion: The first survey results showed lack of awareness that suggest nearly all senior scientists, faculty, medical technologist, lab attendant and housekeeping staff working in laboratories are required to have bio-safety training, and required inspection at least twice a year by a bio-safety officer and also required to renew their bio-safety training. After the training session, significant changes in awareness level and attitude of the participants regarding biosafety practices were observed. Therefore, such bio-safety sessions should be carried out regularly in clinical laboratories.

Keywords: biosafety practices, clinical laboratory, Pakistan, survey

Procedia PDF Downloads 402
6805 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images

Authors: Elham Bagheri, Yalda Mohsenzadeh

Abstract:

Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is fine-tuned for one epoch with a batch size of one, attempting to create a scenario similar to human memorability experiments where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, which is quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. The reconstruction error of each image, the error reduction, and its distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate that there is a strong correlation between the reconstruction error and the distinctiveness of images and their memorability scores. This suggests that images with more unique, distinctive features that challenge the autoencoder's compressive capacities are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably due to having features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
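The two image-level measures described above (reconstruction error and latent-space distinctiveness) can be written down compactly; the sketch below assumes the autoencoder's reconstructions and latent codes are already available as arrays and uses random placeholders for them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_images, latent_dim, n_pixels = 100, 128, 64 * 64 * 3
originals       = rng.random((n_images, n_pixels))
reconstructions = originals + rng.normal(scale=0.05, size=(n_images, n_pixels))
latents         = rng.normal(size=(n_images, latent_dim))

# Reconstruction error: mean squared difference between original and reconstruction.
recon_error = np.mean((originals - reconstructions) ** 2, axis=1)

# Distinctiveness: Euclidean distance to the nearest other image in latent space.
dists = np.linalg.norm(latents[:, None, :] - latents[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)
distinctiveness = dists.min(axis=1)

# These per-image scores would then be correlated with memorability scores,
# e.g. via a rank correlation.
print(recon_error[:3], distinctiveness[:3])
```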

Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception

Procedia PDF Downloads 47
6804 Biomass and Carbon Stock Estimates of Woodlands in the Southeastern Escarpment of Ethiopian Rift Valley: An Implication for Climate Change Mitigation

Authors: Sultan Haji Shube

Abstract:

Woodland ecosystems of the semiarid rift valley of Ethiopia play a significant role in climate change mitigation by sequestering and storing carbon. This study was conducted in the Gidabo river sub-basin on the southeastern rift-valley escarpment of Ethiopia. It aims to estimate the biomass and carbon stocks of woodlands and their implications for climate change mitigation. A total of 44 sampling plots (900 m² each) were systematically laid in the woodland for vegetation and environmental data collection. A composite soil sample was taken from five locations in each main plot. Both disturbed and undisturbed soil samples were taken at two depths using a soil auger and a core-ring sampler, respectively. An allometric equation was used to estimate aboveground biomass (AGB), while the root-to-shoot ratio method and the Walkley-Black method were used for belowground biomass (BGB) and soil organic carbon (SOC), respectively. Results revealed that the total biomass of the study site was 17.05 t/ha, of which 14.21 t/ha was AGB and 2.84 t/ha was BGB. Moreover, a total carbon stock of 2,224.7 t/ha was accumulated, with an equivalent carbon dioxide of 8,164.65 t/ha. This study also revealed that more carbon was accumulated in the soil than in the biomass. Both aboveground and belowground carbon stocks decreased with increasing altitude, while SOC stocks increased. The AGC and BGC stocks were higher in the lower slope classes, whereas SOC stocks were higher in the higher slope classes than in the lower slopes. Higher carbon stock was obtained from woody plants with a DBH of >16 cm situated in plots facing northwest. Overall, the study results add information about the carbon stock potential of the woodland that will serve as a baseline scenario for further research, policymakers, and land managers.

Keywords: allometric equation, climate change mitigation, soil organic carbon, woodland

Procedia PDF Downloads 59
6803 The Types of Annuities with Flexible Premium

Authors: Deniz Ünal Özpalamutcu, Burcu Altman

Abstract:

Actuarial science uses mathematics, statistics, and financial information when analyzing the financial impacts of uncertainties, risks, insurance, and pension-related issues. In other words, it deals with the likelihood of potential risks, their financial impacts, and especially the financial measures. Handling these measures requires long-term payments and investments. It is therefore inevitable to plan periodic payments at equal time intervals, considering also the changing value of money over time. Such a series of payments made at specific intervals of time is called an annuity (or rant). In the literature, annuities are classified based on start and end dates, start times, payment times, payment amounts, or frequency. Classification based on payment amounts distinguishes constant, descending, and ascending payment methods. The literature on handling annuities is very limited, yet in daily life, especially in today's world where economic issues have gained prominence, it is crucial to use variable annuity methods in line with customers' demands. In this study, the types of annuities with flexible payments are discussed. In other words, we focus on calculating each period's payment by adding a certain percentage of the previous period's payment. While studying this problem, formulas were derived for both beginning-of-period and end-of-period payments, for both the present (cash) value and the accumulated value. The problem of annuities in which each period's payment increases over the previous period's payment at an interest rate r has also been analysed. Present value and accumulated value calculations for this problem were studied separately for beginning- and end-of-period payments, and their relations were expressed by formulas.
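For reference (a standard actuarial result, stated here in generic notation rather than the authors'), the present and accumulated values of an end-of-period annuity whose payments grow geometrically at rate r per period, valued at effective interest rate i ≠ r, are:

```latex
PV \;=\; \sum_{k=1}^{n} \frac{P\,(1+r)^{k-1}}{(1+i)^{k}}
    \;=\; P\,\frac{1-\left(\dfrac{1+r}{1+i}\right)^{n}}{i-r},
\qquad
AV \;=\; PV\,(1+i)^{n} \;=\; P\,\frac{(1+i)^{n}-(1+r)^{n}}{i-r}.
```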

Keywords: actuaria, annuity, flexible payment, rant

Procedia PDF Downloads 188
6802 Financial Inclusion for Inclusive Growth in an Emerging Economy

Authors: Godwin Chigozie Okpara, William Chimee Nwaoha

Abstract:

The paper sets out to show how a financial inclusion index can be calculated and also investigates the impact of inclusive finance on inclusive growth in an emerging economy. In the light of these objectives, the chi-wins method was used to calculate indices of financial inclusion, while co-integration and an error correction model were used to evaluate the impact of financial inclusion on inclusive growth. The results of the analysis revealed that financial inclusion, while having a long-run relationship with GDP growth, is an insignificant function of the growth of the economy. The speed of adjustment is correctly signed and significant. On the basis of these results, the researchers call for tireless efforts by the government and the banking sector in promoting financial inclusion in developing countries.

Keywords: chi-wins index, co-integration, error correction model, financial inclusion

Procedia PDF Downloads 629
6801 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini

Abstract:

A considerable part of the rainfall data used in hydrological practice is available in aggregated form over constant time intervals. This can produce undesirable effects, such as the underestimation of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is producing effects on extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d = ta, the estimate of a single Hd value can be affected by an underestimation error of up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation error values follow an exponential probability density function; 3) each very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; 5) these equations should make it possible to improve the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
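The mechanism behind the underestimation is easy to reproduce with synthetic data: the true maximum depth for a duration d is found with a sliding window over a fine-resolution series, whereas coarsely aggregated data only allow fixed, non-overlapping windows of width ta, which can straddle the true maximum when d = ta. The toy series below is illustrative and unrelated to the Italian records.

```python
import numpy as np

rng = np.random.default_rng(3)
rain_1min = rng.gamma(shape=0.05, scale=2.0, size=60 * 24 * 30)  # one month of 1-min depths (mm)

d = 60  # duration of interest, in minutes (here d = ta)
# True maximum d-minute depth: sliding window over the fine-resolution series
sliding_max = np.convolve(rain_1min, np.ones(d), mode="valid").max()
# Aggregated estimate: fixed, non-overlapping d-minute windows only
fixed_max = rain_1min[: len(rain_1min) // d * d].reshape(-1, d).sum(axis=1).max()

print(f"true Hd: {sliding_max:.2f} mm, aggregated estimate: {fixed_max:.2f} mm, "
      f"underestimate: {100 * (1 - fixed_max / sliding_max):.1f}%")
```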

Keywords: central Italy, extreme events, rainfall data, underestimation errors

Procedia PDF Downloads 165
6800 MCERTL: Mutation-Based Correction Engine for Register-Transfer Level Designs

Authors: Khaled Salah

Abstract:

In this paper, we present MCERTL (a mutation-based correction engine for RTL designs) as an automatic error correction technique based on mutation analysis. A mutation-based correction methodology is proposed to automatically fix erroneous RTL designs. The proposed strategy combines the processes of mutation and assertion-based localization. The erroneous statements are mutated to produce possible fixes for the failed RTL code. A concurrent mutation engine is proposed to mitigate the computational cost of running mutant operators sequentially. The proposed methodology is evaluated on several benchmarks. The experimental results demonstrate that our proposed method enables us to automatically locate and correct multiple bugs in reasonable time.

Keywords: bug localization, error correction, mutation, mutants

Procedia PDF Downloads 253
6799 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions

Authors: Hannah F. Opayinka, Adedayo A. Adepoju

Abstract:

This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions in which modified m-out-of-n (mmoon) was proposed as an alternative method to the existing moon technique. In this study, both moon and mmoon techniques were applied to two real income datasets which followed Lognormal and Pareto distributions respectively with finite variances. The performances of these two techniques were compared using Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed moon bootstrap in terms of smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
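To make the baseline concrete, the sketch below implements the classical m-out-of-n ("moon") bootstrap for the standard error of a statistic; the authors' modified version (mmoon) is not specified in the abstract and is not reproduced, and the subsample size m = n^0.7 and the Pareto-like income sample are illustrative assumptions.

```python
import numpy as np

def moon_bootstrap(data, m, n_boot=2000, stat=np.mean, seed=0):
    """Classical m-out-of-n bootstrap: resample m < n observations with replacement
    and return the bootstrap standard error of the statistic."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    stats = [stat(rng.choice(data, size=m, replace=True)) for _ in range(n_boot)]
    return np.std(stats, ddof=1)

# Example with a heavy-tailed (Pareto-like) income sample
rng = np.random.default_rng(1)
incomes = (rng.pareto(a=2.5, size=500) + 1) * 20_000
n = len(incomes)
print(moon_bootstrap(incomes, m=int(n ** 0.7)))   # m = n^0.7 is an illustrative choice
```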

Keywords: Bootstrap, income data, lognormal distribution, Pareto distribution

Procedia PDF Downloads 161
6798 The Study of Information Uses Behaviour of Tourists in Songkhla Province, Thailand

Authors: Patraporn Kaewkhanitarak, Suchada Srichuar, Narawat Kanjanapan

Abstract:

This research is survey research. Its purpose is to study the information use behaviour and problems of tourists in Songkhla Province. The tool used in this study was a structured questionnaire with a standardized 5-level rating scale. The 400 participants were selected by convenience sampling, with the sample size determined by the Taro Yamane method (allowable error 5%). Data were collected over 6 months, from January to June 2014. The results show that the type of information tourists most often use to plan their trip is the internet (x̅ = 3.81), and the most popular topic is restaurants (x̅ = 3.77). Tourists found that booking or buying services over the internet provided more affordable prices and let them select an appropriate plan themselves. The most convenient source of information that tourists often use is the internet and websites (x̅ = 3.69). Nevertheless, they reported that tourist information sources in Songkhla Province are lacking, with insufficient tourist organizations providing tourism-related information and services.
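For reference, the Taro Yamane sample-size formula mentioned above is n = N/(1 + Ne²); with an allowable error of e = 0.05 and a large population N, this tends to 1/e² = 400, consistent with the 400 participants:

```latex
n \;=\; \frac{N}{1 + N e^{2}}
\;\xrightarrow{\;N \to \infty\;}\; \frac{1}{e^{2}}
\;=\; \frac{1}{0.05^{2}} \;=\; 400 .
```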

Keywords: information, behavior, tourists, Thailand

Procedia PDF Downloads 229
6797 Tourism Potential of Kyrgyzstan and Contribution of Ethics to It's Tourism Growth

Authors: Halil Koch

Abstract:

This article discusses the current tourism potential of Kyrgyzstan as well as the factors that may affect it. Kyrgyzstan is a unique country that can offer quite different alternatives for tourism with its unique nature, lakes, mountains, history, and rich culture. Despite having so many alternatives, Kyrgyzstan today cannot use this unique wealth as it should. This article tries to deal with matters that can increase the tourism potential of Kyrgyzstan. In addition, the contribution of ethical rules to the tourism potential of Kyrgyzstan is discussed. A detailed literature review was carried out on the tourism industry and tourism potential of Kyrgyzstan. After the literature review, a survey was conducted with the owners and employees of tourism businesses in the Issyk Kul region of Kyrgyzstan in order to determine the factors that might improve the tourism potential and the effect of ethical rules on Kyrgyzstan's tourism. 100 people participated in the survey. According to the results of the survey, the participants think that the culture, touristic richness, and unique nature of Kyrgyzstan are not promoted effectively. Participants think that Kyrgyzstan's tourism capacity will increase with the effective implementation of ethical rules as well as the effective promotion of Kyrgyzstan's cultural and natural wealth. They also think that the tourism sector in Kyrgyzstan will develop rapidly if ethical rules are followed as much as possible from the first moment that tourists set foot in the country. Participants predict that ethical rules have a tremendous impact on Kyrgyzstan's tourism. It was also revealed that there is currently no systematic approach to ethical rules.

Keywords: tourism, ethics, growth, economy

Procedia PDF Downloads 93
6796 Comparison between Separable and Irreducible Goppa Code in McEliece Cryptosystem

Authors: Newroz Nooralddin Abdulrazaq, Thuraya Mahmood Qaradaghi

Abstract:

The McEliece cryptosystem is an asymmetric type of cryptography based on error correction codes. The classical McEliece used an irreducible binary Goppa code, which has been considered unbreakable until now, especially with the parameters [1024, 524, 101], but it suffers from a large public key matrix, which makes it difficult to use practically. In this work, irreducible and separable Goppa codes have been introduced. The irreducible and separable Goppa codes used have flexible parameters and dynamic error vectors. A comparison between separable and irreducible Goppa codes in the McEliece cryptosystem has been carried out. For the encryption stage, to obtain better results for comparison, two types of test were chosen: in the first, the random message is held constant while the parameters of the Goppa code are changed; in the second, the parameters of the Goppa code are constant (m = 8 and t = 10) while the random message is changed. The results show that the time needed to calculate the parity check matrix is higher for the separable type than for the irreducible McEliece cryptosystem, which is an expected result since an extra parity check matrix must be calculated for g²(z) in the separable decryption process, while the time needed to execute the error locator in the decryption stage is better for the separable type than for the irreducible type. The proposed implementation was done in Visual Studio C#.

Keywords: McEliece cryptosystem, Goppa code, separable, irreducible

Procedia PDF Downloads 239
6795 Food Insecurity Determinants Amidst the Covid-19 Pandemic: An Insight from Huntsville, Texas

Authors: Peter Temitope Agboola

Abstract:

Food insecurity continues to affect a large number of U.S. households during the coronavirus (COVID-19) pandemic. The pandemic has threatened people's livelihoods, making them vulnerable to severe hardship, and has had an unanticipated impact on the U.S. economy. This study attempts to identify the food insecurity status of households and the determinant factors driving household food insecurity. Additionally, it attempts to discover the mitigation measures adopted by households during the pandemic in the city of Huntsville, Texas. A structured online sample survey was used to collect data, with a household expenditure survey used to evaluate the food security status of households. Most survey respondents disclosed that the COVID-19 pandemic had affected their life and source of income. The main analytical tools used for the study are descriptive statistics and logistic regression modeling. A logistic regression model was used to determine the factors responsible for food insecurity in the study area. The results revealed that most households in the study area are food secure, with the remainder being food insecure.

Keywords: food insecurity, household expenditure survey, COVID-19, coping strategies, food pantry

Procedia PDF Downloads 183
6794 Selection of Rayleigh Damping Coefficients for Seismic Response Analysis of Soil Layers

Authors: Huai-Feng Wang, Meng-Lin Lou, Ru-Lin Zhang

Abstract:

One good analysis method in seismic response analysis is direct time integration, which widely adopts Rayleigh damping. An approach is presented for the selection of Rayleigh damping coefficients to be used in seismic analyses to produce a response that is consistent with the modal damping response. In the presented approach, an expression for the error in peak response, obtained through the complete quadratic combination (CQC) method, is set up as a function of the Rayleigh damping coefficients, and the coefficients are then obtained by minimizing this error. Two finite element models of soil layers, excited by 28 seismic waves, were used to demonstrate the feasibility and validity of the approach.
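For context, the conventional baseline for choosing the two Rayleigh coefficients anchors the modal damping ratio at two frequencies; the paper's CQC-error-minimizing selection is different and is not reproduced here. The anchor frequencies and target ratio in the sketch below are illustrative.

```python
import numpy as np

def rayleigh_coefficients(xi, omega_i, omega_j):
    """Classical two-frequency fit for Rayleigh damping C = a*M + b*K, where the
    modal damping ratio is xi(w) = a/(2w) + b*w/2.

    Conventional baseline selection; the paper's error-minimizing selection differs.
    """
    a = 2.0 * xi * omega_i * omega_j / (omega_i + omega_j)
    b = 2.0 * xi / (omega_i + omega_j)
    return a, b

# Example: 5% target damping anchored at 1 Hz and 10 Hz (illustrative frequencies)
w1, w2 = 2 * np.pi * 1.0, 2 * np.pi * 10.0
a, b = rayleigh_coefficients(0.05, w1, w2)
print(f"alpha = {a:.4f} 1/s, beta = {b:.6f} s")

# Resulting damping ratio across frequency: exact at w1 and w2,
# lower in between, higher outside that band.
w = np.linspace(w1, w2, 5)
print(np.round(a / (2 * w) + b * w / 2, 4))
```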

Keywords: Rayleigh damping, modal damping, damping coefficients, seismic response analysis

Procedia PDF Downloads 416
6793 Media Façades in the Wild: Some Lessons

Authors: Hai-Ning Liang, Xiaowei Dai, Nancy Diniz, Charles Fleming, Woon Kian Chong

Abstract:

Media displays in public areas are becoming increasingly pervasive: they are used in many settings, come in different sizes, serve different purposes, and have varied degrees of interactivity. In this paper, we aim to provide a survey of how these displays, often named media façades, are used in the wild in a city in China which is undergoing rapid growth. This survey is intended to raise greater awareness and discussion about the use and effect of these displays in public areas. Through this survey, we have been able to distill some lessons about what is good, bad, and ugly in some current examples of media displays used in a city that is transitioning into a modern one and is located in one of the fastest growing areas in Asia. With this research, we hope to provide technology designers and architects with some general principles that can help them integrate these types of technologies into their architectural creations.

Keywords: large displays, media façades, interaction design, architectural displays

Procedia PDF Downloads 372
6792 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, and used artificial intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions to collect data (temperature, relative humidity, and pressure). The acquired data were used to train ANN and ARIMA models in the MATLAB R2012b environment to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), coefficient of determination (R²), and Mean Percentage Error (MPE) were deployed as standardized evaluation metrics to assess the performance of the models in predicting precipitation. The results from the developed device show that it has an efficiency of 96% and is compatible with personal computers (PCs) and laptops. The simulation results for the acquired data show that the ANN model's rainfall prediction for two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
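As an illustration of the statistical branch of the prediction step, the sketch below fits an ARIMA model to a synthetic monthly rainfall series in Python's statsmodels and forecasts two months ahead; the series, model order, and horizon are assumptions, and the study's MATLAB/ANN pipeline is not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly rainfall totals (mm); illustrative only, not FUTA data.
rng = np.random.default_rng(5)
months = pd.date_range("2012-01", periods=60, freq="MS")
rain = pd.Series(80 + 40 * np.sin(2 * np.pi * np.arange(60) / 12)
                 + rng.normal(scale=15, size=60), index=months)

# Fit a simple ARIMA(1,0,1) model and forecast the next two months,
# mirroring the two-month (May-June) evaluation window in the abstract.
model = ARIMA(rain, order=(1, 0, 1)).fit()
forecast = model.forecast(steps=2)
print(forecast.round(1))
```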

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 66
6791 Airport Pavement Crack Measurement Systems and Crack Density for Pavement Evaluation

Authors: Ali Ashtiani, Hamid Shirazi

Abstract:

This paper reviews the status of existing practice and research related to measuring pavement cracking and using crack density as a pavement surface evaluation protocol. Crack density for pavement evaluation is currently not widely used within the airport community, and its use by the highway community is limited. However, surface cracking is a distress that is closely monitored by airport staff and significantly influences the development of maintenance, rehabilitation, and reconstruction plans for airport pavements. Therefore, crack density has the potential to become an important indicator of pavement condition if the type, severity, and extent of surface cracking can be accurately measured. A pavement distress survey is an essential component of any pavement assessment. Manual crack surveying has been widely used for decades to measure pavement performance. However, the accuracy and precision of manual surveys can vary depending upon the surveyor, and performing surveys may disrupt normal operations. Given the variability of manual surveys, this method has shown inconsistencies in distress classification and measurement. This can potentially impact the planning for pavement maintenance, rehabilitation, and reconstruction and the associated funding strategies. A substantial effort has been devoted over the past 20 years to reducing human intervention and the error associated with it by moving toward automated distress collection methods. Automated methods refer to systems that identify, classify, and quantify pavement distresses through processes that require no or very minimal human intervention. This principally involves the use of digital recognition software to analyze and characterize pavement distresses. The lack of established protocols for the measurement and classification of pavement cracks captured using digital images is a challenge to developing a reliable automated system for distress assessment. Variations in the types and severity of distresses, different pavement surface textures and colors, and the presence of pavement joints and edges all complicate automated image processing and crack measurement and classification. This paper summarizes the commercially available systems and technologies for automated pavement distress evaluation. A comprehensive automated pavement distress survey involves the collection, interpretation, and processing of surface images to identify the type, quantity, and severity of surface distresses. The outputs can be used to quantitatively calculate the crack density. The systems for automated distress surveys using digital images reviewed in this paper can assist the airport industry in the development of a pavement evaluation protocol based on crack density. Analysis of automated distress survey data can lead to a crack density index. This index can be used as a means of assessing pavement condition and predicting pavement performance. This can be used by airport owners to determine the type of pavement maintenance and rehabilitation in a more consistent way.

Keywords: airport pavement management, crack density, pavement evaluation, pavement management

Procedia PDF Downloads 167
6790 A Secure Survey against Black Hole Attack in MANET

Authors: G. Usha, S. Kannimuthu, K. Mahalakshmi

Abstract:

Mobile Ad hoc Networks (MANETs) are one of the most promising technologies, with applications ranging from various portable devices to military networks. A MANET has no fixed infrastructure, and the security of such a network is a big concern. Therefore, in order to operate MANETs securely, misbehavior and intrusions should be detected before attackers affect the network communication. In this article, we make a comprehensive survey of the black hole attack, a serious threat against MANETs that exploits the routing behavior of the MANET. We give a broad survey of solutions that detect black hole attacks in MANETs. This is achieved by analyzing the techniques involved in detecting the attacks in each scheme. Furthermore, we examine the challenges researchers face in constructing an in-depth solution against the black hole attack.

Keywords: AODV, cross layer security, mobile Adhoc network (MANET), packet delivery ratio, single layer security

Procedia PDF Downloads 384