Search results for: λ-levelwise statistical convergence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4388

4178 Statistical Scientific Investigation of Popular Cultural Heritage in the Relationship between Astronomy and Weather Conditions in the State of Kuwait

Authors: Ahmed M. AlHasem

Abstract:

Kuwaiti society has long been aware of climatic changes and their annual timing and has tried to link them to astronomy in an attempt to forecast future weather conditions. The reason for this concern is that many of the economic, social and living activities of the society depend deeply, both directly and indirectly, on weather conditions. In other words, Kuwaiti society, like many human societies, has in the past tried to predict climatic conditions by linking them to astronomy or to popular sayings that indicate the timing of climate changes. Accordingly, this study is devoted to a scientific investigation based on the statistical analysis of climatic data, in order to assess the accuracy and compatibility of some of the most important elements of this cultural heritage with respect to climate change and to relate them scientifically to precise climatic measurements spanning decades. The research is divided into 10 topics, each focused on one legacy, whether it links climate changes to the appearance or disappearance of a star or relies on a popular saying inherited through generations; the nature and timing of each are explained, and statistical analysis is then used to indicate its degree of accuracy against official climatic data collected since 1962. The study concludes that the relationship between the popular heritage and the actual climatic data is weak and, in some cases, non-existent. Therefore, the popular heritage does not provide a dependable or scientifically reliable basis for forecasting weather conditions.

Keywords: astronomy, cultural heritage, statistical analysis, weather prediction

Procedia PDF Downloads 91
4177 Proteome-Wide Convergent Evolution on Vocal Learning Birds Reveals Insight into cAMP-Based Learning Pathway

Authors: Chul Lee, Seoae Cho, Erich D. Jarvis, Heebal Kim

Abstract:

Vocal learning, the ability to imitate vocalizations based on auditory experience, is a homoplastic character state observed in independent lineages of animals such as songbirds, parrots, hummingbirds and humans. With the recent expansion of avian genome data, it has become possible to perform genome-wide molecular analyses across vocal learners and vocal non-learners. The whole genomes of human and 48 avian species, including those belonging to the three avian vocal learning lineages, were analyzed to determine whether behavioral and neural convergence is associated with molecular convergence in divergent species of vocal learners. Analyses of 8295 orthologous genes across bird species revealed 141 genes with amino acid substitutions specific to vocal learners. Of these, 25 genes carry vocal learner-specific genetic homoplasies, and their functions were enriched for learning. Several sites in these genes are estimated to be under convergent evolution and positive selection. A potential role for a subset of these genes in vocal learning was supported by associations with gene expression profiles in vocal learning brain regions of songbirds and with human diseases that cause language dysfunction. The key candidate gene with multiple independent lines of evidence specific to vocal learners was DRD5. Our findings suggest a cAMP-based learning pathway in avian vocal learners, indicating molecular homoplastic changes associated with a complex behavioral trait, vocal learning.

Keywords: amino acid substitutions, convergent evolution, positive selection, vocal learning

Procedia PDF Downloads 304
4176 Impact of Gaming Environment in Education

Authors: Md. Ataur Rahman Bhuiyan, Quazi Mahabubul Hasan, Md. Rifat Ullah

Abstract:

In this research, we explored the effectiveness of a gaming environment in education and compared it with the traditional education system. We conducted several workshops in both learning environments. We measured students' performance through grading scores assigned by professional academics against different criteria. We also collected data from survey questionnaires to understand students' experiences of education and study. Finally, we examined the impact of the different learning environments by applying statistical hypothesis tests: the t-test and the ANOVA test.
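
A minimal sketch of the kind of hypothesis testing described here, using SciPy's t-test and one-way ANOVA on hypothetical workshop scores; the data, group sizes and significance level are illustrative assumptions, not the study's figures.

```python
# Illustrative sketch: comparing grading scores from the gamified and
# traditional workshops with a t-test and a one-way ANOVA (hypothetical data).
import numpy as np
from scipy import stats

gamified = np.array([78, 85, 90, 72, 88, 81, 93, 76])     # hypothetical scores
traditional = np.array([70, 74, 82, 65, 79, 71, 80, 68])  # hypothetical scores

t_stat, t_p = stats.ttest_ind(gamified, traditional, equal_var=False)  # Welch's t-test
f_stat, f_p = stats.f_oneway(gamified, traditional)                    # one-way ANOVA

print(f"t-test: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"ANOVA:  F = {f_stat:.2f}, p = {f_p:.3f}")
```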

Keywords: gamification, game-based learning, education, statistical analysis, human-computer interaction

Procedia PDF Downloads 187
4175 A Statistical Energy Analysis Model of an Automobile for the Prediction of the Internal Sound Pressure Level

Authors: El Korchi Ayoub, Cherif Raef

Abstract:

Interior noise in vehicles is an essential factor affecting occupant comfort. Over recent decades, much work has been done to develop simulation tools for vehicle NVH. In the mid-to-high frequency range, the statistical energy analysis (SEA) method is particularly effective in predicting the noise and vibration responses of mechanical systems. In this paper, the sound pressure level (SPL) inside an automobile cabin is evaluated numerically using the SEA method. A test was performed on a car cabin using a monopole source as the sound source. The decay rate method was employed to obtain the damping loss factor (DLF) of each subsystem of the developed SEA model. These parameters were then used to predict the sound pressure level in the cabin interior. The results show satisfactory agreement with the directly measured SPL. The developed SEA vehicle model can be used in early design phases and allows the engineer to identify the sources contributing to the total noise and the transmission paths.
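
As an illustration of the decay rate method mentioned above, the sketch below computes a damping loss factor from a measured decay rate using the commonly used relation η = DR/(27.3·f); the band frequencies and decay rates are hypothetical, not the test-car values.

```python
# Minimal sketch of the decay-rate method for the damping loss factor (DLF),
# assuming eta = DR / (27.3 * f) with DR the decay rate in dB/s (hypothetical values).
decay_rates = {500: 38.0, 1000: 55.0, 2000: 90.0}   # dB/s per frequency band

def damping_loss_factor(freq_hz: float, decay_rate_db_per_s: float) -> float:
    """DLF of a subsystem from its measured energy decay rate."""
    return decay_rate_db_per_s / (27.3 * freq_hz)

for f, dr in decay_rates.items():
    print(f"{f} Hz band: DLF = {damping_loss_factor(f, dr):.4f}")
```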

Keywords: SEA, SPL, DLF, NVH

Procedia PDF Downloads 55
4174 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods

Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno

Abstract:

The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc to the west of the collision zone and the Halmahera arc to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone with respect to the distribution of earthquakes in each partition region, determining the type of distribution of earthquake occurrence and the mean occurrence of earthquakes in each partition region, as well as the correlation between partition regions. We calculate the number of earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify the partitioned regions based on the correlation into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.
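
A hedged sketch of the correlation-based classification step is shown below, using Pearson correlation between synthetic annual earthquake counts of partition regions; the "strong"/"very strong" thresholds and the counts are illustrative assumptions, not the catalogue data.

```python
# Illustrative correlation-based classification of partition regions (synthetic counts).
import numpy as np

rng = np.random.default_rng(0)
years = 50
base = rng.poisson(20, years)
counts = {                      # hypothetical annual earthquake counts per region
    "A": base + rng.poisson(3, years),
    "B": base + rng.poisson(5, years),
    "C": rng.poisson(22, years),
}

regions = list(counts)
for i in range(len(regions)):
    for j in range(i + 1, len(regions)):
        r = np.corrcoef(counts[regions[i]], counts[regions[j]])[0, 1]
        label = "very strong" if r >= 0.8 else "strong" if r >= 0.6 else "weaker"
        print(f"{regions[i]}-{regions[j]}: r = {r:.2f} ({label})")
```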

Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management

Procedia PDF Downloads 462
4173 Dispersion Rate of Spilled Oil in Water Column under Non-Breaking Water Waves

Authors: Hanifeh Imanian, Morteza Kolahdoozan

Abstract:

The purpose of this study is to present a mathematical expression for calculating the dispersion rate of spilled oil in the water column under non-breaking waves. To this end, a multiphase numerical model, in which waves and the oil phase are computed concurrently and whose hydraulic calculations have proven accurate, is applied. More than 200 scenarios of oil spilling in wavy waters were simulated with the multiphase numerical model, and the outcomes were collected in a database. The recorded results were examined to identify the major parameters affecting vertical oil dispersion, and six parameters were finally identified as the main independent factors. Furthermore, statistical tests were conducted to identify relationships between the dependent variable (dispersed oil mass in the water column) and the independent variables (wave height, length and period, and spilled oil density, viscosity and mass). Finally, a mathematical-statistical relationship is proposed to predict dispersed oil in marine waters. To verify the proposed relationship, a laboratory case available in the literature was selected; the oil mass rate penetrating the water body computed by the suggested regression showed good agreement with the experimental data. The validated mathematical-statistical expression is a useful tool for predicting oil dispersion in oil spill events in marine areas.
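
As a sketch of the regression step, the snippet below fits a multiple linear regression of dispersed oil mass on the six identified predictors by least squares; the scenario data and the resulting coefficients are placeholders, not the proposed relationship itself.

```python
# Illustrative least-squares fit of dispersed oil mass on six predictors
# (wave height, length, period; oil density, viscosity, spilled mass) - synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 200                                    # stands in for the ~200 simulated scenarios
X = rng.uniform([0.1, 5, 2, 850, 5, 10],   # H (m), L (m), T (s), rho, mu, spilled mass
                [2.0, 60, 12, 990, 400, 500], size=(n, 6))
y = 0.4 * X[:, 0] + 0.002 * X[:, 1] + 0.0005 * X[:, 5] + rng.normal(0, 0.05, n)  # toy response

A = np.column_stack([np.ones(n), X])       # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 4))
```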

Keywords: dispersion, marine environment, mathematical-statistical relationship, oil spill

Procedia PDF Downloads 208
4172 Evaluating Hourly Sulphur Dioxide and Ground Ozone Simulated with the Air Quality Model in Lima, Peru

Authors: Odón R. Sánchez-Ccoyllo, Elizabeth Ayma-Choque, Alan Llacza

Abstract:

Sulphur dioxide (SO₂) and surface ozone (O₃) concentrations are associated with diseases. The objective of this research is to evaluate the effectiveness of the WRF-Chem air quality model with a horizontal resolution of 5 km x 5 km. For this purpose, measurements of hourly SO₂ and O₃ concentrations available at three air quality monitoring stations in Lima, Peru, were used to validate the SO₂ and O₃ concentrations simulated with the WRF-Chem model for February 2018. For the quantitative evaluation of the simulations of these gases, statistical metrics were computed: the average of the simulations, the average of the measurements, the Mean Bias (MeB), the Mean Error (MeE), and the Root Mean Square Error (RMSE). These statistical metrics indicated that the simulated SO₂ and O₃ values over-predicted the SO₂ and O₃ measurements. For the SO₂ concentration, the MeB values varied from 0.58 to 26.35 µg/m³, the MeE values from 8.75 to 26.5 µg/m³, and the RMSE values from 13.3 to 31.79 µg/m³; for the O₃ concentrations, the MeB values varied from 37.52 to 56.29 µg/m³, the MeE values from 37.54 to 56.70 µg/m³, and the RMSE values from 43.05 to 69.56 µg/m³.
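
A minimal sketch of the evaluation metrics used here (Mean Bias, Mean Error and RMSE), applied to hypothetical hourly concentrations standing in for the WRF-Chem output and the station measurements.

```python
# Illustrative computation of MeB, MeE and RMSE on hypothetical hourly values.
import numpy as np

def mean_bias(sim, obs):
    return np.mean(sim - obs)

def mean_error(sim, obs):
    return np.mean(np.abs(sim - obs))

def rmse(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

obs = np.array([40.0, 55.0, 62.0, 48.0, 70.0])    # µg/m³, hypothetical measurements
sim = np.array([78.0, 95.0, 110.0, 90.0, 120.0])  # µg/m³, hypothetical simulations

print(f"MeB  = {mean_bias(sim, obs):.1f} µg/m³")
print(f"MeE  = {mean_error(sim, obs):.1f} µg/m³")
print(f"RMSE = {rmse(sim, obs):.1f} µg/m³")
```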

Keywords: ground-ozone, lima, sulphur dioxide, WRF-chem

Procedia PDF Downloads 105
4171 Setting Uncertainty Conditions Using Singular Values for Repetitive Control in State Feedback

Authors: Muhammad A. Alsubaie, Mubarak K. H. Alhajri, Tarek S. Altowaim

Abstract:

A repetitive controller designed to accommodate periodic disturbances via state feedback is discussed. Periodic disturbances can be represented by a time delay model in a positive feedback loop acting on the system output. A direct use of the small gain theorem solves the periodic disturbance problem by 1) isolating the delay model, 2) finding the overall system representation around the delay model, and 3) designing a feedback controller that assures overall system stability and tracking error convergence. This paper addresses uncertainty conditions for the repetitive controller designed in state feedback, in either past error feedforward or current error feedback form, using singular values. The uncertainty investigation is based on the overall system found and the stability condition associated with it, depending on the scheme used, to set an upper/lower limit on the weighting parameter. This defines a region that should not be exceeded when selecting the weighting parameter, which in turn assures performance improvement against system uncertainty. The repetitive control problem can be described in lifted form, which allows singular values to be used in setting the range for the weighting parameter selection. The simulation results show tracking error convergence under dynamic system perturbation when the chosen weighting parameter lies within the obtained range. Simulation results also show the advantage of using the weighting parameter compared to the case where it is omitted.
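
The following is a simplified, illustrative sketch of how a singular-value (small gain) bound can cap the weighting parameter; the lifted system matrix G is an arbitrary placeholder and the condition w·σmax(G) < 1 is a generic small-gain form, not the paper's exact derivation.

```python
# Generic small-gain style bound: if w * sigma_max(G) < 1 is required, then
# w < 1 / sigma_max(G). G below is a hypothetical lifted system matrix.
import numpy as np

G = np.array([[0.8, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.2, 0.6]])          # placeholder lifted system matrix

sigma_max = np.linalg.svd(G, compute_uv=False)[0]
w_upper = 1.0 / sigma_max
print(f"sigma_max = {sigma_max:.3f} -> choose weighting parameter w < {w_upper:.3f}")
```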

Keywords: model mismatch, repetitive control, singular values, state feedback

Procedia PDF Downloads 124
4170 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference

Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov

Abstract:

Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for verifying the power performance of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, non-parametric statistical inference is applied to design a simple, inexpensive method for verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA system (Supervisory Control and Data Acquisition) from sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia, covering the measurement period from January 1, 2016, to December 31, 2016. Finally, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
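
One possible non-parametric check of this kind is sketched below with SciPy's Wilcoxon signed-rank test, comparing monthly SCADA output with the output expected from the power curve; the monthly figures and the significance level are hypothetical, not the Macedonian wind-farm data.

```python
# Illustrative non-parametric comparison of measured vs expected monthly output (MWh).
import numpy as np
from scipy import stats

measured = np.array([410, 385, 520, 615, 580, 430, 300, 280, 350, 460, 510, 540])
expected = np.array([420, 400, 500, 600, 590, 450, 320, 300, 360, 450, 500, 550])

stat, p_value = stats.wilcoxon(measured, expected)   # paired signed-rank test
alpha = 0.05
print(f"W = {stat:.1f}, p = {p_value:.3f}")
print("Performance differs significantly from expectation" if p_value < alpha
      else "No significant deviation from the expected power performance")
```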

Keywords: canonical correlation analysis, power curve, power performance, wind energy

Procedia PDF Downloads 306
4169 The Coexistence of Creativity and Information in Convergence Journalism: Pakistan's Evolving Media Landscape

Authors: Misha Mirza

Abstract:

In recent years, the definition of journalism in Pakistan has changed, and so has the mindset of people and their approach towards a news story. For the audience, news has become more interesting than a drama or a film. This research thus provides an insight into Pakistan's evolving media landscape. It not only brings forth the outcomes of cross-platform cooperation between print and broadcast journalism but also gives an insight into the interactive data visualization techniques being used. Storytelling in Pakistani journalism has evolved from merely depicting the truth to tweaking, fabricating and producing docu-dramas. The paper looks into how news is translated into a visual. Pakistan possesses a diverse cultural heritage, and by engaging audiences through media, this history translates into today's storytelling platforms. The paper explains how journalists are thriving in a converging media environment and provides an analysis of the narratives in television talk shows today. 'Jack of all, master of none' is being challenged by journalists today: one has to be a quality information gatherer and an effective storyteller at the same time. Are journalists really looking more into what sells rather than what matters? Express Tribune is a very popular news platform among the youth. Not only is its newspaper more attractive than those of its competitors, but its style of narrative and interactive web stories also lead to well-rounded news. Interviews are used as the basic methodology to gain insight into how data visualization is accomplished. The quest to find the difference between the visualization of information and the visualization of knowledge led the author to delve into the work of David McCandless in his book 'Knowledge is Beautiful'. Journalism in Pakistan has evolved from information to a combination of knowledge, infotainment and comedy. What is criticized most by society often becomes the breaking news. Circulation in today's world is carried out in cultural and social networks. In recent times, many people have gained overnight popularity by releasing songs with substandard lyrics or senseless videos, perhaps because creativity has taken over information. This paper thus discusses the various platforms of convergence journalism from Pakistan's perspective. The study concludes by showing how Pakistani truck art pop culture coexists with all the platforms of convergence journalism. The changing media landscape thus challenges the basic rules of journalism. The slapstick humor and 'jhatka' in Pakistani talk shows have evolved from Pakistani truck art poetry. Mobile journalism has overtaken the other mediums of journalism; however, Pakistani culture coexists with the converging landscape.

Keywords: convergence journalism in Pakistan, data visualization, interactive narrative in Pakistani news, mobile journalism, Pakistan's truck art culture

Procedia PDF Downloads 254
4168 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrically shaped objects in an image. In this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was implemented to meet the major objectives of this paper, evaluated using simulated data, for which it yields good results, and then applied to real data.
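
A hedged sketch of such a statistical image-analysis pipeline is shown below (median filtering, thresholding and connected-component labelling to recover the number, size and location of circles); this is an illustration in the spirit of the keywords, not the authors' exact algorithm.

```python
# Illustrative circle detection on a synthetic image: filter, threshold, label.
import numpy as np
from scipy import ndimage

# synthetic image with two discs plus noise
yy, xx = np.mgrid[0:200, 0:200]
img = ((xx - 60) ** 2 + (yy - 70) ** 2 < 20 ** 2).astype(float)
img += ((xx - 140) ** 2 + (yy - 130) ** 2 < 30 ** 2)
img += np.random.default_rng(0).normal(0, 0.2, img.shape)

smoothed = ndimage.median_filter(img, size=5)          # suppress impulsive noise
binary = smoothed > 0.5                                # global threshold
labels, n_objects = ndimage.label(binary)              # connected components

centroids = ndimage.center_of_mass(binary, labels, range(1, n_objects + 1))
areas = ndimage.sum(binary, labels, range(1, n_objects + 1))
for c, a in zip(centroids, areas):
    print(f"centre (row, col) ≈ {tuple(round(v, 1) for v in c)}, "
          f"radius ≈ {np.sqrt(a / np.pi):.1f} px")
print(f"number of detected circles: {n_objects}")
```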

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 400
4167 Statistical Design of Synthetic VP X-bar Control Chart Using Markov Chain Approach

Authors: Ali Akbar Heydari

Abstract:

Control charts are an important tool of statistical quality control. These charts are used to detect and eliminate unwanted special causes of variation that occur during a period of time. The design and operation of control charts require the determination of three design parameters: the sample size (n), the sampling interval (h), and the width coefficient of the control limits (k). The variable parameters (VP) x-bar control chart is an x-bar chart in which all the design parameters vary between two values. These values are a function of the most recent process information; in fact, in the VP x-bar chart, the position of each sample point on the chart establishes the size of the next sample and the time of its sampling. The synthetic x-bar control chart, which integrates the x-bar chart and the conforming run length (CRL) chart, provides a significant improvement in detection power over the basic x-bar chart for all levels of mean shift. In this paper, we introduce the synthetic VP x-bar control chart for monitoring changes in the process mean. To determine the design parameters, we used a statistical design based on the minimum out-of-control average run length (ARL) criterion. The optimal chart parameters of the proposed chart are obtained using the Markov chain approach. A numerical example is also given to show the performance of the proposed chart and to compare it with other control charts. The results show that the proposed synthetic VP x-bar control chart performs better than the synthetic x-bar control chart for all shift parameter values, and better than the VP x-bar control chart for moderate or large shift parameter values.
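
As a generic illustration of the Markov chain approach to run-length properties, the sketch below computes the ARL as the expected time to absorption from a transient-state transition matrix; the matrix is an arbitrary example, not the synthetic VP x-bar chart's actual transition structure.

```python
# Generic Markov-chain ARL: ARL = first entry of (I - Q)^-1 * 1,
# where Q contains transitions among the chart's transient (non-signalling) states.
import numpy as np

Q = np.array([[0.90, 0.05, 0.03],
              [0.10, 0.80, 0.05],
              [0.05, 0.10, 0.70]])   # hypothetical transient-state transition matrix

n = Q.shape[0]
arl_vector = np.linalg.solve(np.eye(n) - Q, np.ones(n))
print(f"ARL starting from state 1: {arl_vector[0]:.2f}")
```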

Keywords: control chart, markov chain approach, statistical design, synthetic, variable parameter

Procedia PDF Downloads 132
4166 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curves of the synthetic soils are constructed by the vertical flexibility coefficient method, which is especially convenient for soils whose stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves usually does not coincide with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to several real soils and the final profile is compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion proves to be a robust procedure that provides good solutions for complex soil profiles even with scarce prior information.
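
A compact PSO sketch for this kind of inversion is given below; the forward model is replaced by a toy function standing in for the vertical flexibility coefficient computation, and all velocities and PSO settings are illustrative assumptions.

```python
# Illustrative PSO inversion of a dispersion curve with a placeholder forward model.
import numpy as np

rng = np.random.default_rng(2)
target = np.array([180.0, 260.0, 340.0])   # "true" layer shear velocities (m/s)

def theoretical_dispersion(vs):            # toy stand-in for the real forward model
    return np.array([0.9 * vs[0], 0.7 * vs[1] + 0.2 * vs[0], 0.6 * vs[2] + 0.3 * vs[1]])

observed = theoretical_dispersion(target)

def misfit(vs):
    return np.sum((theoretical_dispersion(vs) - observed) ** 2)

n_particles, n_dim, w, c1, c2 = 30, 3, 0.7, 1.5, 1.5
pos = rng.uniform(100, 500, (n_particles, n_dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, n_dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 100, 500)
    vals = np.array([misfit(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[np.argmin(pbest_val)]

print("inverted Vs profile (m/s):", np.round(gbest, 1))
```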

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 155
4165 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance; statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Therefore, based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
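
A hedged Monte Carlo illustration of the failure probability with the friction angle as the random variable is sketched below; it uses an infinite-slope, cohesionless Mohr-Coulomb model (FS = tanφ/tanβ) and a lognormal stand-in for the skewed (Dagum) fit, with hypothetical parameters rather than the Brasilia data.

```python
# Illustrative Monte Carlo estimate of P(FS < 1) under two friction-angle models.
import numpy as np

rng = np.random.default_rng(3)
beta = np.radians(28.0)            # slope inclination (hypothetical)
n = 200_000

phi_normal = np.radians(rng.normal(30.0, 3.0, n))                  # Normal model
phi_skew = np.radians(np.exp(rng.normal(np.log(30.0), 0.10, n)))   # skew stand-in

for name, phi in (("Normal", phi_normal), ("skew (lognormal)", phi_skew)):
    fs = np.tan(phi) / np.tan(beta)          # FS = tan(phi) / tan(beta)
    print(f"{name:17s}: P(FS < 1) = {np.mean(fs < 1.0):.4f}")
```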

Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables

Procedia PDF Downloads 307
4164 Series-Parallel Systems Reliability Optimization Using Genetic Algorithm and Statistical Analysis

Authors: Essa Abrahim Abdulgader Saleem, Thien-My Dao

Abstract:

The main objective of this paper is to optimize series-parallel system reliability using a Genetic Algorithm (GA) and statistical analysis, considering system reliability constraints involving the redundancy levels of the selected components, total cost, and total weight. To perform this work, firstly, the mathematical model that maximizes system reliability subject to maximum system cost and maximum system weight constraints is presented; secondly, a statistical analysis is used to optimize the GA parameters; and thirdly, the GA is used to optimize series-parallel system reliability. The objective is to determine the strategy for choosing the redundancy level of each subsystem that maximizes the overall system reliability subject to the total cost and total weight constraints. Finally, the reliability optimization results for the series-parallel system case study are shown, and comparisons with previous results are presented to demonstrate the performance of our GA.
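
The sketch below illustrates the series-parallel objective that such a GA maximizes, R = Π_i [1 − (1 − r_i)^{n_i}], with cost and weight limits handled by a simple penalty; the component data, limits and penalty weight are hypothetical placeholders, not the paper's case study.

```python
# Illustrative penalised fitness for a series-parallel redundancy allocation problem.
import numpy as np

r = np.array([0.80, 0.85, 0.90])      # component reliabilities per subsystem
cost = np.array([4.0, 5.0, 6.0])      # cost per component
weight = np.array([3.0, 4.0, 2.0])    # weight per component
C_MAX, W_MAX = 40.0, 30.0

def fitness(n_redundant: np.ndarray) -> float:
    """Penalised system reliability for a vector of redundancy levels."""
    reliability = np.prod(1.0 - (1.0 - r) ** n_redundant)
    total_cost = np.dot(cost, n_redundant)
    total_weight = np.dot(weight, n_redundant)
    penalty = max(0.0, total_cost - C_MAX) + max(0.0, total_weight - W_MAX)
    return reliability - 0.1 * penalty

# a GA would evolve integer vectors such as this candidate:
print(f"fitness([3, 2, 2]) = {fitness(np.array([3, 2, 2])):.4f}")
```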

Keywords: reliability, optimization, meta-heuristic, genetic algorithm, redundancy

Procedia PDF Downloads 312
4163 A Very Efficient Pseudo-Random Number Generator Based On Chaotic Maps and S-Box Tables

Authors: M. Hamdi, R. Rhouma, S. Belghith

Abstract:

Random number generation is mainly used to create secret keys or random sequences and can be carried out by various techniques. In this paper, we present a very simple and efficient pseudo-random number generator (PRNG) based on chaotic maps and S-box tables. This technique adopts two main operations: one to generate chaotic values using two logistic maps, and a second to transform them into binary words using random S-box tables. The simulation analysis indicates that our PRNG possesses excellent statistical and cryptographic properties.
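
A hedged sketch of the two-stage idea (two logistic maps feeding a random S-box) is shown below; the map parameters, seeds and S-box are illustrative, not the paper's keys or exact construction.

```python
# Illustrative chaotic PRNG: two logistic maps combined, then substituted via an S-box.
import numpy as np

rng = np.random.default_rng(4)
sbox = rng.permutation(256)            # a random 8-bit substitution table

def chaotic_prng(x1=0.123, x2=0.456, r1=3.99, r2=3.97, n_bytes=16):
    out = []
    for _ in range(n_bytes):
        x1 = r1 * x1 * (1.0 - x1)      # logistic map 1
        x2 = r2 * x2 * (1.0 - x2)      # logistic map 2
        idx = int((x1 + x2) / 2.0 * 255)    # combine and quantise to 8 bits
        out.append(int(sbox[idx]))     # substitute through the S-box
    return bytes(out)

print(chaotic_prng().hex())
```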

Keywords: random numbers, chaotic map, S-box, cryptography, statistical tests

Procedia PDF Downloads 332
4162 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to be able to predict what conditions or decisions might occur in different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance in the application to real data, with the use of convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trends functions, bi-parameters weibull density function

Procedia PDF Downloads 266
4161 Assessing the Accessibility to Primary Percutaneous Coronary Intervention

Authors: Tzu-Jung Tseng, Pei-Hsuen Han, Tsung-Hsueh Lu

Abstract:

Background: Ensuring that patients with ST-elevation myocardial infarction (STEMI) can access hospitals that perform percutaneous coronary intervention (PCI) in time is an important concern of healthcare managers. One commonly used method to assess the coverage of population access to a PCI hospital is the GIS-estimated linear distance (crow-fly distance) between the district centroid and the nearest PCI hospital. If this distance is within a given threshold (such as 20 km), the entire population of that district is considered to have appropriate access to PCI. The premise of using the district centroid to estimate the coverage of the population residing in that district is that the people living in the district are evenly distributed. In reality, the population density is not evenly distributed within an administrative district, especially in rural districts. Fortunately, the Taiwan government recently released the basic statistical area (on average 450 people per area), which gives us an opportunity to estimate the coverage of population access to PCI services more accurately. Objectives: We aimed in this study to compare the population covered by a given PCI hospital according to the traditional administrative district versus the basic statistical area. We further examined whether the differences between the two geographic units would be larger in rural areas than in urban areas. Method: We selected two hospitals in Tainan City for this analysis. Hospital A is in an urban area; hospital B is in a rural area. The population in each traditional administrative district and basic statistical area was obtained from the Taiwan National Geographic Information System, Ministry of Internal Affairs. Results: The estimated population living within 20 km of hospitals A and B was 1,515,846 and 323,472, respectively, according to the traditional administrative district, and 1,506,325 and 428,556 according to the basic statistical area. Conclusion: In the urban area, the estimated population with access to PCI services was similar between the two geographic units. However, in the rural area, the access population estimated from district centroids deviated considerably (323,472 versus 428,556 persons).
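
The coverage calculation can be sketched as follows: population units (district centroids or basic statistical areas) are counted as covered when their great-circle distance to the nearest PCI hospital is within 20 km; the coordinates and populations below are invented, not the Tainan data.

```python
# Illustrative 20-km coverage count using the haversine great-circle distance.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

hospital = (22.99, 120.21)                          # hypothetical PCI hospital
units = [                                           # (lat, lon, population), hypothetical
    (23.05, 120.18, 4100), (23.20, 120.40, 3800), (22.95, 120.25, 4600),
]

covered = sum(pop for lat, lon, pop in units
              if haversine_km(lat, lon, *hospital) <= 20.0)
print(f"population within 20 km: {covered}")
```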

Keywords: accessibility, basic statistical area, modifiable areal unit problem (MAUP), percutaneous coronary intervention (PCI)

Procedia PDF Downloads 430
4160 Digital Value Co-Creation: The Case of Worthy a Virtual Collaborative Museum across Europe

Authors: Camilla Marini, Deborah Agostino

Abstract:

Cultural institutions provide more than service-based offers; indeed, they are experience-based contexts. A cultural experience is a special event that encompasses a wide range of values which, for visitors, are primarily cultural rather than economic and financial. Cultural institutions have always been characterized by inclusivity and participatory practices, but the advent of digital technologies has heightened their interest in collaborative practices and in the relationship with their audience. Indeed, digital technologies have profoundly affected the cultural experience as it was traditionally conceived. Museums in particular, as traditional and authoritative cultural institutions, have been strongly challenged by digital technologies. They have shifted from a collection-oriented toward a visitor-centered approach, and digital technologies have generated a highly interactive ecosystem in which visitors play an active role, shaping their own cultural experience. Most studies that investigate value co-creation in museums adopt a single perspective, either that of the museum or that of the users, while the analysis of the convergence/divergence of these perspectives still needs emphasis. Additionally, many contributions focus on digital value co-creation as an outcome rather than as a process. This study aims to provide a joint perspective on digital value co-creation that includes both the museum and its visitors. It also deepens the understanding of the contribution of digital technologies to the value co-creation process, addressing the following research questions: (i) what are the convergence/divergence drivers of digital value co-creation, and (ii) how can digital technologies be means of value co-creation? The study adopts an action research methodology based on the case of WORTHY, an educational project which involves cultural institutions and schools all around Europe, creating a virtual collaborative museum. It represents a valuable case for the aim of the study since it has digital technologies at its core, and interaction through digital technologies is fundamental all along the experience. Action research has been identified as the most appropriate methodology for researchers to have direct contact with the field. Data have been collected through primary and secondary sources. Cultural mediators such as museums, teachers and students' families were interviewed, while a focus group was designed to interact with students, investigating all aspects of the cultural experience. Secondary sources encompassed project reports and website contents in order to deepen the perspective of the cultural institutions. Preliminary findings highlight the dimensions of digital value co-creation in cultural institutions from an integrated museum-visitor perspective and the contribution of digital technologies to the value co-creation process. The study outlines a twofold contribution encompassing both the academic and the practitioner level. Indeed, it contributes to filling the gap in the cultural management literature about the convergence/divergence of service provider and user perspectives, and it also provides cultural professionals with guidelines on how to evaluate the digital value co-creation process.

Keywords: co-creation, digital technologies, museum, value

Procedia PDF Downloads 120
4159 An Empirical Study of the International Financial Reporting Standards Education in the United States

Authors: Angela McCaskill

Abstract:

Accounting graduates in most United States universities are not being adequately taught International Financial Reporting Standards (IFRS). As such, they are not prepared with the knowledge and skills necessary to remain competitive in international business. One of the reasons behind this ill preparation is the lack of specific international accounting instruction available in the U.S. This paper explores the importance of IFRS education through the lenses of graduate accounting majors. Specifically, it explores graduate accounting majors' preparedness in IFRS based on their recent completion of a Master of Accountancy degree in which IFRS had been integrated. The data for the study were collected via face-to-face and telephone/Skype interviews and questionnaires. After the interview, the participants also agreed to answer two supplementary questions in which they were to determine the amounts that should be reported on the balance sheet under (1) IFRS and (2) U.S. GAAP; these questions were intended to test their knowledge of both sets of standards. The sample consisted of online and brick-and-mortar university students enrolled in their graduate program from the spring semester of 2016 to the summer semester of 2016. This study shows that a separate course should be devoted to teaching IFRS and convergence-related issues. There is a direct correlation between the knowledge level of students who took an IFRS course and successful completion of the supplementary questions, compared to those who only had IFRS instruction mixed into their U.S. GAAP-based instruction. Students who took an international accounting course were better prepared for the IFRS conversion than those who did not have a separate course. Academically, universities need to take a deeper look into the needs of their students and do better at incorporating international standards into their curricula.

Keywords: accounting education, global accounting standards, international accounting, IFRS and U.S. GAAP convergence, IFRS, U.S. GAAP

Procedia PDF Downloads 226
4158 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting authorized users and prohibiting the actions of non-authorized users and intruders on the DB and the objects inside it. Successfully running organizations demand the confidentiality of their DBs: they do not allow unauthorized access to their data/information, and they also demand assurance that their data are protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls to achieve DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rössler, Lorenz, etc.) or discrete (logistic, Hénon, etc.) algorithms. The important characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB security systems based on chaotic algorithms are described. The Pseudo Random Number Generators (PRNGs) derived from the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. Then, these algorithms are used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Hénon maps, where each chaotic algorithm runs side by side, starting from random independent initial conditions and parameters (encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
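
As an example of the statistical checks applied to such PRNG output, the sketch below implements the NIST SP 800-22 frequency (monobit) test, with the usual pass threshold of p ≥ 0.01; the bit sequence is a short hypothetical sample, not the paper's keystream.

```python
# Illustrative NIST SP 800-22 frequency (monobit) test on a hypothetical bit string.
import math

def monobit_test(bits: str) -> float:
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)     # +/-1 partial sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))         # p-value

sample = "1100100101101101010011101001011010110010" * 25   # 1000 hypothetical bits
p = monobit_test(sample)
print(f"p-value = {p:.4f} -> {'pass' if p >= 0.01 else 'fail'}")
```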

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 242
4157 Numerical Study of Elastic Performances of Sandwich Beam with Carbon-Fibre Reinforced Skins

Authors: Soukaina Ounss, Hamid Mounir, Abdellatif El Marjani

Abstract:

Sandwich materials with composite reinforced skins are widely required in advanced construction applications in order to ensure resistant structures. Their light weight, high flexural stiffness and good thermal insulation make them a suitable solution for obtaining efficient structures with high rigidity and optimal energy safety. In this paper, the mechanical behavior of a sandwich beam with composite skins reinforced by unidirectional carbon fibers is investigated numerically by analyzing the impact of the reinforcement specifications on the longitudinal elastic modulus, in order to select a sandwich configuration that combines good rigidity with accurate convergence to the analytical approach proposed to verify the numerical simulations. The study therefore starts by testing the flexural performance of skins with various fiber orientations and volume fractions to determine those to be used in the sandwich beam. On this basis, the combination of a reinforcement inclination of 30° with a volume fraction of 60% is selected together with the one with a fiber orientation of 60° and a volume fraction of 40%; the latter guarantees the chosen skins an important rigidity with an optimal fiber concentration and a marked improvement in convergence to the analytical results in the sandwich model, owing to the crucial role of the core as a transverse shear absorber. Thus, a resistant sandwich beam is built from face sheets consisting of two layers of the previous skins with fibers oriented at 60° and an epoxy core; this beam has a longitudinal elastic modulus of 54 GPa, which matches the analytical value to within a negligible error of 2%.
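
As an illustration of how fiber orientation drives the in-plane modulus of a unidirectional ply, the sketch below applies the classical off-axis transformation 1/Ex = cos⁴θ/E₁ + sin⁴θ/E₂ + (1/G₁₂ − 2ν₁₂/E₁)sin²θcos²θ with generic carbon/epoxy properties; these are assumed values, not the paper's material data.

```python
# Illustrative off-axis in-plane modulus of a UD carbon/epoxy ply (hypothetical properties).
import math

E1, E2, G12, NU12 = 135e9, 10e9, 5e9, 0.30   # Pa, generic UD carbon/epoxy values

def off_axis_modulus(theta_deg: float) -> float:
    c, s = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    inv_ex = c**4 / E1 + s**4 / E2 + (1.0 / G12 - 2.0 * NU12 / E1) * s**2 * c**2
    return 1.0 / inv_ex

for theta in (0, 30, 60, 90):
    print(f"theta = {theta:2d} deg : Ex = {off_axis_modulus(theta) / 1e9:6.1f} GPa")
```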

Keywords: fibers orientation, fibers volume ratio, longitudinal elastic modulus, sandwich beam

Procedia PDF Downloads 126
4156 Statistical Analysis and Impact Forecasting of Connected and Autonomous Vehicles on the Environment: Case Study in the State of Maryland

Authors: Alireza Ansariyar, Safieh Laaly

Abstract:

Over the last decades, the vehicle industry has shown increased interest in integrating autonomous, connected, and electric technologies in vehicle design, with the primary hope of improving mobility and road safety while reducing transportation's environmental impact. Using the State of Maryland in the United States as a pilot study, this research investigates CAVs' fuel consumption and air pollutants (CO, PM, and NOx) and utilizes linear regression models to predict CAVs' environmental effects. The Maryland transportation network was simulated in the VISUM software, and data on a set of variables were collected through a comprehensive survey. The amounts of pollutants and fuel consumption were obtained for the time interval 2010 to 2021 from the macro-simulation. Eventually, four linear regression models were proposed to predict the future amounts of CO, NOx, and PM pollutants and fuel consumption. The results highlighted that CAVs' pollutants and fuel consumption have a significant correlation with the income, age, and race of CAV customers. Furthermore, the reliability of the four statistical models was compared with the reliability of the macro-simulation model outputs for the year 2030; the error for the three pollutants and fuel consumption obtained by the statistical models in SPSS was less than 9%. This study is expected to assist researchers and policymakers with planning decisions to reduce CAV environmental impacts in Maryland.

Keywords: connected and autonomous vehicles, statistical model, environmental effects, pollutants and fuel consumption, VISUM, linear regression models

Procedia PDF Downloads 417
4155 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference in models of data series is constructed by a change-point algorithm, which separates the observed time series into independent segments and studies changes in the regime of the data along with the related statistical characteristics. Variations in the statistical characteristics of time series data often represent distinct phenomena in a dynamical system, such as a change in brain state reflected in EEG signal measurements or a change in an important data regime in many dynamical systems. In this paper, a prediction algorithm for locating change points in time series data is simulated. It is verified that the assumed distribution of the data has an important effect on producing simpler and smoother fluctuations of the hazard rate parameter and on better identification of change point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases for several dynamical systems.
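
A compact, hedged sketch of Bayesian online change-point detection with a constant hazard rate and a Gaussian observation model (known variance) is given below; the data and all parameters are synthetic, and the implementation is illustrative rather than the authors' algorithm.

```python
# Illustrative Bayesian online change-point detection (constant hazard, Gaussian model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(3.0, 1.0, 100)])

hazard = 1.0 / 150.0           # constant hazard rate (assumed mean run length of 150)
sigma2, mu0, tau2 = 1.0, 0.0, 4.0

R = np.zeros(len(data) + 1); R[0] = 1.0          # run-length posterior
mu, tau = np.array([mu0]), np.array([tau2])      # per-run-length posterior on the mean
map_runs = []

for t, x in enumerate(data, start=1):
    pred = stats.norm.pdf(x, mu, np.sqrt(tau + sigma2))   # predictive per run length
    growth = R[:t] * pred * (1.0 - hazard)                # run length grows by one
    cp = np.sum(R[:t] * pred * hazard)                    # mass flowing to run length 0
    R = np.zeros(len(data) + 1)
    R[0], R[1:t + 1] = cp, growth
    R /= R.sum()
    tau_new = 1.0 / (1.0 / tau + 1.0 / sigma2)            # conjugate Gaussian update
    mu = np.concatenate([[mu0], tau_new * (mu / tau + x / sigma2)])
    tau = np.concatenate([[tau2], tau_new])
    map_runs.append(int(np.argmax(R)))

drops = [t for t in range(1, len(map_runs)) if map_runs[t] < map_runs[t - 1] - 20]
print("run-length collapse (possible change points) near indices:", drops[:3])
```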

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 398
4154 The Use of Multivariate Statistical and GIS for Characterization Groundwater Quality in Laghouat Region, Algeria

Authors: Rouighi Mustapha, Bouzid Laghaa Souad, Rouighi Tahar

Abstract:

Due to rain shortage and the increase in population in recent years, well excavation and groundwater use for different purposes have increased without any planning. This is a great challenge for our country. Moreover, this scarcity of water resources in the region is unfortunately combined with a rapid deterioration of freshwater quality due to salinity and contamination processes. Therefore, it is necessary to conduct studies on groundwater quality in Algeria. This work consists in identifying the factors that influence the water quality parameters in the Laghouat region by using the statistical techniques of Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) together with a geographic information system (GIS), in an attempt to discriminate the sources of the variation in water quality. The results of the PCA technique indicate that the variables responsible for the water quality composition are mainly related to soluble salts, natural processes, and the nature of the rock, which significantly modifies the water chemistry. Inferred from the positive correlation between K+ and NO3-, NO3- is believed to be human-induced rather than of natural origin. In this study, multivariate statistical analysis and GIS provide the hydrogeologist with supplementary tools for the characterization and evaluation of aquifers.
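
A hedged sketch of the PCA step on standardized hydrochemical variables is shown below; the wells-by-parameters matrix is synthetic, not the Laghouat data set.

```python
# Illustrative PCA on a synthetic (wells x parameters) hydrochemical matrix.
import numpy as np

rng = np.random.default_rng(6)
params = ["Ca", "Mg", "Na", "K", "Cl", "SO4", "NO3", "EC"]
X = rng.lognormal(mean=3.0, sigma=0.4, size=(40, len(params)))   # 40 hypothetical wells

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardise each variable
cov = np.cov(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

explained = eigval / eigval.sum()
print("variance explained by PC1, PC2:", np.round(explained[:2], 3))
print("PC1 loadings:", dict(zip(params, np.round(eigvec[:, 0], 2))))
```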

Keywords: cluster analysis, GIS, groundwater, Laghouat, quality

Procedia PDF Downloads 292
4153 Lexical Collocations in Medical Articles of Non-Native vs Native English-Speaking Researchers

Authors: Waleed Mandour

Abstract:

This study presents a multidimensional scrutiny of Benson et al.'s seven-category taxonomy of lexical collocations as used by Egyptian medical authors and their native English-speaking peers. It investigates 212 medical papers, all published over a span of 6 years (from 2013 to 2018). The comparison is made against the medical research articles submitted by native speakers of English (25,238 articles in total, with over 103 million words) as derived from the Directory of Open Access Journals (a 2.7 billion-word corpus). The corpus compiled from the non-native speakers was annotated and marked up manually by the researcher according to the standards of Weisser. For the statistical comparisons, conventional frequency-based analysis was deployed alongside the relevant criteria, such as association measures (AMs), with logDice used as per the recommendation of Kilgarriff et al. when comparing large corpora. Despite the terminological convergence in the subject corpora, the comparison results confirm the previous literature: the non-native speakers' compositions reveal limited ranges of lexical collocations in terms of their distribution. Moreover, there is a ubiquitous tendency to overuse the multi-word units that are highly frequent among native speakers across all the lexical categories investigated. Furthermore, Egyptian authors, in contrast to their English-speaking peers, tend to embrace more collocations denoting quantitative rather than qualitative analyses in their papers. This empirical work contributes to English for Academic Purposes (EAP) and English as a Lingua Franca in Academic settings (ELFA). In addition, it has pedagogical implications that would promote a better quality of medical research papers published in Egyptian universities.
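
For reference, the logDice association measure used in such comparisons is logDice = 14 + log₂(2·f(x,y)/(f(x)+f(y))), which is insensitive to corpus size and therefore suited to corpora of very different sizes; the sketch below computes it for invented frequencies.

```python
# Illustrative logDice computation for a hypothetical collocation.
import math

def log_dice(f_xy: int, f_x: int, f_y: int) -> float:
    """logDice = 14 + log2(2 * f(x,y) / (f(x) + f(y)))."""
    return 14.0 + math.log2(2.0 * f_xy / (f_x + f_y))

# e.g. the collocation "statistically significant" with invented corpus frequencies
print(f"logDice = {log_dice(f_xy=1200, f_x=5400, f_y=9800):.2f}")
```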

Keywords: corpus linguistics, EAP, ELFA, lexical collocations, medical discourse

Procedia PDF Downloads 103
4152 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented previously behave when the data are mixed with noise and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original paper, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise. From this, we can determine the robustness of each model and its applicability in Korea.

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 702
4151 Statistical Analysis of Surface Roughness and Tool Life Using (RSM) in Face Milling

Authors: Mohieddine Benghersallah, Lakhdar Boulanouar, Salim Belhadi

Abstract:

Currently, a higher production rate with the required quality and low cost is the basic principle in the competitive manufacturing industry. This is mainly achieved by using high cutting speeds and feed rates. Elevated temperatures in the cutting zone under these conditions shorten tool life and adversely affect the dimensional accuracy and surface integrity of the component. Thus, it is necessary to find optimum cutting conditions (cutting speed, feed rate, machining environment, tool material and geometry) that can produce components in accordance with the specifications while maintaining a relatively high production rate. Response surface methodology is a collection of mathematical and statistical techniques that are useful for the modelling and analysis of problems in which a response of interest is influenced by several variables and the objective is to optimize this response. The work presented in this paper examines the effects of the cutting parameters (cutting speed, feed rate and depth of cut) on surface roughness through a mathematical model developed using data gathered from a series of milling experiments.
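
A minimal sketch of fitting a second-order response surface model for surface roughness Ra as a function of cutting speed, feed rate and depth of cut is given below; the factorial "experiment" and the resulting coefficients are synthetic placeholders, not the measured milling data.

```python
# Illustrative second-order RSM fit of Ra(V, f, d) on a synthetic 3^3 factorial design.
import numpy as np

rng = np.random.default_rng(7)
V, f, d = np.meshgrid([150, 200, 250], [0.10, 0.15, 0.20], [0.5, 0.75, 1.0])
V, f, d = V.ravel(), f.ravel(), d.ravel()
Ra = 1.2 - 0.002 * V + 4.0 * f + 0.15 * d + rng.normal(0, 0.03, V.size)  # toy response

def design(V, f, d):        # second-order model: linear, interaction and square terms
    return np.column_stack([np.ones_like(V), V, f, d, V*f, V*d, f*d, V**2, f**2, d**2])

coef, *_ = np.linalg.lstsq(design(V, f, d), Ra, rcond=None)
pred = design(np.array([220.0]), np.array([0.12]), np.array([0.6])) @ coef
print(f"predicted Ra at V=220 m/min, f=0.12 mm/tooth, d=0.6 mm: {pred[0]:.3f} µm")
```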

Keywords: statistical analysis (RSM), bearing steel, coating inserts, tool life, surface roughness, end milling

Procedia PDF Downloads 401
4150 Efficient Study of Substrate Integrated Waveguide Devices

Authors: J. Hajri, H. Hrizi, N. Sboui, H. Baudrand

Abstract:

This paper presents a study of SIW (Substrate Integrated Waveguide) circuits with a rigorous and fast original approach based on an iterative process (WCIP). The suggested theoretical study is validated by the simulation of two different examples of SIW circuits. The obtained results are in good agreement with measurements and with the HFSS software.

Keywords: convergence study, HFSS, modal decomposition, SIW circuits, WCIP method

Procedia PDF Downloads 468
4149 Design of PSS and SVC to Improve Power System Stability

Authors: Mahmoud Samkan

Abstract:

In this paper, the design and assessment of a new coordination between Power System Stabilizers (PSSs) and a Static Var Compensator (SVC) in a multimachine power system via a statistical method are proposed. The coordinated design problem of the PSSs and SVC over a wide range of loading conditions is handled as an optimization problem. Bacterial Swarming Optimization (BSO), which synergistically couples Bacterial Foraging (BF) with Particle Swarm Optimization (PSO), is employed to seek the optimal controller parameters. By minimizing the proposed objective function, in which the speed deviations between generators are involved, the stability performance of the system is enhanced. To compare the capabilities of the PSS and the SVC, both are designed independently and then in a coordinated manner. Simultaneous tuning of the BSO-based coordinated controller gives robust damping performance over a wide range of operating conditions and large disturbances, in comparison to the optimized PSS controller based on BSO (BSOPSS) and the optimized SVC controller based on BSO (BSOSVC). Moreover, a statistical t-test is executed to validate the robustness of the coordinated controller versus the uncoordinated one.

Keywords: SVC, PSSs, multimachine power system, coordinated design, bacteria swarm optimization, statistical assessment

Procedia PDF Downloads 353