Search results for: software reliability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6554


4754 Use of a Business Intelligence Software for Interactive Visualization of Data on the Swiss Elite Sports System

Authors: Corinne Zurmuehle, Andreas Christoph Weber

Abstract:

In 2019, the Swiss Federal Institute of Sport Magglingen (SFISM) conducted a mixed-methods study of the Swiss elite sports system, which yielded a large quantity of research data. In a quantitative online survey, 1151 elite sports athletes, 542 coaches, and 102 Performance Directors of national sports federations (NF) reported their perceptions of the national support measures of the Swiss elite sports system. These data provide an essential basis for the further development of the Swiss elite sports system. The results were published in a report in which they were divided into 40 Olympic summer and 14 winter sports (Olympic classification). The authors of this paper assume that, in practice, this division is too unspecific to assess where further measures are needed. The aim of this paper is to find appropriate parameters for data visualization in order to identify disparities in sports promotion, allowing an assessment of where further interventions by Swiss Olympic (the NF umbrella organization) are required. Method: First, the variable 'salary earned from sport' was defined as a measure of the impact of elite sports promotion. It was chosen because it is an important indicator of the professionalization of elite athletes and therefore reflects the national-level sports promotion measures applied by Swiss Olympic. The variable salary was then tested for its correlation with the Olympic classification by calculating the eta coefficient. To identify appropriate parameters for data visualization, the correlation between salary and four further parameters was analyzed in the same way: [a] sport; [b] prioritization (from 1 to 5) of the sports by Swiss Olympic; [c] gender; [d] employment level in sports. Results & Discussion: The analyses reveal a very small correlation between salary and Olympic classification (η² = .011, p = .005).
Gender demonstrates an even smaller correlation (η² = .006, p = .014). The parameter prioritization correlated with a small effect (η² = .017, p = .001), as did employment level (η² = .028, p < .001). The highest correlation was found for the parameter sport, with a moderate effect (η² = .075, p = .047). The analyses show that the disparities in sports promotion cannot be attributed to a single parameter but can presumably be explained by a combination of several parameters. We argue that it should be possible to combine parameters for data visualization when the analysis is provided to Swiss Olympic for further strategic decision-making. However, including multiple parameters massively multiplies the number of graphs and is therefore not suitable for practical use. We therefore suggest applying interactive dashboards for data visualization using business intelligence software. Practical & Theoretical Contribution: This contribution is the first attempt to use business intelligence software for strategic decision-making in national-level sports regarding the prioritization of national resources for sports and athletes. It allows parameters with a significant effect to be set as filters. Using filters, parameters can be combined, compared against each other, and set individually for each strategic decision.
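For readers who want to reproduce this kind of analysis, the eta-squared (η²) effect size reported above is straightforward to compute as the ratio of between-group to total sum of squares. A minimal sketch in Python; the data here are synthetic stand-ins, since the survey data are not public:

```python
import numpy as np

def eta_squared(values, groups):
    """Eta-squared effect size: between-group SS divided by total SS."""
    values = np.asarray(values, dtype=float)
    groups = np.asarray(groups)
    grand_mean = values.mean()
    ss_total = ((values - grand_mean) ** 2).sum()
    ss_between = sum(
        (groups == g).sum() * (values[groups == g].mean() - grand_mean) ** 2
        for g in np.unique(groups)
    )
    return ss_between / ss_total

# Synthetic stand-in: salary grouped by a 5-level prioritization parameter
rng = np.random.default_rng(0)
prioritization = rng.integers(1, 6, size=500)
salary = 3000.0 + 200.0 * prioritization + rng.normal(0.0, 1500.0, size=500)
eta2 = eta_squared(salary, prioritization)
```

The same helper works for any of the categorical parameters [a] to [d] by changing the grouping variable.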

Keywords: data visualization, business intelligence, Swiss elite sports system, strategic decision-making

Procedia PDF Downloads 90
4753 Kazakh Language Assessment in a New Multilingual Kazakhstan

Authors: Karlygash Adamova

Abstract:

This article focuses on the KazTest as one of the most important high-stakes tests and the key tool in Kazakh language assessment. The research also includes a brief introduction to language policy in Kazakhstan. In particular, the policy is set to change significantly, turning from bilingualism (Kazakh, Russian) to a multilingual policy (three languages: Kazakh, Russian, English). The current status of these languages is therefore described. Due to the various educational reforms in the country, the language evaluation system should also be improved and moderated. The research presents the most significant test in Kazakhstan, the KazTest, which is aimed at evaluating Kazakh language proficiency. Assessment is an ongoing process that encompasses a wide area of knowledge about the productive performance of learners. A test is widely defined as a standardized method of research, examination, diagnostics, and verification. The two most important characteristics of any test as the main element of assessment, validity and reliability, are also described in this paper, since it is highly important to take these properties into account in the preparation and design of a test that is assumed to be an indicator of knowledge.

Keywords: multilingualism, language assessment, testing, language policy

Procedia PDF Downloads 136
4752 Pharmacokinetic Modeling of Valsartan in Dog following a Single Oral Administration

Authors: In-Hwan Baek

Abstract:

Valsartan is a potent and highly selective antagonist of the angiotensin II type 1 receptor and is widely used for the treatment of hypertension. The aim of this study was to investigate the pharmacokinetic properties of valsartan in dogs following oral administration of a single dose, using quantitative modeling approaches. Forty beagle dogs were randomly divided into two groups. Group A (n=20) was administered a single oral dose of valsartan 80 mg (Diovan® 80 mg), and group B (n=20) was administered a single oral dose of valsartan 160 mg (Diovan® 160 mg) in the morning after an overnight fast. Blood samples were collected into heparinized tubes before and at 0.5, 1, 1.5, 2, 2.5, 3, 4, 6, 8, 12 and 24 h following oral administration. The plasma concentrations of valsartan were determined using LC-MS/MS. Non-compartmental pharmacokinetic analyses were performed using WinNonlin Standard Edition software, and modeling approaches were performed using maximum-likelihood estimation via the expectation maximization (MLEM) algorithm in ADAPT 5 software. After a single dose of valsartan 80 mg, the mean maximum concentration (Cmax) was 2.68 ± 1.17 μg/mL at 1.83 ± 1.27 h. The area under the plasma concentration-versus-time curve from time zero to the last measurable concentration (AUC24h) was 13.21 ± 6.88 μg·h/mL. After dosing with valsartan 160 mg, the mean Cmax was 4.13 ± 1.49 μg/mL at 1.80 ± 1.53 h, and the AUC24h was 26.02 ± 12.07 μg·h/mL. The Cmax and AUC values increased in proportion to the increment in valsartan dose, while the elimination rate constant, half-life, apparent total clearance, and apparent volume of distribution were not significantly different between the doses. The valsartan pharmacokinetics fit a one-compartment model with first-order absorption and elimination following single doses of 80 mg and 160 mg.
In addition, high inter-individual variability was identified in the absorption rate constant. In conclusion, valsartan displays dose-dependent pharmacokinetics in dogs, and subsequent quantitative modeling approaches provided detailed pharmacokinetic information on valsartan. The current findings provide useful information in dogs that will aid the future development of improved formulations or fixed-dose combinations.
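The one-compartment model with first-order absorption and elimination reported above can be written in closed form. A sketch with purely illustrative parameters (the ka, ke, and apparent volume below are assumptions, not the fitted values from the study):

```python
import numpy as np

def one_compartment_oral(t, dose, ka, ke, v_f):
    """Plasma concentration for a one-compartment model with first-order
    absorption (ka, 1/h) and elimination (ke, 1/h); v_f is the apparent
    volume of distribution V/F (L)."""
    return dose * ka / (v_f * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 481)                      # hours, 0.05 h grid
c80 = one_compartment_oral(t, dose=80.0, ka=1.2, ke=0.35, v_f=25.0)
c160 = one_compartment_oral(t, dose=160.0, ka=1.2, ke=0.35, v_f=25.0)

cmax, tmax = c80.max(), t[c80.argmax()]              # NCA-style Cmax and Tmax
auc = np.sum((c80[1:] + c80[:-1]) / 2 * np.diff(t))  # trapezoidal AUC(0-24h)
```

Because the model is linear in dose, doubling the dose exactly doubles the predicted concentrations, consistent with the dose-proportional Cmax and AUC observed in the study.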

Keywords: dose-dependent, modeling, pharmacokinetics, valsartan

Procedia PDF Downloads 297
4751 Psychometric Properties and Factor Structure of the College Readiness Questionnaire

Authors: Muna Al-Kalbani, Thuwayba Al Barwani, Otherine Neisler, Hussain Alkharusi, David Clayton, Humaira Al-Sulaimani, Mohammad Khan, Hamad Al-Yahmadi

Abstract:

This study describes the psychometric properties and factor structure of the University Readiness Survey (URS). Survey data were collected from a sample of 2652 students at Sultan Qaboos University. Exploratory factor analysis identified ten significant factors underlying the structure. Confirmatory factor analysis showed a good fit to the data, with indices for the revised model of χ²(df = 1669) = 6093.4, CFI = 0.900, GFI = 0.926, PCLOSE = 1.00, and RMSEA = 0.030, each meeting its recommended threshold. The overall Cronbach's alpha was 0.899, indicating that the instrument score was reliable. The results imply that the URS is a valid measure describing the college readiness pattern among Sultan Qaboos University students, and the Arabic version could be used by university counselors to identify students' readiness factors. Nevertheless, further validation of the URS is recommended.
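The Cronbach's alpha reliability reported above can be reproduced for any score matrix with a few lines of code. A minimal sketch with synthetic Likert-style data (the URS responses themselves are not available):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

# Synthetic Likert-style responses: 20 items driven by one latent trait
rng = np.random.default_rng(1)
trait = rng.normal(size=300)
scores = np.clip(np.round(3.0 + trait[:, None] + rng.normal(0.0, 0.8, (300, 20))), 1, 5)
alpha = cronbach_alpha(scores)
```

With items driven by a common latent trait, alpha lands near 1; fully parallel items give exactly 1.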

Keywords: college readiness, confirmatory factor analysis, reliability, validity

Procedia PDF Downloads 226
4750 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context

Authors: Selin Guney, Andres Riquelme

Abstract:

The traditional bio-economic approach to fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamics are transformed into a revenue function and then compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic population growth model is combined with a forecast of the abundance and location of the stock using a generalized additive model (GAM) approach. The paper focuses on the Chilean hake population. This method allows for the incorporation of climatic variables and of interactions with other marine species, which in turn increases the reliability of the estimates and generates better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
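The logistic surplus-production backbone of such bio-economic models is simple to simulate. A sketch with illustrative parameters (the r and K below are assumptions, not estimates for Chilean hake); it uses the textbook result that the maximum sustainable yield of the logistic model is rK/4, attained at a stock of K/2:

```python
def logistic_step(b, r, k, harvest, dt=1.0):
    """One Euler step of logistic stock dynamics with a constant harvest."""
    return max(b + dt * (r * b * (1.0 - b / k) - harvest), 0.0)

r, k = 0.4, 1000.0          # illustrative intrinsic growth rate, carrying capacity
msy = r * k / 4.0           # maximum sustainable yield of the logistic model

# Harvesting exactly at MSY drives the stock toward the equilibrium B = K / 2
b = 800.0
for _ in range(2000):
    b = logistic_step(b, r, k, msy)
```

The GAM layer of the paper would replace the fixed parameters with covariate-driven forecasts; the extraction-path logic on top stays the same.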

Keywords: bio-economic, fisheries, GAM, production

Procedia PDF Downloads 252
4749 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. This places new demands on model accuracy and requires the simulation of physical processes that could previously be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing certain quantities of the avalanche flow (e.g. pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies are drawn to the fluid-dynamical laws of other materials. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Beyond these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility, so model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, their potential for improvement, and their application in practice.
To address these questions, a survey was conducted among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is paid to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, we tested the degree to which a simulation result influences decision making in a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the comparatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations results from a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, which are regarded as essential for the hazard assessment and for approving simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 326
4748 Equivalent Circuit Model for the Eddy Current Damping with Frequency-Dependence

Authors: Zhiguo Shi, Cheng Ning Loong, Jiazeng Shan, Weichao Wu

Abstract:

This study proposes an equivalent circuit model to simulate the eddy current damping force, validated with shake table tests and finite element modeling. The model is first proposed and applied to a simple eddy current damper, which is modelled in ANSYS, indicating that the proposed model can simulate the eddy current damping force under different types of excitation. Then, a non-contact and friction-free eddy current damper is designed and tested, and the proposed model reproduces the experimental observations. The excellent agreement between the simulated results and the experimental data validates the accuracy and reliability of the equivalent circuit model. Furthermore, a more complicated model is built in ANSYS to verify the feasibility of the equivalent circuit model for complex eddy current dampers, and a higher-order fractional model and a viscous model are adopted for comparison.
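One common equivalent-circuit idealization of eddy current damping (an assumption of this sketch, not necessarily the circuit topology used by the authors) treats the eddy current path as a single R-L loop driven by a motion-induced EMF. The damping coefficient is then frequency dependent: ideal viscous damping at low frequency, rolling off above the corner frequency R/L:

```python
import numpy as np

# Assumed idealization: an R-L loop with motion-induced EMF e = k_e * v
# producing a damping force f = k_f * i, so the force-to-velocity transfer is
#     c(w) = k_e * k_f / (R + 1j * w * L)
R, L = 2.0, 0.01            # illustrative loop resistance (ohm), inductance (H)
k_e = k_f = 5.0             # illustrative electromechanical coupling constants

def damping_coefficient(omega):
    """Complex damping coefficient of the equivalent circuit at omega (rad/s)."""
    return k_e * k_f / (R + 1j * omega * L)

c_static = abs(damping_coefficient(0.0))   # low-frequency viscous limit
w_c = R / L                                # corner frequency (rad/s)
c_corner = abs(damping_coefficient(w_c))   # equals c_static / sqrt(2)
```

The frequency dependence captured here is what distinguishes the equivalent circuit model from a plain viscous dashpot.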

Keywords: equivalent circuit model, eddy current damping, finite element model, shake table test

Procedia PDF Downloads 191
4747 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models

Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães

Abstract:

This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to the threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. To face this difficulty, the proposed resolution method adopts a smoothing strategy using a special class of C∞ differentiable functions. The final estimate is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes possible the application of the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
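The core idea of hyperbolic smoothing is to replace the non-differentiable ramp max(y, 0), the prototypical threshold structure, by the C∞ function φ(y, τ) = (y + √(y² + τ²))/2 and drive τ toward zero. A minimal sketch illustrating the uniform convergence (the maximum error is exactly τ/2, attained at y = 0):

```python
import numpy as np

def hyperbolic_max0(y, tau):
    """C-infinity approximation of max(y, 0); error is at most tau / 2."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

y = np.linspace(-3.0, 3.0, 601)
exact = np.maximum(y, 0.0)

# Uniform error shrinks linearly as the smoothing parameter tau is driven to 0
errs = [float(np.max(np.abs(hyperbolic_max0(y, tau) - exact)))
        for tau in (1.0, 0.1, 0.01)]
```

An HSM-type loop would solve the smoothed subproblem with a gradient-based optimizer for each τ, warm-starting each solve from the previous solution.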

Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method

Procedia PDF Downloads 149
4746 Submodeling of Mega-Shell Reinforced Concrete Solar Chimneys

Authors: Areeg Shermaddo, Abedulgader Baktheer

Abstract:

Solar updraft power plants (SUPPs) made from reinforced concrete (RC) are an innovative technology for generating solar electricity. A chimney of up to 1000 m height represents the major part of each SUPP, ensuring the updraft of the air warmed at the ground. Numerical simulation of the nonlinear behavior of such mega-shell concrete structures is a challenging and computationally expensive task. A general finite element approach to simulating reinforced concrete bearing behavior is presented and verified on a simply supported beam, as is the submodeling technique. The verified numerical approach is then extended and transferred to the more complex chimney structure of a SUPP. The obtained results prove the reliability of the submodeling technique in analyzing critical regions of simple and complex mega concrete structures with high accuracy and a dramatic decrease in computation time.

Keywords: ABAQUS, nonlinear analysis, submodeling, SUPP

Procedia PDF Downloads 219
4745 Effects of Non-Motorized Vehicles on a Selected Intersection in Dhaka City for Non Lane Based Heterogeneous Traffic Using VISSIM 5.3

Authors: A. C. Dey, H. M. Ahsan

Abstract:

Heterogeneous traffic, composed of both motorized and non-motorized vehicles, is a common feature of urban Bangladeshi roads. Popular non-motorized vehicles include rickshaws, rickshaw vans, and bicycles. These modes perform an important role in moving people and goods in the absence of a dependable mass transport system. Rickshaws in particular play a major role in meeting the demand for door-to-door public transport services for city dwellers. However, there is no separate lane for non-motorized vehicles in the city. Non-motorized vehicles generally occupy the outermost or curb-side lanes, but at intersections they mix with motorized vehicles, which is why conventional models fail to analyze the situation completely. The microscopic traffic simulation software VISSIM 5.3 is itself lane-based, but its default behavioral parameters [such as driving behavior, lateral distances, overtaking tendency, CC0 = 0.4 m, CC1 = 1.5 s] were modified to calibrate a model for analyzing the effects of non-motorized traffic at an intersection (Mirpur-10) under non-lane-based mixed traffic conditions. Field data show that NMVs make up an average of 20% of the total number of vehicles on almost all the link roads. Due to the large share of non-motorized vehicles, capacity drops significantly. Analysis of the raw simulation data reveals considerable variation: the average vehicular speed is reduced by 25% and the number of vehicles decreases by 30% solely due to the presence of NMVs, while lateral occupancy and queue delay time increase by 2.37% and 33.75%, respectively. These results clearly show the negative effects of non-motorized vehicles on capacity at an intersection. Special management techniques or restriction of NMVs at major intersections may therefore be an effective solution to improve the existing critical condition.

Keywords: lateral occupancy, non lane based intersection, nmv, queue delay time, VISSIM 5.3

Procedia PDF Downloads 155
4744 Intelligent Process and Model Applied for E-Learning Systems

Authors: Mafawez Alharbi, Mahdi Jemmali

Abstract:

E-learning is a developing area, especially in education, and can provide several benefits to learners. An intelligent system that collects all the components satisfying user preferences is therefore important. This research presents an approach capable of personalizing e-information and giving users what they need according to their preferences. The proposed system builds up knowledge from successive evaluations made by the user and, in addition, learns from the user's habits. Finally, we present a walk-through to demonstrate how the intelligent process works.

Keywords: artificial intelligence, architecture, e-learning, software engineering, processing

Procedia PDF Downloads 191
4743 Assessing Image Quality in Mobile Radiography: A Phantom-Based Evaluation of a New Lightweight Mobile X-Ray Equipment

Authors: May Bazzi, Shafik Tokmaj, Younes Saberi, Mats Geijer, Tony Jurkiewicz, Patrik Sund, Anna Bjällmark

Abstract:

Mobile radiography, employing portable X-ray equipment, has become a routine procedure within hospital settings, with chest X-rays in intensive care units standing out as the most prevalent mobile X-ray examinations. This approach is not limited to hospitals alone, as it extends its benefits to imaging patients in various settings, particularly those too frail to be transported, such as elderly care residents in nursing homes. Moreover, the utility of mobile X-ray is not confined solely to traditional healthcare recipients; it has proven to be a valuable resource for vulnerable populations, including the homeless, drug users, asylum seekers, and patients with multiple co-morbidities. Mobile X-rays reduce patient stress, minimize costly hospitalizations, and offer cost-effective imaging. While studies confirm their reliability, further research is needed, especially regarding image quality. Recent advancements in lightweight equipment with enhanced battery and detector technology provide the potential for nearly handheld radiography. The main aim of this study was to evaluate a new lightweight mobile X-ray system with two different detectors and compare the image quality with that of a modern stationary system. Methods: A total of 74 images of the chest (chest anterior-posterior (AP) views and chest lateral views) and pelvic/hip region (AP pelvis views, hip AP views, and hip cross-table lateral views) were acquired on a whole-body phantom (Kyotokagaku, Japan), utilizing varying image parameters. These images were obtained using a stationary system - 18 images (Mediel, Sweden), a mobile X-ray system with a second-generation detector - 28 images (FDR D-EVO II; Fujifilm, Japan), and a mobile X-ray system with a third-generation detector - 28 images (FDR D-EVO III; Fujifilm, Japan). Image quality was assessed by visual grading analysis (VGA), a method that measures image quality by rating the visibility and accurate reproduction of anatomical structures within the images.
A total of 33 image criteria were used in the analysis. A panel of two experienced radiologists, two experienced radiographers, and two final-term radiography students evaluated the image quality on a 5-grade ordinal scale using the software ViewDEX 3.0 (Viewer for Digital Evaluation of X-ray images, Sweden). Data were analyzed using visual grading characteristics analysis. The dose was measured by the dose-area product (DAP) reported by the respective systems. Results: The mobile X-ray equipment (both detectors) showed significantly better image quality than the stationary equipment for the pelvis, hip AP, and hip cross-table lateral images, with AUCVGA values ranging from 0.64 to 0.92, while the chest images showed mixed results. The number of images rated as having sufficient quality for diagnostic use was significantly higher for mobile X-ray generations 2 and 3 than for the stationary X-ray system. The DAP values were higher for the stationary system than for the mobile system. Conclusions: The new lightweight radiographic equipment provided image quality at least as good as that of a fixed system at a lower radiation dose. Future studies should focus on clinical images and consider radiographers' viewpoints for a comprehensive assessment.

Keywords: mobile x-ray, visual grading analysis, radiographer, radiation dose

Procedia PDF Downloads 65
4742 An ab initio Study of the Structural, Elastic, Electronic, and Optical Properties of the Perovskite ScRhO3

Authors: L. Foudia, K. Haddadi, M. Reffas

Abstract:

A first principles study of the structural, elastic, electronic and optical properties of the monoclinic perovskite-type ScRhO₃ is reported using the pseudo-potential plane wave method within the local density approximation. The calculated lattice parameters, including the lattice constants and angle β, are in excellent agreement with the available experimental data, proving the reliability of the chosen theoretical approach. The pressure dependence up to 20 GPa of the single-crystal and polycrystalline elastic constants has been investigated in detail using the strain-stress approach. The mechanical stability, ductility, average elastic wave velocity, Debye temperature and elastic anisotropy were also assessed. The electronic band structure and density of states (DOS) demonstrate its semiconducting nature, showing a direct band gap of 1.38 eV. Furthermore, several optical properties, such as the absorption coefficient, reflectivity, refractive index, dielectric function, optical conductivity and electron energy loss function, have been calculated for radiation up to 40 eV.

Keywords: ab-initio, perovskite, DFT, band gap

Procedia PDF Downloads 80
4741 Simulation Research of the Aerodynamic Drag of 3D Structures for Individual Transport Vehicle

Authors: Pawel Magryta, Mateusz Paszko

Abstract:

In today's world, individual mobility is a big problem, especially in large urban areas. Commonly used modes of transport such as buses, trains or cars do not fulfill their tasks; they are not able to meet the increasing mobility needs of the growing urban population. In addition, there are limitations on the construction of civil infrastructure in cities. The most common idea nowadays is to transfer part of urban transport to the air. To do this, however, an individual flying transport vehicle needs to be developed. The biggest problem in this concept is the type of propulsion system from which the vehicle will obtain its lifting force. Standard propeller drives appear to be too noisy. One idea is to provide the required take-off and flight power using an innovative ejector system. Such a system is designed through a suitable choice of three-dimensional geometric structure, with a specially shaped nozzle, in order to generate overpressure. The authors' idea is to make a device that cumulates the overpressure using a five-sided geometrical structure bounded on one side by the blowing flow of an air jet. In order to test this hypothesis, a computer simulation study of the aerodynamic drag of such 3D structures was carried out. Based on the results of these studies, tests on a real model were also performed. The final stage of the work was a comparative analysis of the simulation results and the real tests. The CFD simulations of the air flow were conducted using the Star CD - Star Pro 3.2 software. The virtual model was designed using the Catia v5 software. Apart from the objective of obtaining an advanced aviation propulsion system, all of the tests and modifications of the 3D structures were also aimed at achieving high efficiency of the device while maintaining the ability to generate high overpressures.
This was possible only with a large mass flow rate of air. All these aspects could be verified using CFD methods by observing the flow of the working medium in the tested model. During the simulation tests, the distribution and magnitude of the pressure and velocity vectors were analyzed. Simulations were made with different boundary conditions (supply air pressure) but with fixed external conditions (ambient temperature, ambient pressure, etc.). The maximum overpressure obtained was 2 kPa. This value is too low to exploit the device for an individual transport vehicle. Both the simulation model and the real object show a linear dependence of the obtained overpressure on the geometrical parameters of the three-dimensional structures. The application of computational software greatly simplifies and streamlines the design and simulation process. This work was financed by the Polish Ministry of Science and Higher Education.

Keywords: aviation propulsion, CFD, 3d structure, aerodynamic drag

Procedia PDF Downloads 310
4740 Adjusted LOLE and EENS Indices for the Consideration of Load Excess Transfer in Power Systems Adequacy Studies

Authors: François Vallée, Jean-François Toubeau, Zacharie De Grève, Jacques Lobry

Abstract:

When evaluating the capacity of a generation park to cover the load in transmission systems, the traditional Loss of Load Expectation (LOLE) and Expected Energy Not Served (EENS) indices can be used. While these indices quantify the annual duration and severity of situations in which the load is not covered, they do not take into account the fact that the load excess is generally shifted from one shortage state (hour or quarter of an hour) to the following one. In this paper, a sequential Monte Carlo framework is introduced to compute adjusted LOLE and EENS indices. In practice, these adapted indices make it possible to consider the effect of load excess transfer on the global adequacy of a generation park, thus providing a more accurate evaluation of this quantity.
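The exact form of the adjusted indices is given in the paper; one plausible reading of the load-excess transfer, carrying each period's uncovered load into the following period, can be sketched with a toy sequential Monte Carlo year (all capacities and loads below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
n_hours = 8760

# Toy system: a 100 MW park derated by 30% in about 5% of hours,
# facing a load around 80 MW (purely illustrative numbers)
available = 100.0 * rng.choice([1.0, 0.7], size=n_hours, p=[0.95, 0.05])
load = 80.0 + 10.0 * rng.standard_normal(n_hours)

# Classical indices: each hour is judged independently
deficit = np.maximum(load - available, 0.0)
lole = int(np.count_nonzero(deficit))   # hours/year with load not covered
eens = float(deficit.sum())             # MWh/year not served

# Adjusted indices: each hour's uncovered load is shifted to the next hour
carry = 0.0
adj_deficit = np.zeros(n_hours)
for h in range(n_hours):
    adj_deficit[h] = max(load[h] + carry - available[h], 0.0)
    carry = adj_deficit[h]
adj_lole = int(np.count_nonzero(adj_deficit))
adj_eens = float(adj_deficit.sum())
```

Because the carried-over load can only add to later shortfalls, the adjusted indices are never smaller than the classical ones, which is the effect the paper sets out to capture.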

Keywords: expected energy not served, loss of load expectation, Monte Carlo simulation, reliability, wind generation

Procedia PDF Downloads 410
4739 Investigation of the Corroded Steel Beam

Authors: Hesamaddin Khoshnoodi, Ahmad Rahbar Ranji

Abstract:

Corrosion in steel structures is one of the most important issues to be considered in design and construction. Corrosion reduces the cross section and load capacity of an element and leads to costly damage to structures. In this paper, corrosion has been modeled for moment stresses, and the steel beam has been modeled using the ABAQUS advanced finite element software. The conclusions of this study demonstrate that the displacement of the analyzed composite steel girder bridge may increase.

Keywords: Abaqus, Corrosion, deformation, Steel Beam

Procedia PDF Downloads 354
4738 Strengths and Weaknesses of Tally, an LCA Tool for Comparative Analysis

Authors: Jacob Seddlemeyer, Tahar Messadi, Hongmei Gu, Mahboobeh Hemmati

Abstract:

The main purpose of the first tier of this study is to quantify and compare the embodied environmental impacts associated with alternative materials applied to Adohi Hall, a residence building on the University of Arkansas campus in Fayetteville, AR. This 200,000 square foot building has 5 stories built with mass timber and is compared to another scenario where the same edifice is built with a steel frame. Based on the defined goal and scope of the project, the materials of the two building options are compared in terms of Global Warming Potential (GWP) from cradle to the construction site, which includes the material manufacturing stage (raw material extraction, processing, supply, transport, and manufacture) plus transportation to the site (modules A1-A4, based on the standard EN 15804 definition). The fossil fuels consumed and the CO2 emitted in association with buildings are the major reason for their climate change impacts. In this study, GWP is assessed to the exclusion of other environmental factors. The second tier of this work is to evaluate Tally's performance in the decision-making process through the design phases, as well as to determine its strengths and weaknesses. Tally is a Life Cycle Assessment (LCA) tool capable of conducting a cradle-to-grave analysis. As opposed to other software applications, Tally is specifically targeted at building LCA. As a peripheral application, this software tool runs directly within the core modeling platform Revit. This unique functionality makes Tally stand out from other similar tools for LCA analysis in the building sector. The results of this study also provide insights for making more environmentally efficient decisions in the built environment and help the move forward to reduce greenhouse gas (GHG) emissions and mitigate GWP.

Keywords: comparison, GWP, LCA, materials, tally

Procedia PDF Downloads 226
4737 Performance Analysis of Heterogeneous Cellular Networks with Multiple Connectivity

Authors: Sungkyung Kim, Jee-Hyeon Na, Dong-Seung Kwon

Abstract:

Future mobile networks beyond the 5th generation will be characterized by one-thousand-fold gains in capacity, connections for at least one hundred billion devices, and a user experience with extremely low latency and response times. To approach these capacity requirements with higher reliability, advanced technologies have been studied, such as multiple connectivity, small cell enhancement, heterogeneous networking, and advanced interference and mobility management. This paper focuses on multiple connectivity in heterogeneous cellular networks. We investigate the performance of coverage and user throughput in several deployment scenarios. Using the stochastic geometry approach, the SINR distributions and the coverage probabilities are derived for the case of dual connectivity. Also, to compare the user throughput enhancement among the deployment scenarios, we calculate the spectral efficiency and discuss our results.
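
A stochastic-geometry coverage analysis of this kind can also be approximated numerically. The sketch below is a minimal Monte Carlo estimate of single-connection downlink coverage for Poisson-distributed base stations with Rayleigh fading, under assumed parameter values; it illustrates the modeling approach, not the paper's dual-connectivity derivation:

```python
import math
import random

def _poisson(mean, rng):
    # Knuth's multiplicative method (adequate for moderate means)
    limit, k, p = math.exp(-mean), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def coverage_probability(density, alpha, threshold,
                         radius=20.0, trials=4000, seed=7):
    """Monte Carlo estimate of P(SIR > threshold) for a user at the origin:
    base stations form a Poisson point process of the given density on a
    disk, the user attaches to the nearest one, powers see Rayleigh fading
    and path-loss exponent alpha (interference-limited; noise ignored)."""
    rng = random.Random(seed)
    mean_points = density * math.pi * radius ** 2
    covered = 0
    for _ in range(trials):
        n = max(1, _poisson(mean_points, rng))
        # (distance, fading power) for each base station, uniform on the disk
        nodes = [(radius * math.sqrt(rng.random()), rng.expovariate(1.0))
                 for _ in range(n)]
        r0, h0 = min(nodes)                  # nearest base station serves
        signal = h0 * r0 ** -alpha
        interference = sum(h * r ** -alpha for r, h in nodes) - signal
        if interference == 0 or signal > threshold * interference:
            covered += 1
    return covered / trials

p = coverage_probability(density=0.02, alpha=4.0, threshold=1.0)
print(p)  # estimate in (0, 1); the known closed form for alpha=4, T=1 is about 0.56
```

The finite disk truncates far-field interference, so the estimate sits slightly above the infinite-plane closed-form value.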

Keywords: heterogeneous networks, multiple connectivity, small cell enhancement, stochastic geometry

Procedia PDF Downloads 331
4736 In-door Localization Algorithm and Appropriate Implementation Using Wireless Sensor Networks

Authors: Adeniran K. Ademuwagun, Alastair Allen

Abstract:

The dependence between RSS and distance in an enclosed environment is an important consideration because it is a factor that can influence the reliability of any localization algorithm founded on RSS. Several algorithms effectively reduce the variance of RSS to improve localization accuracy. Our proposed algorithm essentially avoids this pitfall and is consequently highly adaptable in the face of erratic radio signals. Using 3 anchors in close proximity to each other, we are able to establish that RSS can be used as a reliable indicator for localization with an acceptable degree of accuracy. Inherent in this concept is the ability of each prospective anchor to validate (guarantee) the position or the proximity of the other 2 anchors involved in the localization, and vice versa. This procedure ensures that the uncertainties of radio signals due to multipath effects in enclosed environments are minimized. A major driver of this idea is the implicit topological relationship among sensors due to raw radio signal strength. The algorithm is an area-based algorithm; however, it does not trade accuracy for precision (i.e., the size of the returned area).
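
One common area-based, RSS-driven scheme consistent with the description above is a weighted centroid: each anchor pulls the position estimate toward itself in proportion to its signal strength. The sketch below assumes a log-distance path-loss model with hypothetical anchor positions and RSS values; it illustrates the general approach, not the authors' specific algorithm:

```python
import math

def rss_to_distance(rss_dbm, ref_dbm=-40.0, path_loss_exp=2.5):
    """Invert a log-distance path-loss model to an estimated distance (m).
    ref_dbm is the assumed RSS at 1 m; both parameters are assumptions."""
    return 10 ** ((ref_dbm - rss_dbm) / (10 * path_loss_exp))

def weighted_centroid(anchors, rss_readings):
    """Estimate a node position as the centroid of anchor positions,
    weighted by inverse estimated distance (stronger RSS pulls harder)."""
    weights = [1.0 / max(rss_to_distance(r), 1e-9) for r in rss_readings]
    total = sum(weights)
    x = sum(w * ax for w, (ax, _) in zip(weights, anchors)) / total
    y = sum(w * ay for w, (_, ay) in zip(weights, anchors)) / total
    return x, y

anchors = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]  # hypothetical layout (m)
rss = [-52.0, -60.0, -60.0]                      # node closest to anchor 0
print(weighted_centroid(anchors, rss))           # estimate pulled toward (0, 0)
```

Because every anchor contributes, a single erratic RSS reading shifts the estimate rather than breaking it, which is the adaptability the abstract alludes to.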

Keywords: anchor nodes, centroid algorithm, communication graph, radio signal strength

Procedia PDF Downloads 508
4735 The Effectiveness of Synthesizing A-Pillar Structures in Passenger Cars

Authors: Chris Phan, Yong Seok Park

Abstract:

The Toyota Camry is one of the best-selling cars in America. It is economical, reliable, and, most importantly, safe. These attributes have made the Camry a trustworthy choice for buyers seeking a dependable vehicle. However, a new finding called the Camry’s safety into question. Since 1997, the Camry had received a “good” rating on its moderate overlap front crash test from the Insurance Institute for Highway Safety (IIHS). In 2012, the IIHS introduced a frontal small overlap crash test into its overall evaluation of vehicle occupant safety. The 2012 Camry received a “poor” rating on this new test, while the 2015 Camry redeemed itself with a “good” rating once again. This study aims to find a possible solution that Toyota implemented to reduce the severity of a frontal small overlap crash in the Camry during a mid-cycle update. The purpose of this study is to analyze and evaluate the performance of various A-pillar shapes as energy-absorbing structures in improving passenger safety in a frontal crash. First, the A-pillar structures of the 2012 and 2015 Camry were modeled using CAD software, namely SolidWorks. Then, a crash test simulation using ANSYS software was applied to the A-pillars to analyze the behavior of the structures under similar conditions. Finally, the results were compared to safety values of cabin intrusion to determine the crashworthy behavior of both A-pillar structures by measuring total deformation. This study highlights that Toyota possibly improved the shape of the A-pillar in the 2015 Camry in order to receive a “good” rating from the IIHS safety evaluation once again. These findings can potentially be used to increase safety performance in future vehicles and decrease passenger injury or fatality.

Keywords: A-pillar, crashworthiness, design synthesis, finite element analysis

Procedia PDF Downloads 119
4734 Development of an Instrument: The Contemporary Adolescent Well-Being Scale (CAWBS)

Authors: Camille Rault, Mark Bahr

Abstract:

The aim of the present study was to develop a contemporaneous instrument measuring adolescents’ subjective well-being (SWB). The instrument development underwent a three-phase pilot study. Phase one (N = 31) used a qualitative approach to generate domains of SWB relevant to adolescents. During the second phase (N = 22), items targeting these domains were tested. Finally, the third phase (N = 22) assisted in the addition, deletion, and refinement of items according to the first two phases of the pilot. A total of 49 items were retained for the final version of the instrument. The Contemporary Adolescent Well-Being Scale (CAWBS) was administered to 1071 school children (599 girls) aged ten to 18 years (M = 14.70; SD = 1.45) from Queensland, Australia. Results confirmed the hypothesized seven-factor construct, which explained 45% of the variance. The questionnaire pertained to seven domains of adolescents’ SWB, namely: overall life satisfaction; bullying; body image; social connectedness; activities; control appraisal; and negative feelings. Reliability was shown to be acceptable, with Cronbach’s alpha ranging from .58 to .89. Future research should refine the CAWBS and investigate the psychometric properties of this instrument.
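
The reported reliability figures rest on Cronbach's alpha, which can be computed directly from the item variances and the variance of respondents' total scores. The data below are toy values for illustration only:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list of
    respondent scores per item): alpha = k/(k-1) * (1 - sum(var_i)/var_total)."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per respondent
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three toy items answered by five respondents (illustrative data only)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 3))  # 0.864
```

Values in the .58-.89 range reported above would come from running exactly this computation on each subscale's items.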

Keywords: adolescence, construct validity, instrument, subjective well-being

Procedia PDF Downloads 269
4733 Visitor's Perception toward Boating in Silver River, Florida

Authors: Hoda Manafian, Stephen Holland

Abstract:

Silver Springs is one of Florida's first tourist attractions. It is one of the largest artesian spring formations in the world, producing nearly 550 million gallons of crystal-clear water daily, and is one of the most popular sites for water-based leisure activities. As part of managing the use of the state park, the state is interested in establishing a baseline count of boating users to compare against the quality of the natural resources and environment in the park. Understanding the status of the environmental resources as well as the human recreational experience is the main objective of the project. The two main goals of the current study are: 1) to identify the distribution of different types of watercraft (kayak, canoe, motor boat, Jet Ski, paddleboard, and pontoon); 2) to document the level of actual crowding in the river during different seasons, months, and hours of each day, based on reliable information gained from a camera rather than the self-reported method used by tourists themselves in past studies (the innovative achievement of this study). In line with these objectives, on-site surveys and boat counts using a time-lapse camera at the Riverside launch were conducted during the 12 months of 2015. 700 on-site surveys of recreational users were conducted at three watercraft boat ramp sites (Rays Wayside, Riverside launch area, Ft. King Waterway). We used VirtualDub and ImageJ software for counting boats to meet the first and second goals, since these two programs can report the hour a watercraft was present on the water in addition to the number of users and the type of watercraft. The most crowded hours were between 9-11 AM from February to May, and the kayak was the most popular watercraft. The findings of this research can provide a good foundation for better management of this state park in the future.
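
The hourly-count analysis described above amounts to aggregating time-stamped sightings by hour and by watercraft type. A minimal sketch, assuming a hypothetical export format from the time-lapse footage (the rows shown are illustrative, not study data):

```python
from collections import Counter
from datetime import datetime

# Hypothetical frame log exported from the time-lapse camera: each row is
# (timestamp, watercraft type). Values are illustrative, not study data.
sightings = [
    ("2015-02-14 09:12", "kayak"),
    ("2015-02-14 09:47", "kayak"),
    ("2015-02-14 10:05", "pontoon"),
    ("2015-02-14 14:30", "motor boat"),
]

by_hour = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
                  for ts, _ in sightings)
by_type = Counter(kind for _, kind in sightings)

print(by_hour.most_common(1))  # busiest hour and its count
print(by_type.most_common(1))  # most popular watercraft and its count
```

Running the same aggregation over a full year of frames yields the season-by-season crowding profile the study reports.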

Keywords: eco-tourism, Florida state, visitors' perception, water-based recreation

Procedia PDF Downloads 247
4732 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. Further, with such capability, system applications can intelligently adapt their responses to changing conditions. Concerning business operations, this promises businesses that their business processes can run more intelligently, adaptively, and flexibly, and thereby improve customer experience, enhance the reliability of service delivery, or lower operational cost, making the business more competitive and sustainable. Aiming at realizing such context-aware business process management, this paper first explores its potential benefits and then identifies some gaps between current business process management support and what is expected. In addition, some preliminary solutions are discussed concerning context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented to give a conceptual architecture of a context-aware business process management system to guide system implementation.
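
Rule-based process execution, one of the preliminary solutions mentioned, can be sketched as an ordered list of context predicates mapped to process variants, evaluated first-match-wins. All rule and process names below are hypothetical:

```python
# A minimal sketch of rule-based process adaptation: each rule maps a
# context predicate to a process variant. All names are hypothetical.
RULES = [
    (lambda ctx: ctx.get("warehouse_stock", 1) == 0, "backorder_process"),
    (lambda ctx: ctx.get("customer_tier") == "premium", "express_process"),
    (lambda ctx: True, "standard_process"),  # default fallback
]

def select_process(context):
    """Return the first process variant whose context rule fires."""
    for predicate, process in RULES:
        if predicate(context):
            return process

print(select_process({"customer_tier": "premium"}))  # express_process
```

Run-time process evolution would then amount to editing this rule table while instances are in flight, which is where the gaps the paper identifies become visible.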

Keywords: business process adaptation, business process evolution, business process modelling, context awareness

Procedia PDF Downloads 413
4731 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will also be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
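
The empirical copula that several of these approximations start from is straightforward to compute: replace each observation by its normalized ranks and count how many rank pairs fall below a given point. A minimal sketch (the comonotonic toy sample is for illustration only):

```python
def empirical_copula(sample, u, v):
    """Empirical copula C_n(u, v): the fraction of observations whose
    normalized ranks (rank / n) fall at or below (u, v)."""
    n = len(sample)
    xs = sorted(x for x, _ in sample)
    ys = sorted(y for _, y in sample)
    def nrank(sorted_vals, val):
        # number of observations <= val, scaled into (0, 1]
        return sum(1 for s in sorted_vals if s <= val) / n
    return sum(1 for x, y in sample
               if nrank(xs, x) <= u and nrank(ys, y) <= v) / n

# Comonotonic toy sample: the ranks agree, so C_n approaches min(u, v)
sample = [(i, 2 * i) for i in range(1, 11)]
print(empirical_copula(sample, 0.5, 0.5))  # 0.5
```

Bernstein polynomial or least-squares approximations of the kind the paper proposes would then be fitted to this rank-based step function.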

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 74
4730 Composite Distributed Generation and Transmission Expansion Planning Considering Security

Authors: Amir Lotfi, Seyed Hamid Hosseini

Abstract:

During the recent past, due to increasing electrical energy demand and constraints on governmental resources for creating additional generation, transmission, and distribution capacity, privatization and restructuring of the electrical industry have been considered. As a result, in most countries, the generation, transmission, and distribution segments of the electrical industry have been separated in order to create competition. Considering these changes, along with environmental issues, energy growth, private-equity investment in generation units, and the difficulty of expanding transmission lines, distributed generation (DG) units have been used in power systems. Moreover, the reduced need for transmission and distribution, increased reliability, improved power quality, and reduced power loss have led to DG being placed in power systems. On the other hand, given the low liquidity requirement, private investors tend to spend their money on DGs. In this project, the main goal is to offer an algorithm for planning and placing DGs in order to reduce the need for the transmission and distribution network.
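
The loss-reduction rationale for DG placement can be illustrated on a toy radial feeder: a DG injection at a bus cancels load current on every upstream segment, reducing I²R loss there. The sketch below uses illustrative per-unit numbers and is not the paper's planning algorithm:

```python
# Toy radial feeder for loss-based DG siting (illustrative numbers only).
# Segment i connects bus i to bus i+1; the current through it equals the
# sum of net injections at all downstream buses.

def feeder_loss(segment_r, bus_loads, dg_bus=None, dg_output=0.0):
    """I^2 R loss of a radial feeder (per-unit currents, ohm resistances);
    a DG at dg_bus offsets dg_output of the local demand."""
    injections = list(bus_loads)
    if dg_bus is not None:
        injections[dg_bus] -= dg_output
    return sum(sum(injections[i:]) ** 2 * r
               for i, r in enumerate(segment_r))

segment_r = [0.05, 0.08, 0.08]   # segment resistances (ohm)
bus_loads = [1.0, 1.5, 2.0]      # per-unit load at buses 1..3

base = feeder_loss(segment_r, bus_loads)
best = min(range(len(bus_loads)),
           key=lambda b: feeder_loss(segment_r, bus_loads, b, 1.5))
print(round(base, 4), best)  # 2.3125 2 -- siting at the far bus helps most here
```

A full planning algorithm would add network constraints, security criteria, and cost terms, but the siting signal is this same marginal loss reduction.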

Keywords: planning, transmission, distributed generation, power security, power systems

Procedia PDF Downloads 480
4729 Improving Security Features of Traditional Automated Teller Machines-Based Banking Services via Fingerprint Biometrics Scheme

Authors: Anthony I. Otuonye, Juliet N. Odii, Perpetual N. Ibe

Abstract:

The obvious challenges faced by most commercial bank customers while using the services of ATMs (Automated Teller Machines) across developing countries have triggered the need for an improved system with better security features. Current ATM systems are password-based, and research has proved these systems vulnerable to heinous attacks and manipulations. Our research has found that the security of current ATM-assisted banking services in most developing countries is easily broken and maneuvered by fraudsters, chiefly because it is quite difficult for these systems to distinguish an impostor with privileged access from the authentic bank account owner. Again, PIN (Personal Identification Number) code passwords are easily guessed, to mention just a few of the obvious limitations of traditional ATM operations. In this research work, we have developed a system of fingerprint biometrics with PIN code authentication that seeks to improve the security features of traditional ATM installations as well as other banking services. The aim is to ensure better security at all ATM installations and raise the confidence of bank customers. It is hoped that our system will overcome most of the challenges of current password-based ATM operation if properly applied. The researchers made use of OOADM (Object-Oriented Analysis and Design Methodology), a software development methodology that assures proper system design using modern design diagrams. Implementation and coding were carried out using Visual Studio 2010 together with other software tools. Results obtained show a working system that provides two levels of security on the client’s side, using a fingerprint biometric scheme combined with the existing 4-digit PIN code, to guarantee the confidence of bank customers across developing countries.
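
The two-level client-side check described above can be sketched as PIN verification followed by fingerprint-template matching. Reducing fingerprint matching to a hash comparison is purely illustrative (a real matcher scores minutiae similarity), and all credentials below are hypothetical:

```python
import hashlib
import hmac

class AtmAuthenticator:
    """Two-level check: PIN first, then a fingerprint-template match.
    Template matching is reduced to a hash comparison purely for
    illustration; a real biometric matcher scores minutiae similarity."""

    def __init__(self, pin, fingerprint_template):
        self._pin_hash = hashlib.sha256(pin.encode()).hexdigest()
        self._fp_hash = hashlib.sha256(fingerprint_template).hexdigest()

    def verify(self, pin_attempt, fingerprint_scan):
        # constant-time comparisons to avoid leaking match length/timing
        pin_ok = hmac.compare_digest(
            self._pin_hash, hashlib.sha256(pin_attempt.encode()).hexdigest())
        fp_ok = hmac.compare_digest(
            self._fp_hash, hashlib.sha256(fingerprint_scan).hexdigest())
        return pin_ok and fp_ok  # both factors must pass

auth = AtmAuthenticator("4821", b"minutiae-template-bytes")
print(auth.verify("4821", b"minutiae-template-bytes"))  # True
print(auth.verify("0000", b"minutiae-template-bytes"))  # False
```

Requiring both factors is what defeats the guessed-PIN attack the abstract describes: a correct PIN alone no longer grants access.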

Keywords: fingerprint biometrics, banking operations, verification, ATMs, PIN code

Procedia PDF Downloads 42
4728 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults

Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane

Abstract:

Modular multilevel converters (MMC) exhibit a highly scalable and modular characteristic with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Different papers have been reviewed to seek the MMC configuration with the best fault capability. DC faults are the most common fault, while the probability of an AC fault occurring in an MMC is low; however, the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of half-bridge (HB) MMC topology was optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCLs).

Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme

Procedia PDF Downloads 139
4727 The Impact of Leadership Culture on Motivation, Efficiency, and Performance of Customs Employees: A Case Study of Iran Customs

Authors: Kazem Samadi

Abstract:

In today’s world, public agencies like customs have become vital institutions in international trade processes and in maintaining national economic security due to increasing economic and commercial complexities. In this regard, human resource management (HRM) is crucial to achieving organizational goals. This research employed a descriptive survey method in which the statistical population consisted of all customs employees. Using Cochran's formula, 300 employees were selected from the central customs office. A researcher-made questionnaire was used as the data collection tool, with content validity confirmed and reliability established using Cronbach's alpha coefficient. The collected data were analyzed through structural equation modeling using SPSS and AMOS 24. The results indicated that leadership culture significantly affected employee motivation, efficiency, and performance in customs. Customs managers and leaders in Iran can improve organizational productivity by fostering this culture, thereby facilitating individual and organizational development for their staff.

Keywords: leadership culture, organizational culture, employee performance, customs

Procedia PDF Downloads 19
4726 Investigation on the Energy Impact of Spatial Geometry in a Residential Building Using Building Information Modeling Technology

Authors: Shashank. S. Bagane, H. N. Rajendra Prasad

Abstract:

Building Information Modeling (BIM) has developed into a potent solution. The consistent development of BIM technology in the Architecture, Engineering, and Construction (AEC) industry has enhanced the effectiveness of construction and decision-making. However, growing global warming and the energy crisis have made building energy analysis an important factor to be considered in the AEC industry. Incorporating energy analysis in the planning and design phase of a structure has become a necessity. In the current construction industry, estimating energy usage and reducing its footprint is a high priority, and the industry is giving more prominence to sustainability alongside energy efficiency. This demand compels designers, planners, and engineers to inspect a building's sustainable performance throughout its life cycle. The current study primarily focuses on the energy consumption, space arrangement, and spatial geometry of a residential building. Residential structures in India are most commonly constructed in accordance with Vastu Shastra; Vastu designs are intended to integrate architecture with nature, utilizing geometric patterns, symmetry, and directional alignments. In the current study, a residential brick masonry structure is considered for BIM analysis. The architectural model of the structure will be created using Revit software; later, the orientation and spatial arrangement will be finalized based on Vastu principles. Furthermore, the structure will be investigated for the impact of building orientation and spatial arrangement on energy use, using Green Building Studio software. Based on the BIM analysis of the structure, the energy consumption of the various building orientations will be understood. A well-oriented building with a good spatial arrangement can save a considerable amount of energy throughout its life cycle and reduce the need for heating and lighting, which diminishes energy usage and improves the energy efficiency of the residential building.

Keywords: building information modeling, energy impact, spatial geometry, vastu

Procedia PDF Downloads 161
4725 Influence of Degassing on the Curing Behaviour and Void Occurrence Properties of Epoxy / Anhydride Resin System

Authors: Latha Krishnan, Andrew Cobley

Abstract:

Epoxy resin is the most widely used matrix for composites in aerospace, automotive, and electronic applications due to its outstanding mechanical properties. These properties are chiefly predetermined by the chemical structure of the prepolymer and the type of hardener, but can also be varied by processing conditions such as prepolymer and hardener mixing, degassing, and curing conditions. In this research, the effect of degassing on the curing behaviour and void occurrence is experimentally evaluated for an epoxy/anhydride resin system. The epoxy prepolymer was mixed with an anhydride hardener and accelerator in appropriate quantities. In order to investigate the effect of degassing on the curing behaviour and void content of the resin, uncured resin samples were prepared using three different methods: 1) no degassing, 2) degassing of the prepolymer, and 3) degassing of the mixed solution of prepolymer and hardener with an accelerator. The uncured resins were tested in a differential scanning calorimeter (DSC) to observe the changes in curing behaviour among the three resin samples by analysing factors such as gel temperature, peak cure temperature, and heat of reaction/heat flow during curing. Additionally, the fully cured samples were tested by DSC to identify changes in the glass transition temperature (Tg) among the three samples. In order to evaluate the effect of degassing on the void content and morphology of the cured epoxy resin, the fractured surfaces of the cured epoxy resin were examined under a scanning electron microscope (SEM). In addition, the number of voids, void geometry, and void fraction were also investigated using an optical microscope and ImageJ (image analysis) software. It was found that degassing at different stages of resin mixing had significant effects on properties such as the glass transition temperature, void content, and void size of the epoxy/anhydride resin system. For example, degassing applied under vacuum to the mixed resin yielded a higher glass transition temperature (Tg) with lower void content.

Keywords: anhydride epoxy, curing behaviour, degassing, void occurrence

Procedia PDF Downloads 216