Search results for: Long Chen
5470 Reducing the Negative Effects of Infrastructure Deficit through Continuity in Governance
Authors: Edoghogho Ogbeifun, Charles Mbohwa, J. H. C. Pretorius
Abstract:
An effective infrastructure development scheme, well planned and executed, has a positive influence on the stock of infrastructure available to meet the immediate and expansion needs of an organization, and contributes to the overall economic development of the nation, community, or local entity hosting the infrastructure. It is noteworthy, however, that an infrastructure development scheme spans a long time frame, usually longer than the political life of the administration that initiates it. In most circumstances, execution may start and reach different levels of completion; at best, only a limited number of projects are completed and put into functional use during the life of the administration that initiated the scheme. When there is a change in leadership, many of the uncompleted projects are usually abandoned. The new administration repeats the cycle of its predecessors and develops another set of infrastructure schemes, which suffer a similar fate; the landscape thus becomes dotted with many uncompleted projects, leading to infrastructure deficit. This cycle will continue unless each succeeding leader sees governance as a single continuum. Therefore, infrastructure projects not completed by one administration should be continued by the succeeding administration, in order to increase the stock of relevant infrastructure available for the smooth operation of the organization, enhance the needed development, and reduce the negative effects of infrastructure deficit. A qualitative single case study was adopted to investigate the actions of the administrations of three successive Vice-Chancellors in a higher education institution in Nigeria over a longitudinal period of twelve years, with a view to exploring the effects of each administration on the development and execution of infrastructure projects, with particular interest in abandoned projects.
The findings revealed that although two of the Vice-Chancellors were committed to infrastructure upgrades, they executed more new projects than they completed abandoned ones, while the current leader has shown more pragmatism in completing abandoned projects alongside constructing new ones, thus demonstrating the importance of continuity in governance. In this regard, there is a steady increase in the stock of infrastructure to accommodate the expansion of existing academic programmes, host new ones, and reduce the negative effects of infrastructure deficit caused by abandoned projects.
Keywords: abandoned projects, continuity of governance, infrastructure development scheme, long time frame
Procedia PDF Downloads 183
5469 Health Risk Assessment of Trihalogenmethanes in Drinking Water
Authors: Lenka Jesonkova, Frantisek Bozek
Abstract:
Trihalogenmethanes (THMs) are disinfection byproducts with non-carcinogenic and genotoxic effects. Contamination at 6 sites close to the water treatment plant has been monitored in the second-largest city of the Czech Republic. A health risk assessment covering both non-carcinogenic and genotoxic risk for long-term exposure was carried out using the critical concentrations. Concentrations of trihalogenmethanes met national standards in all samples. The risk assessment proved that the health risks from trihalogenmethanes are acceptable at each site.
Keywords: drinking water, health risk assessment, trihalogenmethanes, water pollution
Procedia PDF Downloads 520
5468 Full Length Transcriptome Sequencing and Differential Expression Gene Analysis of Hybrid Larch under PEG Stress
Authors: Zhang Lei, Zhao Qingrong, Wang Chen, Zhang Sufang, Zhang Hanguo
Abstract:
Larch is the main afforestation and timber tree species in Northeast China, and drought is one of the main factors limiting the growth of larch and other organisms there. In order to further explore the mechanism of larch drought resistance, PEG was used to simulate drought stress. Full-length sequencing of larch embryogenic callus under PEG-simulated drought stress was carried out by combining Illumina HiSeq and SMRT-seq. A total of 20.3 Gb of clean reads and 786,492 CCS reads were obtained from the second- and third-generation sequencing. lncRNA prediction was performed on the de-redundant transcript sequences, yielding 2,083 lncRNAs, and target gene prediction yielded a total of 2,712 target genes. The de-redundant transcripts were further screened, and 1,654 differentially expressed genes (DEGs) were obtained. These DEGs respond to drought stress in different ways, such as the oxidation-reduction process, starch and sucrose metabolism, plant hormone pathways, carbon metabolism, and the lignin catabolic/biosynthetic process. This study provides basic full-length sequencing data for the study of larch drought resistance and identifies a large number of DEGs responding to drought stress, which helps us to further understand the function of larch drought resistance genes and provides a reference for in-depth analysis of the molecular mechanism of larch drought resistance.
Keywords: larch, drought stress, full-length transcriptome sequencing, differentially expressed genes
Procedia PDF Downloads 173
5467 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection
Authors: Rubin Dan, Xingcai Wang, Ziyang Chen
Abstract:
A chaotic time series noise reduction method based on the fusion of the local projection method, wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition due to noise when recognizing chaotic time series containing noise. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components so as to retain as much signal information as possible, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated and verified, and the phase space, SNR value, RMSE value, and K value of the 0-1 test method before and after noise reduction are compared and analyzed for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, which shows that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to classical systems to evaluate the noise reduction effect of the four methods and the identification of the original system, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising
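The K value referred to above comes from the 0-1 test for chaos. As a minimal illustration of how that statistic is computed (a standard Gottwald-Melbourne sketch, not the authors' implementation), the following estimates K for a chaotic logistic-map orbit and for a periodic signal:

```python
import numpy as np

def zero_one_test(phi, c):
    """0-1 test for chaos: K near 1 suggests chaos, K near 0 regular dynamics."""
    n = len(phi)
    j = np.arange(1, n + 1)
    p = np.cumsum(phi * np.cos(j * c))   # translation variables driven by the series
    q = np.cumsum(phi * np.sin(j * c))
    ncut = n // 10
    ns = np.arange(1, ncut + 1)
    # mean-square displacement of (p, q) for each lag
    msd = np.array([np.mean((p[k:] - p[:-k]) ** 2 + (q[k:] - q[:-k]) ** 2)
                    for k in ns])
    # modified MSD: subtract the known oscillatory term
    d = msd - np.mean(phi) ** 2 * (1 - np.cos(ns * c)) / (1 - np.cos(c))
    return np.corrcoef(ns, d)[0, 1]      # growth-rate correlation K

def k_statistic(phi, n_c=20, seed=0):
    """Median K over random frequencies c, as is customary for robustness."""
    rng = np.random.default_rng(seed)
    cs = rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c)
    return np.median([zero_one_test(phi, c) for c in cs])

# chaotic example: logistic map at r = 4; regular example: a period-7 signal
x = np.empty(2000)
x[0] = 0.3
for i in range(1999):
    x[i + 1] = 4 * x[i] * (1 - x[i])
print(k_statistic(x))                                          # close to 1
print(k_statistic(np.cos(2 * np.pi * np.arange(2000) / 7)))    # close to 0
```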
Procedia PDF Downloads 199
5466 Play-Based Approaches to Stimulate Language
Authors: Sherri Franklin-Guy
Abstract:
The emergence of language in young children has been well documented, and play-based activities that support its continued development have been utilized in the clinic-based setting. Speech-language pathologists have long used such activities to stimulate the production of language in children with speech and language disorders via modeling and elicitation tasks. This presentation will examine the importance of play in the development of language in young children, including social and pragmatic communication. Implications for clinicians and educators will be discussed.
Keywords: language development, language stimulation, play-based activities, symbolic play
Procedia PDF Downloads 241
5465 The Research of the Relationship between Triathlon Competition Results with Physical Fitness Performance
Authors: Chen Chan Wei
Abstract:
The purpose of this study was to investigate the impact of a 1500 m swim, a 10000 m run, VO2 max, and body fat on Olympic-distance triathlon competition performance. The subjects were thirteen college triathletes with endurance training, with an average age, height and weight of 20.61±1.04 years (mean ± SD), 171.76±8.54 cm and 65.32±8.14 kg, respectively. All subjects were required to take the 1500 m swim, 10000 m run, VO2 max and body fat tests and to participate in an Olympic-distance triathlon competition. First, the 1500 m swim test was taken in a standardized 50 m pool with a depth of 2 m, and the 10000 m run test on a standardized 400 m track. After three days, VO2 max was tested with the MetaMax 3B and body fat was measured with a DEXA machine. After two weeks, all 13 subjects joined the Olympic-distance triathlon competition at the 2016 New Taipei City Asian Cup. The relationships between the 1500 m swim, 10000 m run, VO2 max and body fat tests and Olympic-distance triathlon competition performance were evaluated using Pearson's product-moment correlation. The results show that the 10000 m run and body fat had significant positive correlations with Olympic-distance triathlon performance (r=.830, .768), while VO2 max had a significant negative correlation with it (r=-.735). In conclusion, to improve non-draft Olympic-distance triathlon performance, triathletes should focus more on running than on swimming training, and VO2 max can be measured to predict triathlon performance. Managing body fat can also improve Olympic-distance triathlon performance. In addition, swimming performance was not significantly correlated with Olympic-distance triathlon performance, possibly because the 2016 New Taipei City Asian Cup age-group race was not a drafting competition and the swim is the shortest component of an Olympic-distance triathlon. Therefore, in a non-draft competition, swimming ability is not significantly correlated with overall performance.
Keywords: triathletes, olympic, non-drafting, correlation
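For readers unfamiliar with the statistic used above, a minimal sketch of Pearson's product-moment correlation follows; the numbers are invented placeholder values for a small squad, not the study's measurements:

```python
import numpy as np

# Hypothetical data: 10000 m run times and triathlon finish times (minutes)
run_10k_min = np.array([38.5, 40.2, 37.1, 42.0, 39.4, 41.3, 36.8])
tri_time_min = np.array([125.0, 131.5, 121.2, 138.4, 128.1, 134.9, 119.8])

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

r = pearson_r(run_10k_min, tri_time_min)
print(round(r, 3))  # strongly positive: slower runners post slower triathlon times
```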
Procedia PDF Downloads 250
5464 Environmental Effect on Corrosion Fatigue Behaviors of Steam Generator Forging in Simulated Pressurized Water Reactor Environment
Authors: Yakui Bai, Chen Sun, Ke Wang
Abstract:
An experimental investigation of the environmental effect on the fatigue behavior of SA508 Gr.3 Cl.2 steam generator forging for the CAP1400 nuclear power plant has been carried out. In order to simulate actual loading conditions, a range of strain amplitudes was applied in different low cycle fatigue (LCF) tests at a strain rate of 0.01% s⁻¹. The current American Society of Mechanical Engineers (ASME) design fatigue code does not take full account of the interactions of environmental, loading, and material factors. A design fatigue model was constructed by taking environmentally assisted fatigue effects into account, and the corresponding design curves are given for the convenience of engineering applications. The corrosion fatigue experiment was performed in strain control mode in a 320 °C borated and lithiated water environment to evaluate the effects of the mixed environment on fatigue life. Stress corrosion cracking (SCC) in the steam generator large forging in the primary water of a pressurized water reactor was also observed. In addition, it was found that the corrosion fatigue (CF) life of SA508 Gr.3 Cl.2 decreases with increasing temperature in the water environment, and the relationship between the reciprocal of temperature and the logarithm of fatigue life was found to be linear. Through the experiments and subsequent analysis, the mechanisms of the reduced low cycle fatigue life have been investigated for the steam generator forging.
Keywords: failure behavior, low alloy steel, steam generator forging, stress corrosion cracking
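The reported linear relationship between the reciprocal of temperature and the logarithm of fatigue life can be recovered by an ordinary least-squares fit of ln(N_f) against 1/T. A hedged sketch with synthetic coefficients, invented for illustration rather than taken from the experiments:

```python
import numpy as np

# Assumed illustrative coefficients for ln(N_f) = a + b / T (not measured values)
a_true, b_true = 2.0, 1800.0                         # ln(cycles), Kelvin
T = np.array([423.0, 473.0, 523.0, 573.0, 593.0])    # test temperatures in K
n_f = np.exp(a_true + b_true / T)                    # noise-free synthetic lives

# Linear regression of log-life on reciprocal temperature
b_fit, a_fit = np.polyfit(1.0 / T, np.log(n_f), 1)   # slope, intercept
print(a_fit, b_fit)                                  # recovers a_true, b_true
```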
Procedia PDF Downloads 125
5463 Quorum-Sensing Driven Inhibitors for Mitigating Microbial Influenced Corrosion
Authors: Asma Lamin, Anna H. Kaksonen, Ivan Cole, Paul White, Xiao-Bo Chen
Abstract:
Microbiologically influenced corrosion (MIC) is a process in which microorganisms initiate, facilitate, or accelerate the electrochemical corrosion reactions of metallic components. Several reports have documented that MIC accounts for about 20 to 40% of the total cost of corrosion. Biofilm formation due to the presence of microorganisms on the surface of metal components is known to play a vital role in MIC, which can lead to severe consequences in various environmental and industrial settings. The quorum sensing (QS) system plays a major role in regulating biofilm formation and controls the expression of some microbial enzymes. QS is a communication mechanism between microorganisms that involves the regulation of gene expression in response to the microbial cell density within an environment. This process is employed by both Gram-positive and Gram-negative bacteria to regulate different physiological functions. QS involves the production of, detection of, and responses to signalling chemicals known as auto-inducers. QS controls specific processes important for the microbial community, such as biofilm formation, virulence factor expression, production of secondary metabolites, and stress adaptation mechanisms. The use of QS inhibitors (QSIs) has been proposed as a possible solution to biofilm-related challenges in many different applications. Although QSIs have demonstrated some strength in tackling biofouling, QSI-based strategies to control microbially influenced corrosion have not been thoroughly investigated. As such, our research aims to target QS mechanisms as a strategy for mitigating MIC on metal surfaces in engineered systems.
Keywords: quorum sensing, quorum quenching, biofilm, biocorrosion
Procedia PDF Downloads 90
5462 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines
Authors: Kamyar Tolouei, Ehsan Moosavi
Abstract:
In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value while simultaneously minimizing the risk of deviation from the production targets under grade uncertainty, subject to all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method called random forest is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results indicate that the proposed versions are considerably improved in comparison with the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-subgradient methods. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework displays the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP, and it can control geological risk more effectively than the traditional procedure by considering grade uncertainty in the hybrid model framework.
Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization
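A minimal sketch of the risk-based evaluation idea: computing the expected NPV and the downside deviation of one fixed schedule over a set of equally probable grade realizations. All figures below are invented for illustration, not the case-study data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 5-period schedule evaluated over 20 equally probable realizations
n_real, n_periods = 20, 5
tonnes = 1.0e6                                         # ore tonnes mined per period
grades = rng.normal(1.2, 0.15, (n_real, n_periods))    # g/t, one row per orebody realization
value_per_gram, cost_per_tonne, rate = 55.0, 40.0, 0.10  # assumed economics

disc = 1.0 / (1.0 + rate) ** np.arange(1, n_periods + 1)   # period discount factors
cash = (grades * value_per_gram - cost_per_tonne) * tonnes # cash flow per period
npv = (cash * disc).sum(axis=1)                            # NPV under each realization

expected_npv = npv.mean()
downside_risk = np.maximum(0.0, expected_npv - npv).mean() # mean shortfall vs. expectation
print(expected_npv, downside_risk)
```

In the stochastic framework described above, the optimizer would trade off `expected_npv` against a deviation penalty of this kind while choosing the schedule itself.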
Procedia PDF Downloads 105
5461 Feigenbaum Universality, Chaos and Fractal Dimensions in Discrete Dynamical Systems
Authors: T. K. Dutta, K. K. Das, N. Dutta
Abstract:
This paper is primarily concerned with Ricker's population model f(x) = x e^(r(1-x/k)), where r is the control parameter and k is the carrying capacity, and some fruitful results are obtained with the following objectives: 1) determination of the bifurcation values leading to a chaotic region, 2) development of the statistical methods and analysis required for the measurement of fractal dimensions, and 3) calculation of various fractal dimensions. The results also show that the invariant probability distribution on the attractor, when it exists, provides detailed information about the long-term behavior of a dynamical system. At the end, some open problems are posed for further research.
Keywords: Feigenbaum universality, chaos, Lyapunov exponent, fractal dimensions
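A quick numerical companion to the model: the largest Lyapunov exponent of the Ricker map, estimated as the orbit average of ln|f'(x)|. This is a standard sketch, with parameter values chosen only for illustration (not the paper's bifurcation values):

```python
import numpy as np

def ricker_lyapunov(r, k=1.0, x0=0.5, n=20000, burn=1000):
    """Lyapunov exponent of f(x) = x * exp(r * (1 - x / k)),
    estimated as the average of ln |f'(x)| along an orbit."""
    x, acc = x0, 0.0
    for i in range(n):
        deriv = np.exp(r * (1 - x / k)) * (1 - r * x / k)  # f'(x)
        if i >= burn:                                      # skip the transient
            acc += np.log(abs(deriv))
        x = x * np.exp(r * (1 - x / k))
    return acc / (n - burn)

print(ricker_lyapunov(1.5))  # negative: orbit settles on the stable fixed point x = k
print(ricker_lyapunov(3.0))  # positive: beyond the period-doubling cascade, chaotic
```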
Procedia PDF Downloads 302
5460 An Intelligent Cloud Radio Access Network (RAN) Architecture for Future 5G Heterogeneous Wireless Network
Authors: Jin Xu
Abstract:
5G network developers need to satisfy the requirements of additional capacity from massive numbers of users and spectrally efficient wireless technologies. Therefore, the significant amount of underutilized spectrum in the network is motivating operators to combine long-term evolution (LTE) with intelligent spectrum management technology. This new LTE intelligent spectrum management in the unlicensed band (LTE-U) has the physical layer topology to access spectrum, specifically the 5 GHz band. We propose a new intelligent cloud RAN for 5G.
Keywords: cloud radio access network, wireless network, cloud computing, multi-agent
Procedia PDF Downloads 424
5459 Toward a Measure of Appropriateness of User Interfaces Adaptations Solutions
Authors: Abderrahim Siam, Ramdane Maamri, Zaidi Sahnoun
Abstract:
The development of adaptive user interfaces (UIs) has long been an important research area in which researchers attempt to call upon the full resources and skills of several disciplines. The adaptive UI community holds thorough knowledge regarding the adaptation of UIs to users and to contexts of use. Several solutions, models, formalisms, techniques, and mechanisms have been proposed to develop adaptive UIs. In this paper, we propose an approach based on fuzzy set theory for modeling the appropriateness of different UI adaptation solutions to the different situations to which interactive systems have to adapt their UIs.
Keywords: adaptive user interfaces, adaptation solution's appropriateness, fuzzy sets
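A minimal sketch of the fuzzy-set idea: the appropriateness of an adaptation solution to a situation expressed as a membership degree in [0, 1], combined across criteria with the fuzzy AND (minimum). The criteria, ranges, and values below are invented for illustration, not the paper's model:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def appropriateness(screen_width_px, ambient_lux):
    """Degree to which a hypothetical 'compact high-contrast layout' adaptation
    suits the current situation: fuzzy AND (min) over per-criterion degrees."""
    fits_small_screen = triangular(screen_width_px, 240, 360, 600)
    readable_in_light = triangular(ambient_lux, 0, 300, 1000)
    return min(fits_small_screen, readable_in_light)

print(appropriateness(360, 300))   # 1.0 -> fully appropriate situation
print(appropriateness(480, 650))   # 0.5 -> partially appropriate
```

Ranking candidate adaptation solutions by such degrees is one simple way an interactive system could pick the most appropriate UI for the current situation.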
Procedia PDF Downloads 488
5458 Stakeholder Perceptions of Wildlife Tourism in Communal Conservancies within the Mudumu North Complex, Zambezi Region, Namibia
Authors: Shimhanda M. N., Mogomotsi P. K., Thakadu O. T., Rutina L. P.
Abstract:
Wildlife tourism (WT) in communal conservancies has the potential to contribute significantly to sustainable rural development. However, understanding local perceptions, promoting participation, and addressing stakeholder concerns are all required for sustainability. This study looks at stakeholder perceptions of WT in conservancies near protected areas in Namibia's Zambezi region, specifically the Mudumu North Complex. A mixed-methods approach was employed to collect data from 356 households using stratified sampling. Qualitative data was gathered through six focus group discussions and 22 key informant interviews. Quantitative analysis, using descriptive statistics and Spearman correlation, investigated socio-demographic influences on WT perceptions, while qualitative data were subjected to thematic analysis to identify key themes. Results revealed high awareness and generally positive perceptions of WT, particularly in Mashi Conservancy, which benefits from diverse tourism activities and joint ventures with lodges. Kwandu and Kyaramacan, which rely heavily on consumptive tourism, had lower awareness and perceived benefits. Human-wildlife conflict emerged as a persistent issue, especially in Kwandu and Mashi, where crop damage and wildlife interference undermined community support for WT. Younger, more educated, and employed individuals held more positive attitudes towards WT. The study highlights the importance of recognising community heterogeneity and tailoring WT strategies to meet diverse needs, including HWC mitigation. Policy implications include increasing community engagement, ensuring equitable benefit distribution, and implementing inclusive tourism strategies that promote long-term sustainability. 
These findings are critical for developing long-term WT models that address local challenges, encourage community participation, and contribute to socioeconomic development and conservation goals.
Keywords: sustainable tourism, stakeholder perceptions, community involvement, socio-economic development
Procedia PDF Downloads 17
5457 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series
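For reference, the Historical Simulation baseline mentioned above reduces to empirical quantiles of the loss distribution; whether the scenarios come from history or from a generator like CGAN, VaR and ES are computed the same way. A hedged sketch using a placeholder P&L series, not the paper's data:

```python
import numpy as np

def hs_var_es(returns, alpha=0.99):
    """Historical-simulation VaR and Expected Shortfall at level alpha,
    reported as positive loss numbers."""
    losses = -np.asarray(returns)          # convert returns to losses
    var = np.quantile(losses, alpha)       # empirical VaR
    es = losses[losses >= var].mean()      # average loss in the tail beyond VaR
    return var, es

rng = np.random.default_rng(7)
rets = rng.normal(0.0, 0.01, 5000)         # placeholder daily P&L series
var99, es99 = hs_var_es(rets)
print(var99, es99)                         # ES is always at least as large as VaR
```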
Procedia PDF Downloads 143
5456 Optimal Sequential Scheduling of Imperfect Maintenance Last Policy for a System Subject to Shocks
Authors: Yen-Luan Chen
Abstract:
Maintenance has a great impact on production capacity and on product quality, and therefore it deserves continuous improvement. A maintenance procedure performed before a failure is called preventive maintenance (PM). Sequential PM, which specifies that a system should be maintained at a sequence of intervals with unequal lengths, is one of the commonly used PM policies. This article proposes a generalized sequential PM policy for a system subject to shocks, with imperfect maintenance and random working times. The shocks arrive according to a non-homogeneous Poisson process (NHPP) with a varied intensity function in each maintenance interval. When a shock occurs, the system suffers one of two types of failure with number-dependent probabilities: a type-I (minor) failure, which is rectified by a minimal repair, or a type-II (catastrophic) failure, which is removed by corrective maintenance (CM). Imperfect maintenance is carried out to improve the system failure characteristic due to the altered shock process. The sequential preventive maintenance-last (PML) policy is defined such that the system is maintained before any CM occurs, either at a planned time Ti or at the completion of a working time in the i-th maintenance interval, whichever occurs last. At the N-th maintenance, the system is replaced rather than maintained. This article is the first to take up the sequential PML policy with random working times and imperfect maintenance in reliability engineering. The optimal preventive maintenance schedule that minimizes the mean cost rate of a replacement cycle is derived analytically and characterized in terms of its existence and uniqueness. The proposed models provide a general framework for analyzing maintenance policies in reliability theory.
Keywords: optimization, preventive maintenance, random working time, minimal repair, replacement, reliability
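NHPP shock arrivals with a time-varying intensity, as assumed in the model above, can be simulated by Lewis-Shedler thinning. A minimal sketch with an invented intensity function, not the article's model parameters:

```python
import numpy as np

def nhpp_thinning(intensity, lam_max, horizon, rng):
    """Simulate arrival times of a non-homogeneous Poisson process on
    [0, horizon] by thinning candidates from a bounding rate lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)        # candidate inter-arrival
        if t > horizon:
            return np.array(times)
        if rng.random() < intensity(t) / lam_max:  # accept with ratio lambda(t)/lam_max
            times.append(t)

# Hypothetical increasing shock intensity within one maintenance interval
lam = lambda t: 0.5 + 0.3 * t                      # shocks per unit time
rng = np.random.default_rng(1)
shocks = nhpp_thinning(lam, lam_max=0.5 + 0.3 * 10.0, horizon=10.0, rng=rng)
print(len(shocks))   # expected count is the integral of lambda: 5 + 15 = 20
```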
Procedia PDF Downloads 275
5455 Impact of the Hayne Royal Commission on the Operating Model of Australian Financial Advice Firms
Authors: Mohammad Abu-Taleb
Abstract:
The final report of the Royal Commission into Australian financial services misconduct, released in February 2019, has had a significant impact on the financial advice industry. The recommendations in the Commissioner's final report include changes to ongoing fee arrangements, a new disciplinary system for financial advisers, and mandatory reporting of compliance concerns. This thesis aims to explore the impact of the Royal Commission's recommendations on the operating model of financial advice firms in terms of advice products, processes, delivery models, and customer segments. This research also seeks to investigate whether the Royal Commission's outcome has accelerated the use of enhanced technology solutions within the operating model of financial advice firms, and to identify the key challenges confronting financial advice firms whilst implementing the Commissioner's recommendations across their operating models. In order to achieve the objectives of this thesis, a qualitative research design was adopted, with semi-structured in-depth interviews conducted with 24 financial advisers and managers who are engaged in the operation of financial advice services. The study used a thematic analysis approach to interpret the qualitative data collected from the interviews. The findings of this thesis reveal that customer-centric operating models will become more prominent across the financial advice industry in response to the Commissioner's final report, and that the Royal Commission's outcome has accelerated the use of advice technology solutions within the operating model of financial advice firms. In addition, financial advice firms have increasingly begun using simpler and more automated web-based advice services, which enable financial advisers to provide simple advice at greater scale and to accelerate the use of robo-advice models and digital delivery to mass customers in the long term. Furthermore, the study identifies process and technology changes, along with technical and interpersonal skills development, as the key challenges encountered by financial advice firms whilst implementing the Commissioner's recommendations across their operating models.
Keywords: hayne royal commission, financial planning advice, operating model, advice products, advice processes, delivery models, customer segments, digital advice solutions
Procedia PDF Downloads 88
5454 Filtering Momentum Life Cycles, Price Acceleration Signals and Trend Reversals for Stocks, Credit Derivatives and Bonds
Authors: Periklis Brakatsoulas
Abstract:
Recent empirical research shows a growing interest in investment decision-making under market anomalies that contradict the rational paradigm. Momentum is undoubtedly one of the most robust anomalies in empirical asset pricing research and has remained surprisingly lucrative ever since it was first documented. Although predominantly identified across equities, momentum premia are now evident across various asset classes. Yet few attempts have been made so far to provide traders with a diversified portfolio of strategies across different assets and markets. Moreover, the literature focuses on patterns from past returns rather than on mechanisms that signal future price directions prior to momentum runs. The aim of this paper is to develop a diversified portfolio approach to price distortion signals using daily position data on stocks, credit derivatives, and bonds. An algorithm allocates assets periodically, and new investment tactics take over upon price momentum signals and across different ranking groups. We focus on momentum life cycles, trend reversals, and price acceleration signals. The main effort here concentrates on the density, time span, and maturity of momentum phenomena in order to identify consistent patterns over time and measure the predictive power of the buy-sell signals generated by these anomalies. To tackle this, we propose a two-stage modelling process. First, we generate forecasts of the core macroeconomic drivers. Second, satellite models generate market risk forecasts using the core driver projections generated in the first stage as input. Moreover, using a combination of the ARFIMA and FIGARCH models, we examine the dependence of consecutive observations across time and portfolio assets, since long memory behavior in the volatilities of one market appears to trigger persistent volatility patterns across other markets. We believe that this is the first work that employs evidence of volatility transmissions among derivatives, equities, and bonds to identify momentum life cycle patterns.
Keywords: forecasting, long memory, momentum, returns
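As background to the momentum signals discussed above, a minimal cross-sectional ranking sketch: rank assets on trailing cumulative return excluding the most recent month (the classic 12-1 formation window), then hold the top tercile long and the bottom tercile short. Returns are random placeholders, not the paper's position data:

```python
import numpy as np

rng = np.random.default_rng(3)

# 12 hypothetical assets, 13 months of placeholder monthly returns
n_assets, n_months = 12, 13
monthly = rng.normal(0.005, 0.04, (n_assets, n_months))

# formation return: compound the trailing 12 months, skipping the latest month
formation = np.prod(1.0 + monthly[:, :-1], axis=1) - 1.0
order = np.argsort(formation)                   # ascending by formation return
losers, winners = order[:4], order[-4:]         # bottom and top terciles

signal = np.zeros(n_assets)
signal[winners], signal[losers] = 1.0, -1.0     # dollar-neutral long/short legs
print(signal)
```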
Procedia PDF Downloads 102
5453 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the majority costs for a data project. The data project with data quality awareness almost as much time to data quality processes while data project without data quality awareness negatively impacts financial resources, efficiency, productivity, and credibility. One of the processes that take a long time is defining the expectations and measurements of data quality because the expectation is different up to the purpose of each data project. Especially, big data project that maybe involves with many datasets and stakeholders, that take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators to describe overall data quality for each dataset to quick comparison and priority. The objectives of this study were to: (1) Develop a practical data quality indicators and measurements, (2) Develop data quality dimensions based on statistical characteristics and (3) Develop Composite Indicator that can describe overall data quality for each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After datasets were collected, there are five steps to develop the Dataset Quality Index (SDQI). First, we define standard data quality expectations. Second, we find any indicators that can measure directly to data within datasets. Thirdly, each indicator aggregates to dimension using factor analysis. Next, the indicators and dimensions were weighted by an effort for data preparing process and usability. Finally, the dimensions aggregate to Composite Indicator. The results of these analyses showed that: (1) The developed useful indicators and measurements contained ten indicators. (2) the developed data quality dimension based on statistical characteristics, we found that ten indicators can be reduced to 4 dimensions. 
(3) The developed composite indicator, the SDQI, can describe the overall quality of each dataset and can separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. The SDQI can be used to assess all data in a data project, estimate effort, and set priorities. It also works well with agile methods, using the SDQI for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in the next sprint.Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
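The aggregation pipeline the abstract describes (normalize indicators, weight, aggregate into one index, classify into three levels) can be sketched as follows. This is a minimal illustrative sketch: the dimension names, weights, and level thresholds are hypothetical and are not the values used in the study.

```python
# Hypothetical sketch of SDQI-style composite-indicator aggregation.
# Dimension names, weights, and thresholds are illustrative assumptions.

def min_max_normalize(values):
    """Scale raw indicator scores to the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(dimension_scores, weights):
    """Weighted aggregation of normalized dimension scores into one index."""
    total_w = sum(weights.values())
    return sum(dimension_scores[k] * w for k, w in weights.items()) / total_w

def quality_level(sdqi):
    """Map the composite score onto three levels (thresholds assumed)."""
    if sdqi >= 0.8:
        return "Good Quality"
    if sdqi >= 0.5:
        return "Acceptable Quality"
    return "Poor Quality"

# Hypothetical per-dataset scores for four dimensions (already normalized)
scores = {"completeness": 0.9, "consistency": 0.8, "accuracy": 0.85, "timeliness": 0.7}
weights = {"completeness": 0.4, "consistency": 0.2, "accuracy": 0.3, "timeliness": 0.1}

sdqi = composite_index(scores, weights)
level = quality_level(sdqi)
```

In practice the weights would come from the factor loadings and the preparation-effort weighting the authors describe, rather than being fixed by hand.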
Procedia PDF Downloads 139
5452 Efficiency and Scale Elasticity in Network Data Envelopment Analysis: An Application to International Tourist Hotels in Taiwan
Authors: Li-Hsueh Chen
Abstract:
Efficient operation is increasingly important for hotel managers. Unlike the manufacturing industry, hotels cannot store their products. In addition, many hotels provide room service and food and beverage service simultaneously. When the efficiencies of hotels are evaluated, their internal structure should be considered. Hence, based on the operational characteristics of hotels, this study proposes a DEA model to simultaneously assess the efficiencies of the room production division, food and beverage production division, room service division, and food and beverage service division. However, not only the enhancement of efficiency but also the adjustment of scale can improve performance. In terms of scale adjustment, scale elasticity or returns to scale can help managers make decisions concerning expansion or contraction. In order to construct a reasonable approach to measuring the efficiencies and scale elasticities of hotels, this study builds an alternative variable-returns-to-scale-based two-stage network DEA model combining parallel and series structures to explore the scale elasticities of the whole system and of the room production, food and beverage production, room service, and food and beverage service divisions, based on data from the international tourist hotel industry in Taiwan. The results may provide valuable information on operational performance and scale for managers and decision makers.Keywords: efficiency, scale elasticity, network data envelopment analysis, international tourist hotel
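The core DEA idea of efficiency relative to a best-practice frontier can be sketched in its simplest form, the single-input, single-output, constant-returns case, where each unit's score is its output/input ratio divided by the best ratio observed. This is only an illustration of the concept; the paper's actual model is a multi-division network DEA solved by linear programming, and the hotel data below are hypothetical.

```python
# Minimal single-input, single-output DEA-style efficiency sketch.
# Real network DEA requires a linear program per decision-making unit;
# the data here are hypothetical.

def ccr_efficiency(inputs, outputs):
    """Constant-returns efficiency scores in [0, 1], relative to the frontier."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)  # the best-practice (frontier) ratio
    return [r / best for r in ratios]

# Hypothetical data: labor hours (input) and rooms sold (output) for 4 hotels
labor = [100.0, 80.0, 120.0, 90.0]
rooms = [200.0, 200.0, 180.0, 135.0]

efficiency_scores = ccr_efficiency(labor, rooms)
```

A fully efficient hotel scores 1.0; lower scores indicate how far a hotel sits from the frontier at its current scale.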
Procedia PDF Downloads 225
5451 Consumer Choice Determinants in Context of Functional Food
Authors: E. Grochowska-Niedworok, K. Brukało, M. Kardas
Abstract:
The aim of this study was to analyze and evaluate the consumption of functional food by consumers according to age, sex, formal education level, place of residence, and diagnosed diseases. The study employed an ad hoc questionnaire in a group of 300 inhabitants of the Upper Silesia voivodship. Knowledge of functional food among the group covered in the study was far from satisfactory. The choice of functional food was intuitive in character. In addition, the group covered was more likely to choose pharmacotherapy over diet-related prevention, which can be associated with a presumption of overly distant effects and a long period of treatment.Keywords: consumer choice, functional food, healthy lifestyle, consumer knowledge
Procedia PDF Downloads 256
5450 Evaluating a Holistic Fitness Program Used by High Performance Athletes and Mass Participants
Authors: Peter Smolianov, Jed Smith, Lisa Chen, Steven Dion, Christopher Schoen, Jaclyn Norberg
Abstract:
This study evaluated the effectiveness of an experimental training program used to improve the performance and health of competitive athletes and recreational sport participants. This holistic program integrated and advanced Eastern and Western methods of prolonging elite sports participation and enjoying lifelong fitness, particularly from China, India, Russia, and the United States. The program included outdoor, gym, and water training approaches focused on strengthening while stretching/decompressing and on full-body activation, all in order to improve performance as well as treat and prevent common disorders and pains. The study observed and surveyed over 100 users of the program, including recreational fitness and sports enthusiasts as well as elite athletes who competed for national teams of different countries and for Division I teams of the National Collegiate Athletic Association in the United States. Different types of sport were studied, including territorial games (e.g., American football, basketball, volleyball), endurance/cyclical sports (athletics/track and field, swimming), and artistic sports (e.g., gymnastics and synchronized swimming). Results of the study showed positive effects on the participants’ performance and health, particularly for those who used the program for more than two years, and especially in reducing spinal disorders and in enabling participants to perform new training tasks that previously caused back pain.Keywords: lifelong fitness, injury prevention, prolonging sport participation, improving performance and health
Procedia PDF Downloads 155
5449 SCM Challenges and Opportunities in the Timber Construction Sector
Authors: K. Reitner, F. Staberhofer, W. Ortner, M. Gerschberger
Abstract:
The purpose of this paper is to identify the main challenges faced by companies in the timber construction sector and to provide improvement opportunities that can be implemented on a short-, medium-, and long-term basis. To identify the challenges and propose actions for each company, a literature review and multiple-case research were conducted using the Quick Scan Audit Methodology. Finally, the findings and outcomes are compared with each other to support companies in the timber construction sector when implementing and restructuring their day-to-day activities.Keywords: supply chain management, supply chain challenges and opportunities, timber construction sector
Procedia PDF Downloads 249
5448 Contingent Presences in Architecture: Vitruvian Theory as a Beginning
Authors: Zelal Çınar
Abstract:
This paper claims that architecture is a contingent discipline, despite the fact that its contingency has long been denied through a retreat to Vitruvian writing. It is evident that contingency is rejected not only by architecture but also by modernity as a whole. Vitruvius attempted to cover the entire field of architecture in a systematic form in order to bring the whole body of this great discipline to a complete order. His theory has endured not only because it is the only major work on the architecture of Classical Antiquity to have survived, but also because of its conformity with the project of modernity. In the scope of the paper, it will be argued that contingency should be taken into account rather than avoided as a potential threat.Keywords: architecture, contingency, modernity, Vitruvius
Procedia PDF Downloads 295
5447 Processing Design of Miniature Casting Incorporating Stereolithography Technologies
Authors: Pei-Hsing Huang, Wei-Ju Huang
Abstract:
Investment casting is commonly used in the production of metallic components with complex shapes, due to its high dimensional precision, good surface finish, and low cost. However, the process is cumbersome, and the period between trial casting and final production can be very long, thereby limiting business opportunities and competitiveness. In this study, we replaced conventional wax injection with stereolithography (SLA) 3D printing to speed up the trial process and reduce costs. We also used silicone molds to avoid the high cost of photosensitive resin and reduce expenses further.Keywords: investment casting, stereolithography, wax molding, 3D printing
Procedia PDF Downloads 404
5446 The System Dynamics Research of China-Africa Trade, Investment and Economic Growth
Authors: Emma Serwaa Obobisaa, Haibo Chen
Abstract:
International trade and outward foreign direct investment (FDI) are widely recognized as important factors in economic growth and development. Though several scholars have sought to reveal the influence of trade and outward FDI on economic growth, most studies utilized common econometric models such as vector autoregression and aggregated the variables, which has largely produced contradictory and mixed results. Thus, there is an exigent need for a precise study of the effects of trade and FDI on economic growth that applies strong econometric models and disaggregates the variables into their separate components to explicate their respective effects on economic growth. This will guarantee the provision of policies and strategies that are geared towards individual variables to ensure sustainable development and growth. This study, therefore, seeks to examine the causal effect of China-Africa trade and outward foreign direct investment on the economic growth of Africa using a robust and recent econometric approach, the system dynamics model. Our study examines and tests an ensemble of vital variables predominant in recent studies on trade-FDI-economic growth causality: foreign direct investment, international trade, and economic growth. Our results showed that the system dynamics method provides more accurate statistical inference regarding the direction of causality among the variables than conventional methods such as OLS and Granger causality predominantly used in the literature, as it is more robust and provides accurate critical values.Keywords: economic growth, outward foreign direct investment, system dynamics model, international trade
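The stock-flow logic at the heart of a system dynamics model can be sketched as a small Euler-integrated simulation in which GDP, trade, and FDI are stocks that feed back on each other's flows. The linear couplings and initial values below are made-up illustrations of the mechanism, not coefficients estimated in the study.

```python
# Toy system dynamics sketch: three stocks (GDP, trade, FDI) with
# hypothetical linear flow couplings, integrated by the Euler method.

def simulate(years, dt=1.0):
    gdp, trade, fdi = 100.0, 20.0, 5.0  # initial stock levels (arbitrary units)
    gdp_path = []
    for _ in range(int(years / dt)):
        # flows (hypothetical couplings: trade and FDI both raise GDP growth,
        # and GDP in turn expands trade and attracts FDI)
        gdp_flow = 0.01 * gdp + 0.05 * trade + 0.08 * fdi
        trade_flow = 0.005 * gdp
        fdi_flow = 0.002 * gdp
        # Euler integration of the stocks over one time step
        gdp += gdp_flow * dt
        trade += trade_flow * dt
        fdi += fdi_flow * dt
        gdp_path.append(gdp)
    return gdp_path

path = simulate(10)
```

A real system dynamics study would calibrate these flow equations to observed data and examine how the simulated causal structure matches the historical series, which is what distinguishes the approach from single-equation econometrics.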
Procedia PDF Downloads 108
5445 qPCR Method for Detection of Halal Food Adulteration
Authors: Gabriela Borilova, Monika Petrakova, Petr Kralik
Abstract:
Nowadays, European producers are increasingly interested in the production of halal meat products. Halal meat has increasingly appeared in the EU's market network, and meat products from European producers are being exported to Islamic countries. Halal criteria are mainly related to the origin of the muscle used in production, and also to the way products are obtained and processed. Although the EU has legislatively addressed the question of food authenticity, the circumstances of previous years, when products with undeclared horse or poultry meat content appeared on EU markets, raised the question of the effectiveness of control mechanisms. Replacement of expensive or unavailable types of meat with low-priced meat has occurred on a global scale for a long time. Likewise, halal products may be contaminated (falsified) with pork or food components obtained from pigs. These components include collagen, offal, pork fat, mechanically separated pork, emulsifiers, blood, dried blood, dried blood plasma, gelatin, and others. These substances can influence the sensory properties of meat products - color, aroma, flavor, consistency, and texture - or they are added for preservation and stabilization. Food manufacturers sometimes resort to these substances mainly because of their wide availability and low prices. However, the use of these substances is not always declared on the product packaging. Verification of the presence of declared ingredients, including the detection of undeclared ingredients, is among the basic control procedures for determining the authenticity of food. Molecular biology methods, based on DNA analysis, offer rapid and sensitive testing. The PCR method and its modifications can be successfully used to identify animal species in single- and multi-ingredient raw and processed foods, and qPCR is the first choice for food analysis. Like all PCR-based methods, it is simple to implement, and its greatest advantage is the absence of post-PCR visualization by electrophoresis.
qPCR allows detection of trace amounts of nucleic acids, and by comparing an unknown sample with a calibration curve, it can also provide information on the absolute quantity of individual components in the sample. Our study addresses the fact that most molecular biological work on the identification and quantification of animal species is based on the construction of specific primers amplifying a selected section of the mitochondrial genome. In addition, the sections amplified in conventional PCR are relatively long (hundreds of bp) and unsuitable for use in qPCR, because amplification of long target sequences is quite limited in fragmented DNA. Our study focuses on finding a suitable genomic DNA target and optimizing qPCR to reduce the variability and distortion of results, which is necessary for the correct interpretation of quantification results. In halal products, the impact of falsifying meat products by adding components derived from pigs is all the greater because it concerns not only the economic aspect but above all the religious and social aspects. This work was supported by the Ministry of Agriculture of the Czech Republic (QJ1530107).Keywords: food fraud, halal food, pork, qPCR
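The calibration-curve quantification described above can be sketched numerically: Cq values from a known dilution series are fit to the standard relation Cq = slope · log₁₀(quantity) + intercept, and an unknown sample's starting quantity is then read off the fitted line (a slope near -3.32 corresponds to ~100% amplification efficiency). The dilution series and Cq values below are synthetic; this is an illustration of the standard-curve principle, not the assay developed in the study.

```python
# Sketch of absolute quantification from a qPCR standard curve.
# The Cq data are synthetic (perfect 10-fold dilution series).
import math

def fit_standard_curve(quantities, cq_values):
    """Ordinary least-squares fit of Cq against log10(starting quantity)."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(cq_values) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, cq_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0  # 1.0 means 100% efficiency
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Invert the standard curve to estimate the starting quantity."""
    return 10 ** ((cq - intercept) / slope)

# Synthetic 10-fold dilution series with ideal slope -3.32
dilutions = [1e6, 1e5, 1e4, 1e3]
cqs = [38.0 + (-3.32) * math.log10(q) for q in dilutions]

slope, intercept, eff = fit_standard_curve(dilutions, cqs)
copies = quantify(24.72, slope, intercept)  # unknown sample's Cq
```

With real fragmented-DNA samples, the fit quality and the efficiency value are what flag whether a short genomic target behaves well enough for reliable quantification.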
Procedia PDF Downloads 247
5444 Bubble Scrum: How to Run in Organizations That Only Know How to Walk
Authors: Zaheer A. Ali, George Szabo
Abstract:
SCRUM has roots in software and web development and works very well in that space. However, any technical person who has watched a typical waterfall-managed project spiral out of control or into an abyss has thought: "there must be a better way". I will discuss how that thought leads naturally to adopting Agile principles and SCRUM, as well as how Agile and SCRUM can be implemented in large institutions with long histories via a method I developed: Bubble Scrum. We will also see how SCRUM can be implemented in interesting places outside of the technical sphere and discuss where and how to subtly bring Agility and SCRUM into large, rigid institutions.Keywords: agile, enterprise-agile, agile at scale, agile transition, project management, scrum
Procedia PDF Downloads 162
5443 Production of New Hadron States in Effective Field Theory
Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li
Abstract:
In the past decade, a growing number of new hadron states have been observed, dubbed XYZ states in the heavy quarkonium mass regions. In this work, we present our study on the production of some of these new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb(10610)/Zb(10650)π, Bc→ Zc(3900)/Zc(4020)π and Λb→ Pc(4312)/Pc(4440)/Pc(4457)K. (1) For the production of Zb(10610)/Zb(10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions with all intermediate states being S-wave ground-state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to Υ(5S)→ Zb⁽'⁾π. (2) For the production of Zc(3900)/Zc(4020) from Bc decay, the branching ratios of Bc⁺→ Zc(3900)⁺π⁰ and Bc⁺→ Zc(4020)⁺π⁰ are estimated to be of the order of 10⁻⁴ and 10⁻⁷, respectively, in an effective Lagrangian approach. The large production rate of Zc(3900) could provide an important source of Zc(3900) production from the semi-exclusive decay of b-flavored hadrons reported by the D0 Collaboration, which can be tested by exclusive measurements at LHCb. (3) For the production of Pc(4312), Pc(4440) and Pc(4457) from Λb decay, the ratios of the branching fractions of Λb→ Pc K were predicted in a molecular scenario using an effective Lagrangian approach, and are weakly dependent on our model parameter. We also find that the ratios of the products of the branching fractions of Λb→ Pc K and Pc→ J/ψ p can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of the order of 10⁻⁶, which could be tested by further measurements by the LHCb Collaboration.Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states
Procedia PDF Downloads 132
5442 Continuance Commitment of Retail Pharmacist in a Labor Shortage: Results from the Questionnaire Survey
Authors: Shigeaki Mishima
Abstract:
Pharmacist labor shortage has become a long-term problem in Japan. This paper discusses the relationship between organizational commitment and pharmacists' organizational behavior in the context of a labor shortage. Based on a multidimensional view of organizational commitment, affective commitment and continuance commitment are measured. It is suggested that continuance commitment has a unique impact on information-withholding behavior. We also discuss the impact of labor supply and demand on the continuance commitment of retail pharmacists.Keywords: organizational commitment, pharmacist, labor shortage, professional
Procedia PDF Downloads 409
5441 Applying Kinect on the Development of a Customized 3D Mannequin
Authors: Shih-Wen Hsiao, Rong-Qi Chen
Abstract:
In the field of fashion design, the 3D mannequin is an assisting tool that can rapidly realize design concepts. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. Thus, it is critical to develop a 3D mannequin module that corresponds to the necessities of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. Ergonomic measurements of objective human features can be attained in real time through the Kinect depth camera, and mesh morphing can then be implemented by transforming the locations of the control points on the model using those ergonomic data, yielding an exclusive 3D mannequin model. In the proposed methodology, after the scanned points from the Kinect are revised for accuracy and smoothed, a complete human figure is reconstructed by the iterative closest point (ICP) algorithm together with image processing methods. The objective human features can then be recognized, analyzed, and measured. Furthermore, the ergonomic measurements can be applied to shape morphing for the divisions of the 3D mannequin reconstructed from feature curves. Because a standardized and customer-oriented 3D mannequin can be generated through subdivision, the research can be applied to fashion design or to the presentation and display of 3D virtual clothes. In order to examine the practicality of the research structure, a 3D mannequin system was constructed with a JAVA program in this study, and its practicability was verified through experiments.Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision
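The rigid-alignment step inside the iterative closest point (ICP) algorithm mentioned above can be sketched in 2D with known correspondences: the least-squares rotation and translation mapping one point set onto another have a closed form (a full ICP would re-estimate nearest-neighbour correspondences and repeat this step until convergence, typically in 3D with an SVD). The point set below is hypothetical.

```python
# One ICP alignment step in 2D with known correspondences:
# closed-form least-squares rotation angle and translation.
import math

def best_fit_transform(src, dst):
    """Return (theta, tx, ty) mapping src onto dst in the least-squares sense."""
    n = len(src)
    cx_s = sum(p[0] for p in src) / n
    cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n
    cy_d = sum(p[1] for p in dst) / n
    # cross- and dot-products of centred point pairs give the rotation angle
    s_cross = sum((p[0] - cx_s) * (q[1] - cy_d) - (p[1] - cy_s) * (q[0] - cx_d)
                  for p, q in zip(src, dst))
    s_dot = sum((p[0] - cx_s) * (q[0] - cx_d) + (p[1] - cy_s) * (q[1] - cy_d)
                for p, q in zip(src, dst))
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    # translation aligns the rotated source centroid with the target centroid
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)
    return theta, tx, ty

# Synthetic check: rotate a scanned patch by 0.5 rad, shift it, then recover
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
c, s = math.cos(0.5), math.sin(0.5)
dst = [(c * x - s * y + 1.0, s * x + c * y + 2.0) for x, y in src]

theta, tx, ty = best_fit_transform(src, dst)
```

In the scanning pipeline the same idea aligns successive depth-camera point clouds so the partial scans merge into one complete human figure.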
Procedia PDF Downloads 306