Search results for: decision distance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5904

4164 Factors Affecting the Adoption of Cloud Business Intelligence among Healthcare Sector: A Case Study of Saudi Arabia

Authors: Raed Alsufyani, Hissam Tawfik, Victor Chang, Muthu Ramachandran

Abstract:

This study investigates the factors that influence the decision by players in the healthcare sector to embrace Cloud Business Intelligence technology, with a focus on healthcare organizations in Saudi Arabia. To bring this matter into perspective, the study primarily draws on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology (HOT) fit model. A survey was designed on the basis of a literature review and carried out online. The quantitative data obtained were analysed using descriptive and one-way frequency statistics, followed by inferential and regression analysis. The data were analysed to establish the factors that influence the decision to adopt Cloud Business Intelligence technology in the healthcare sector; the implication of the identified factors was measured, and all assumptions were tested. 66.70% of participants in healthcare organizations supported the intention to adopt a cloud Business Intelligence system, and 99.4% of these participants considered security concerns and privacy risks to be the most significant factors in the adoption of a cloud Business Intelligence (CBI) system. Regression-based hypothesis testing indicated that usefulness, service quality, relative advantage, IT infrastructure preparedness, organization structure, vendor support, perceived technical competence, government support, and top management support positively and significantly influence the adoption of a CBI system. The paper presents the quantitative phase of an on-going project, which will build on the lessons learned from this study.

Keywords: cloud computing, business intelligence, HOT-fit model, TOE, healthcare and innovation adoption

Procedia PDF Downloads 170
4163 An Initial Assessment of the Potential Contribution of 'Community Empowerment' to Mitigating the Drivers of Deforestation and Forest Degradation in Giam Siak Kecil-Bukit Batu Biosphere Reserve

Authors: Arzyana Sunkar, Yanto Santosa, Siti Badriyah Rushayati

Abstract:

Indonesia has experienced annual forest fires that have rapidly destroyed and degraded its forests. Fires in the peat swamp forests of Riau Province have made the problem worse, as this is the ecosystem most prone to fire (and also the most difficult to extinguish). Despite various efforts to curb deforestation and forest degradation processes, severe forest fires still occur. To find an effective solution, the basic causes of the problems must be identified. It is therefore critical to have an in-depth understanding of the underlying causal factors that have contributed to deforestation and forest degradation as a whole, in order to reduce their rates. An assessment of the drivers of deforestation and forest degradation was carried out in order to design and implement measures that could slow these destructive processes. Research was conducted in the Giam Siak Kecil–Bukit Batu Biosphere Reserve (GSKBB BR), in the Riau Province of Sumatera, Indonesia. A biosphere reserve was selected as the study site because such reserves aim to reconcile conservation with sustainable development. A biosphere reserve should promote a range of local human activities, together with development values that are spatially and economically in line with the area's conservation values, through the use of a zoning system. Moreover, GSKBB BR is an area with vast peatlands and experiences forest fires annually. Various factors were analysed to assess the drivers of deforestation and forest degradation in GSKBB BR; data were collected from focus group discussions with stakeholders, key informant interviews with key stakeholders, field observation, and a literature review. Landsat satellite imagery was used to map forest-cover changes over various periods.
Analysis of Landsat images taken during the period 2010-2014 revealed that, within the non-protected area of the core zone, there was a trend towards decreasing peat swamp forest area, increasing land clearance, and increasing areas of community oil-palm and rubber plantations. Fire was used for land clearing, and most of the forest fires occurred in the most populous area (the transition area). The study found a relationship between the deforested/degraded areas and certain distance variables, i.e. distance from roads, villages, and the border between the core area and the buffer zone. The further the distance from the core area of the reserve, the higher the degree of deforestation and forest degradation. The findings suggest that agricultural expansion may be the direct cause of deforestation and forest degradation in the reserve, whereas socio-economic factors were the underlying drivers of forest-cover change; these factors consist of a combination of socio-cultural, infrastructural, technological, institutional (policy and governance), demographic (population pressure) and economic (market demand) considerations. These findings indicate that local factors/problems were the critical causes of deforestation and degradation in GSKBB BR. The research therefore concludes that reductions in deforestation and forest degradation in GSKBB BR could be achieved through ‘local actor’-tailored approaches such as community empowerment.

Keywords: Actor-led solution, community empowerment, drivers of deforestation and forest degradation, Giam Siak Kecil – Bukit Batu Biosphere Reserve

Procedia PDF Downloads 348
4162 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector

Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh

Abstract:

A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years, and it has become more important to understand customers’ needs in this highly competitive market, especially the needs of those looking to switch service providers. Churn prediction is therefore now a mandatory requirement for retaining these customers, and machine learning can be utilized to accomplish it. Churn prediction has become a very important topic for machine learning classification in the telecommunications industry, and understanding the factors behind customer churn and how customers behave is essential to building an effective churn prediction model. This paper aims to predict churn and identify the factors behind customers’ churn based on their past service-usage history. To this end, the study makes use of feature selection, normalization, and feature engineering, and then compares the performance of four machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1-score and ROC-AUC. Compared with existing models, the approach taken in this study produced better results: Gradient Boosting with feature selection performed best, achieving a 99% F1-score and 99% AUC, and all other experiments achieved good results as well.
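The two evaluation metrics used above can be computed without any ML framework. As a minimal sketch (not the authors' code), F1 follows from the confusion-matrix counts, and ROC-AUC from the Mann-Whitney rank formulation:

```python
def f1_score(y_true, y_pred):
    # F1 = harmonic mean of precision and recall, from TP/FP/FN counts.
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def roc_auc(y_true, scores):
    # Mann-Whitney U formulation: the probability that a random positive
    # is scored above a random negative (ties count half).
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For example, `f1_score([1, 1, 0, 0], [1, 0, 0, 0])` gives 2/3, since precision is 1.0 but recall only 0.5.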

Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score

Procedia PDF Downloads 134
4161 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory

Authors: Hiba El Assibi

Abstract:

This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized Minimax algorithm with Alpha-Beta Pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project will explore additional enhancements like symmetry checking and code optimizations to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus Artificial Intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights on performance and computational efficiency. We also discuss the scalability of our approach, considering different board sizes (numbers of pits and stones) and rules (different variations), and studying how these affect performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhance our understanding of human cognitive processes in game settings, and offer insights into creating balanced and engaging game experiences.
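The core Minimax-with-Alpha-Beta idea referred to above can be sketched generically; the toy game tree and evaluation below are placeholders for illustration, not the Kalah engine or its enhancements (symmetry checking, transposition tables):

```python
def alphabeta(node, depth, alpha, beta, maximizing, children, value):
    # Generic minimax with alpha-beta pruning over an abstract game tree.
    # `children(node)` returns legal successors; `value(node)` scores leaves.
    kids = children(node)
    if depth == 0 or not kids:
        return value(node)
    if maximizing:
        best = float("-inf")
        for child in kids:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False, children, value))
            alpha = max(alpha, best)
            if alpha >= beta:
                break  # beta cutoff: minimizer will never allow this branch
        return best
    else:
        best = float("inf")
        for child in kids:
            best = min(best, alphabeta(child, depth - 1, alpha, beta, True, children, value))
            beta = min(beta, best)
            if alpha >= beta:
                break  # alpha cutoff: maximizer already has a better option
        return best
```

On a two-level toy tree where the minimizer replies with min(3, 5) on one side and min(2, 9) on the other, the maximizer's root value is 3, and the 9-leaf is pruned.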

Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory

Procedia PDF Downloads 55
4160 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms

Authors: Bliss Singhal

Abstract:

Machine learning (ML) is a branch of Artificial Intelligence (AI) in which computers analyze data and find patterns in the data. This study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body, and it is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and underlines the importance of being aware of human error and other inefficiencies. ML is a good candidate to improve the correct identification of metastatic cancer, potentially saving thousands of lives, and can also improve the speed and efficiency of the process, thereby requiring fewer resources and less time. So far, deep learning methodology has been used in research to detect cancer. This study takes a novel approach, determining the potential of combining preprocessing algorithms with classification algorithms to detect metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and the genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy, 71.14%, was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
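The PCA-then-k-NN stages of such a pipeline can be sketched in a few lines of numpy; this is a minimal illustration of the two techniques, not the study's pipeline (which additionally uses a genetic algorithm for feature selection), and the toy data are invented:

```python
import numpy as np

def pca_transform(X, n_components):
    # Center the data, eigendecompose the covariance matrix,
    # and project onto the top principal components.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]  # keep the largest
    return Xc @ eigvecs[:, order]

def knn_predict(X_train, y_train, x, k=3):
    # Majority vote among the k nearest training points (Euclidean distance).
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    return np.bincount(y_train[nearest]).argmax()
```

Dimensionality reduction happens first, so the classifier votes in the compressed space rather than on the raw features.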

Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression

Procedia PDF Downloads 82
4159 Cyclostationary Analysis of Polytime Coded Signals for LPI Radars

Authors: Metuku Shyamsunder, Kakarla Subbarao, P. Prasanna

Abstract:

In radar, an electromagnetic waveform is sent, and an echo of the same signal is received by the receiver. From this received signal, by extracting various parameters such as round-trip delay and Doppler frequency, it is possible to find distance, speed, altitude, etc. However, as technology advances, intruders intercept the transmitted signal as it reaches them, extract its characteristics, and try to modify them. There is therefore a need to develop systems whose signals cannot be identified by non-cooperative intercept receivers; this is why LPI radars came into existence. In this paper, a brief discussion of LPI radar, its modulation with polytime codes (PT1), and its detection using cyclostationary techniques (DFSM and FAM) is presented, and the techniques are compared with respect to computational complexity.

Keywords: LPI radar, polytime codes, cyclostationary DFSM, FAM

Procedia PDF Downloads 476
4158 Quick off the Mark with Achilles Tendon Rupture

Authors: Emily Moore, Andrew Gaukroger, Matthew Solan, Lucy Bailey, Alexandra Boxall, Andrew Carne, Chintu Gadamsetty, Charlotte Morley, Katy Western, Iwona Kolodziejczyk

Abstract:

Introduction: Rupture of the Achilles tendon is common and has a long recovery period. Most cases are managed non-operatively. Foot and Ankle surgeons advise an ultrasound scan to check the gap between the torn ends; a large gap (with the ankle in equinus) is a relative indication for surgery. The definitive decision regarding surgical versus non-operative management can only be made once an ultrasound scan is undertaken and the patient is subsequently reviewed by a Foot and Ankle surgeon. To get to this point, the patient journey involves several hospital departments. In nearby trusts, patients reattend for a scan and go to the plaster room both before and after the ultrasound for removal and re-application of the cast. At a third visit to the hospital, the surgeon and patient discuss options for definitive treatment. It may take 2-3 weeks from the initial Emergency Department visit before the final treatment decision is made. This “wasted time” is ultimately added to the recovery period for the patient. In this hospital, Achilles rupture patients are seen in a weekly multidisciplinary One-Stop Heel Pain clinic. This pathway was already efficient but subject to occasional frustrating delays if a key staff member was absent. A new pathway was introduced with the goal of reducing delays to a definitive treatment plan. Method: A retrospective series of Achilles tendon ruptures managed according to the 2019 protocol was identified, and the times taken from the Emergency Department visit to both ultrasound scan and specialist Foot and Ankle surgical review were calculated. 30 consecutive patients were then treated with our new pathway and prospectively followed, and their times to scan and to specialist review were compared with those of 30 consecutive cases from the 2019 (pre-COVID) cohort. The new pathway includes: 1. A new contoured splint applied to the front of the injured limb, held with a bandage; this can be removed and replaced (unlike a plaster cast) in the ultrasound department, removing the need for plaster room visits. 2. Urgent triage to a Foot and Ankle specialist. 3. Ultrasound scan for assessment of the rupture gap and a deep vein thrombosis check. 4. An early decision regarding surgery, with transfer to weight bearing in a prosthetic boot in equinus without waiting for the once-a-week clinic. 5. Extended oral VTE prophylaxis. Results: The time taken for a patient to have both an ultrasound scan and specialist review fell by more than 50%. All patients in the new pathway reached a definitive treatment decision within one week. There were no significant differences in patient demographics or in rates of surgical versus non-operative treatment. The mean time from Emergency Department visit to specialist review and ultrasound scan fell from 8.7 days (old protocol) to 2.9 days (new pathway); the maximum time fell from 23 days (old protocol) to 6 days (new pathway). Conclusion: Teamwork and innovation have improved the experience for patients with an Achilles tendon rupture. The new pathway brings many advantages: reduced time in the Emergency Department, fewer hospital visits, less time using crutches, and reduced overall recovery time.

Keywords: orthopaedics, achilles rupture, ultrasound, innovation

Procedia PDF Downloads 123
4157 Mammographic Multi-View Cancer Identification Using Siamese Neural Networks

Authors: Alisher Ibragimov, Sofya Senotrusova, Aleksandra Beliaeva, Egor Ushakov, Yuri Markin

Abstract:

Mammography plays a critical role in screening for breast cancer in women, and artificial intelligence has enabled the automatic detection of diseases in medical images. Many of the current techniques for mammogram analysis focus on a single view (mediolateral or craniocaudal), while in clinical practice radiologists consider multiple views of mammograms from both breasts to make a correct decision. Consequently, computer-aided diagnosis (CAD) systems could benefit from incorporating information gathered from multiple views. In this study, we introduce a method based on a Siamese neural network (SNN) model that simultaneously analyzes three mammographic views: bilateral and ipsilateral. In this way, when a decision is made on a single image of one breast, attention is also paid to two other images: a view of the same breast in a different projection and an image of the other breast. The algorithm thus closely mimics the radiologist's practice of paying attention to the entire examination of a patient rather than to a single image. Additionally, to the best of our knowledge, this research represents the first experiments conducted on the recently released Vietnamese dataset of digital mammography (VinDr-Mammo). On an independent test set of images from this dataset, the best model achieved an AUC of 0.87 per image. This suggests that the method can offer a valuable automated second opinion in the interpretation of mammograms and breast cancer diagnosis, which in the future may help to alleviate the burden on radiologists and serve as an additional layer of verification.
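The defining property of a Siamese network, one set of weights shared across all inputs, can be sketched in a few lines. The linear map below is only a stand-in for a CNN backbone, and the scoring rule combining ipsilateral and bilateral distances is purely illustrative, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 16))  # shared embedding weights (stand-in for a CNN backbone)

def embed(view):
    # The SAME mapping is applied to every view -- the defining
    # weight-sharing property of a Siamese network.
    return np.tanh(view @ W)

def tri_view_score(cc, mlo, contra):
    # Illustrative tri-view score: agreement with the ipsilateral view
    # (same breast, other projection) and asymmetry versus the
    # contralateral breast both contribute to the final score.
    e_cc, e_mlo, e_con = embed(cc), embed(mlo), embed(contra)
    ipsi = np.linalg.norm(e_cc - e_mlo)
    bilat = np.linalg.norm(e_cc - e_con)
    return bilat - 0.5 * ipsi
```

Because the weights are shared, three identical views necessarily score zero; asymmetry between breasts raises the score.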

Keywords: breast cancer, computer-aided diagnosis, deep learning, multi-view mammogram, siamese neural network

Procedia PDF Downloads 138
4156 On the Application of Heuristics of the Traveling Salesman Problem for the Task of Restoring the DNA Matrix

Authors: Boris Melnikov, Dmitrii Chaikovskii, Elena Melnikova

Abstract:

The traveling salesman problem (TSP) is a well-known optimization problem that seeks the shortest possible route visiting a set of points and returning to the starting point. In this paper, we apply some heuristics of the TSP to the task of restoring the DNA matrix. This restoration problem, often considered in biocybernetics, requires recovering the matrix of distances between DNA sequences when not all elements of the matrix under consideration are known at the input. We consider the possibility of using this method in the testing of algorithms that calculate the distance between a pair of DNA sequences, in order to restore the partially filled matrix.
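As one representative TSP heuristic operating directly on a distance matrix, nearest neighbour can be sketched as follows (the authors' specific heuristics and the matrix-restoration step itself are not reproduced here):

```python
def nearest_neighbour_tour(dist, start=0):
    # Greedy TSP heuristic: from the current point, always move to the
    # closest unvisited point until every point has been visited.
    n = len(dist)
    tour, visited = [start], {start}
    while len(tour) < n:
        cur = tour[-1]
        nxt = min((j for j in range(n) if j not in visited),
                  key=lambda j: dist[cur][j])
        tour.append(nxt)
        visited.add(nxt)
    return tour

def tour_length(dist, tour):
    # Total length of the closed tour (returns to the starting point).
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))
```

On the 3-point matrix `[[0, 1, 4], [1, 0, 2], [4, 2, 0]]`, starting from point 0, the heuristic yields the tour `[0, 1, 2]` with closed length 7.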

Keywords: optimization problems, DNA matrix, partially filled matrix, traveling salesman problem, heuristic algorithms

Procedia PDF Downloads 150
4155 Downtime Modelling for the Post-Earthquake Building Assessment Phase

Authors: S. Khakurel, R. P. Dhakal, T. Z. Yeow

Abstract:

Downtime is one of the major sources (alongside damage and injury/death) of financial loss incurred by a structure in an earthquake. The length of downtime associated with a building after an earthquake varies depending on the time taken for the reaction (to the earthquake), decision (on the future course of action) and execution (of the decided course of action) phases. Post-earthquake assessment of buildings is a key step in the decision-making process, both to assign the appropriate safety placard and to decide whether a damaged building is to be repaired or demolished. The aim of the present study is to develop a model that quantifies the downtime associated with the post-earthquake building-assessment phase in terms of two parameters: i) the duration of the different assessment phases; and ii) the probability of the different tag colours. Post-earthquake assessment of buildings includes three stages: a Level 1 Rapid Assessment, comprising a fast external inspection shortly after the earthquake; a Level 2 Rapid Assessment, including a visit inside the building; and a Detailed Engineering Evaluation (if needed). In this study, the durations of all three assessment phases are first estimated from the total number of damaged buildings, the total number of available engineers, and the average time needed to assess each building. Then, the probability of the different tag colours is computed from the 2010-11 Canterbury Earthquake Sequence database. Finally, a downtime model for the post-earthquake building-inspection phase is proposed, based on the estimated phase lengths and the tag-colour probabilities. This model is expected to be used for rapid estimation of seismic downtime within the Loss Optimisation Seismic Design (LOSD) framework.
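The estimation steps described above, phase length from building count, engineer availability and per-building inspection time, then expectation over tag-colour outcomes, reduce to simple arithmetic. A hedged sketch follows; the function names and all numbers in the usage note are illustrative, not values from the study:

```python
def phase_duration_days(n_buildings, n_engineers, hours_per_building, hours_per_day=8.0):
    # Total inspection effort spread over the available engineering workforce.
    return n_buildings * hours_per_building / (n_engineers * hours_per_day)

def expected_downtime(phase_days, tag_probs, tag_delays):
    # Assessment-phase length plus each placard outcome's extra delay,
    # weighted by the probability of that tag colour.
    return phase_days + sum(tag_probs[t] * tag_delays[t] for t in tag_probs)
```

For example, 1000 damaged buildings, 25 engineers and one inspection hour per building give a 5-day phase; folding in hypothetical tag probabilities (0.7/0.2/0.1 for green/yellow/red) and follow-up delays (0/30/180 days) yields an expected downtime of 29 days.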

Keywords: assessment, downtime, LOSD, Loss Optimisation Seismic Design, phase length, tag color

Procedia PDF Downloads 185
4154 Shared Decision-Making in Holistic Healthcare: Integrating Evidence-Based Medicine and Values-Based Medicine

Authors: Ling-Lang Huang

Abstract:

Research Background: Historically, the evolution of medicine has not only aimed to extend life but has also inadvertently introduced suffering in the process of maintaining life, presenting a contemporary challenge. We must carefully assess the conflict between the length of life and the quality of living. Evidence-Based Medicine (EBM) exists primarily to ensure the quality of cures. However, EBM alone does not fulfill our ultimate medical goals; we must also evaluate Value-Based Medicine (VBM) to find the best treatment for patients. Research Methodology: We can attempt to integrate EBM with VBM. Within the five steps of EBM, the first three steps (Ask—Acquire—Appraise) focus on the physical aspect of humans. However, in the fourth and fifth steps (Apply—Assess), the focus shifts from the physical to applying evidence-based treatment to the patient and assessing its effectiveness, considering a holistic approach to the individual. To consider VBM for patients, we can divide the process into three steps: The first step is "awareness," recognizing that each patient inhabits a different life-world and possesses unique differences. The second step is "integration," akin to the hermeneutic concept of the Fusion of Horizons. This means being aware of differences and also understanding the origins of these patient differences. The third step is "respect," which involves setting aside our adherence to medical objectivity and scientific rigor to respect the ultimate healthcare decisions made by individuals regarding their lives. Discussion and Conclusion: After completing these three steps of VBM, we can return to the fifth step of EBM: Assess. Our assessment can now transcend the physical treatment focus of the initial steps to align with a holistic care philosophy.

Keywords: shared decision-making, evidence-based medicine, values-based medicine, holistic healthcare

Procedia PDF Downloads 52
4153 Comparison of Various Policies under Different Maintenance Strategies on a Multi-Component System

Authors: Demet Ozgur-Unluakin, Busenur Turkali, Ayse Karacaorenli

Abstract:

Maintenance strategies can be classified into two types, reactive and proactive, with respect to the timing of failure and maintenance. If the maintenance activity is carried out after a breakdown, it is called reactive maintenance. Proactive maintenance, on the other hand, which is further divided into preventive and predictive maintenance, focuses on maintaining components before a failure occurs in order to prevent expensive halts. Recently, the number of interacting components in systems has increased rapidly, and system structures have therefore become more complex, making it difficult to arrive at the right maintenance decisions; determining effective decisions has consequently taken on a significant role. In multi-component systems, many methodologies and strategies can be applied, either when a component or a system has already broken down or when it is desired to proactively identify and avoid defects that could lead to future failure. This study focuses on the comparison of various maintenance strategies on a multi-component dynamic system. The components in the system are hidden, although partial observability is available to the decision maker, and they deteriorate over time. Several predefined policies under corrective, preventive, and predictive maintenance strategies are considered with the objective of minimizing the total maintenance cost over a planning horizon. The policies are simulated via Dynamic Bayesian Networks on a multi-component system with different policy parameters and cost scenarios, and their performances are evaluated. Results show that when the difference between corrective and proactive maintenance costs is low, none of the proactive maintenance policies is significantly better than corrective maintenance. However, when the difference is increased, at least one policy parameter for each proactive maintenance strategy gives a significantly lower cost than corrective maintenance.
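The corrective-versus-preventive comparison can be illustrated with a hedged Monte Carlo sketch for a single component with Weibull wear-out; this is not the paper's Dynamic Bayesian Network model, and all parameters below are invented:

```python
import random

def simulate_cost(replace_age, horizon, scale, shape,
                  c_corrective, c_preventive, seed=42):
    # Age-based policy: replace the component preventively once it reaches
    # replace_age, or correctively when it fails first.
    # replace_age = float('inf') reduces to a purely corrective strategy.
    rng = random.Random(seed)
    t = cost = 0.0
    while t < horizon:
        life = rng.weibullvariate(scale, shape)  # shape > 1 models aging
        if life < replace_age:
            t += life
            cost += c_corrective   # unplanned breakdown repair
        else:
            t += replace_age
            cost += c_preventive   # cheaper planned replacement
    return cost
```

With an expensive breakdown (say 100) versus a cheap planned replacement (say 20), replacing at a fraction of the characteristic life typically beats the purely corrective policy, mirroring the paper's finding that proactive policies pay off only when the cost gap is large.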

Keywords: decision making, dynamic Bayesian networks, maintenance, multi-component systems, reliability

Procedia PDF Downloads 129
4152 In Search of a Safe Haven-Sexual Violence Leading to a Change of Sexual Orientation

Authors: Medagedara Kaushalya Sewwandi Supun Gunarathne

Abstract:

This research explores the underlying motivations and consequences of individuals changing their sexual orientation as a response to sexual violence. The primary objective of the study is to unravel the psychological, emotional, and social factors that drive individuals, akin to Celie in Alice Walker’s ‘The Color Purple’, to contemplate and undergo changes in their sexual orientation following the trauma of sexual violence. Through an analytical and qualitative approach, the study employs in-depth textual and thematic analyses to scrutinize the complex interplay between sexual orientation and violence within the selected text. Through a close examination of Celie’s journey and experiences, the study reveals that her decision to switch sexual orientation arises from a desire for a more favorable and benevolent relationship driven by the absence of safety and refuge in her previous relationships. By establishing this bond between sexual orientation and violence, the research underscores how sexual violence can lead individuals to opt for a change in their sexual orientation. The findings highlight Celie’s transformation as a means to seek solace and security, thus concluding that sexual violence can prompt individuals to alter their sexual orientation. The ensuing discussion explores the implications of these findings, encompassing psychological, emotional, and social consequences, as well as the societal and cultural factors influencing the perception of sexual orientation. Additionally, it sheds light on the challenges and stigma faced by those who undergo such transformations. By comprehending the complex relationship between sexual violence and the decision to change sexual orientation, as exemplified by Celie in ‘The Color Purple’, a deeper understanding of the experiences of survivors who seek a safe haven through altering their sexual orientation can be attained.

Keywords: sexual violence, sexual orientation, refuge, transition

Procedia PDF Downloads 79
4151 Development of an Instrument for Measurement of Thermal Conductivity and Thermal Diffusivity of Tropical Fruit Juice

Authors: T. Ewetumo, K. D. Adedayo, Festus Ben

Abstract:

Knowledge of the thermal properties of foods is of fundamental importance in the food industry for the design of processing equipment. However, for tropical fruit juice there is very little information in the literature, which seriously hampers processing procedures. This research work describes the development of an instrument for automated measurement of the thermal conductivity and thermal diffusivity of tropical fruit juice, using a transient thermal probe technique based on the line heat source principle. The system consists of two thermocouple sensors, a constant current source, a heater, a thermocouple amplifier, a microcontroller, a microSD card shield and an intelligent liquid crystal display. A fixed distance of 6.50 mm was maintained between the two probes. When heat was applied, the temperature rise at the heater probe was measured over time at intervals of 4 s for 240 s. The measuring element conforms as closely as possible to an infinite line source of heat in an infinite fluid. Under these conditions, thermal conductivity and thermal diffusivity are measured simultaneously: thermal conductivity is determined from the slope of a plot of the temperature rise of the heating element against the logarithm of time, while thermal diffusivity is determined from the time taken for the sample to attain its peak temperature and the time duration over a fixed diffusivity distance. A constant current source was designed to apply a power input of 16.33 W/m to the probe throughout the experiment. The thermal probe was interfaced with a digital display and data logger using an application program written in C++. The instrument was calibrated by determining the thermal properties of distilled water; error due to convection was avoided by adding 1.5% agar to the water. The instrument has been used to measure the thermal properties of banana, orange and watermelon. Thermal conductivity values of 0.593, 0.598 and 0.586 W/m·°C, and thermal diffusivity values of 1.053×10⁻⁷, 1.086×10⁻⁷ and 0.959×10⁻⁷ m²/s, were obtained for banana, orange and watermelon respectively. Measured values were stored on a microSD card. The instrument performed very well: it measured the thermal conductivity and thermal diffusivity of the tropical fruit juice samples with statistical analysis (ANOVA) showing no significant difference (p>0.05) between the literature standards and the averages estimated for each sample with the developed instrument.
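The slope-based conductivity calculation described above can be sketched as follows. The line-source relation k = q / (4π · dT/d ln t) is the standard transient hot-wire result; the only instrument-specific constant assumed here is the 16.33 W/m power input quoted in the abstract:

```python
import math

def slope_vs_logtime(times_s, temps_c):
    # Least-squares slope of the temperature rise against ln(t),
    # i.e. the dT/d(ln t) term of the line heat source model.
    xs = [math.log(t) for t in times_s]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(temps_c) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, temps_c))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def thermal_conductivity(q_per_length, slope):
    # Line heat source model: k = q / (4 * pi * slope), with q in W/m.
    return q_per_length / (4 * math.pi * slope)
```

Feeding in readings sampled every 4 s, a perfectly logarithmic rise of 2 °C per ln-second with q = 16.33 W/m would give k = 16.33 / (8π) ≈ 0.65 W/m·°C, the same order as the fruit-juice values reported above.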

Keywords: thermal conductivity, thermal diffusivity, tropical fruit juice, diffusion equation

Procedia PDF Downloads 357
4150 Building Information Modelling Based Value for Money Assessment in Public-Private Partnership

Authors: Guoqian Ren, Haijiang Li, Jisong Zhang

Abstract:

Over the past 40 years, urban development has undergone large-scale, high-speed expansion, beyond what was previously considered normal and in a manner not proportionally related to population growth or physical considerations. With more scientific and refined decision-making in the urban construction process, new urbanization approaches aligned with public-private partnerships (PPPs), which evolved in the early 1990s, have become acceptable and, in some situations, even better solutions to outstanding urban municipal construction projects, especially in developing countries. However, as the main vehicle for delivering urban public services, PPPs remain problematic with regard to the value for money (VFM) process in most large-scale construction projects. This paper therefore reviews recent PPP articles in popular project management journals, and relevant toolkits, published in the last 10 years, to identify the indicators that influence VFM within PPPs across regions. With increasing concerns about profitability and environmental and social impacts, the current PPP structure requires a more integrated platform to manage multi-performance project life cycles. Building information modelling (BIM), a popular approach to the procurement process in the AEC sectors, has the potential to ensure VFM while also working in tandem with a semantic approach to holistically measure life cycle costs (LCC) and achieve better sustainability. This paper suggests that BIM applied to the entire PPP life cycle could support holistic decision-making regarding VFM processes and thus meet service targets.

Keywords: public-private partnership, value for money, building information modelling, semantic approach

Procedia PDF Downloads 209
4149 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for the endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to populate the set of risk factors with only the values most relevant to a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base, aggregated from a digital preservation survey, to detect preservation risks for a particular institution. Another contribution is support for the visualisation and analysis of risk factors along a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision-making for the preservation of digital content in libraries and archives using domain expert knowledge and file format metadata aggregated automatically from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
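The idea of a multidimensional risk vector aggregated into an endangerment level can be sketched in a few lines. This is only an illustrative reconstruction, not the paper's actual fuzzy models: the factor names, institutional weights, and level thresholds below are all hypothetical.

```python
# Hypothetical sketch: aggregate normalised file-format risk factors
# (each in [0, 1]) into an endangerment score using institution-specific
# weights, then map the score to a coarse endangerment level.
# Factor names, weights, and thresholds are illustrative assumptions.

def endangerment_score(risk_vector, institution_weights):
    """Weighted average of normalised risk factors."""
    total_weight = sum(institution_weights.values())
    weighted = sum(risk_vector[f] * w for f, w in institution_weights.items())
    return weighted / total_weight

def endangerment_level(score):
    """Map a score to a coarse, fuzzy-style endangerment level."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

# Example risk vector for a hypothetical file format.
risks = {"software_support": 0.8, "format_complexity": 0.4, "community_adoption": 0.2}
weights = {"software_support": 3, "format_complexity": 1, "community_adoption": 2}

score = endangerment_score(risks, weights)
print(round(score, 3), endangerment_level(score))  # → 0.533 medium
```

An expert would then inspect individual dimensions of `risks` rather than only the aggregate, which is the visualisation step the abstract describes.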

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 404
4148 Quality and Quality Assurance in Education: Examining the Possible Relationship

Authors: Rodoula Stavroula Gkarnara, Nikolaos Andreadakis

Abstract:

The purpose of this paper is to examine the relationship between quality and quality assurance in education. It constitutes a critical review of the bibliography on quality and its delimitation in the field of education, as well as on quality assurance in education and the approaches identified for its extensive study. There are two prevailing and opposite views on the correlation between the two concepts: on the one hand, that there is an inherent distance between them, as they are two separate terms; on the other, that they are interrelated and interdependent concepts that contribute to the improvement of quality in education. Finally, the last part of the paper, adopting the second view, addresses the contribution of quality assurance to quality, pointing out that quality assurance leads to the improvement of quality by serving as the means of feedback on the quality achieved.

Keywords: education, quality, quality assurance, quality improvement

Procedia PDF Downloads 217
4147 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test

Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea

Abstract:

In this research, we conduct a Monte Carlo analysis of a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact on the χ² test of transforming initial data sets from any probability distribution into new signals with a uniform distribution using the Spearman rank correlation. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
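The pipeline the abstract describes can be sketched end to end: generate a chaotic signal, rank-transform it to a uniform distribution, and compute a bivariate χ² independence statistic on pairs sampled a fixed distance apart. This is a minimal sketch, not the authors' exact procedure; the logistic-map seed, the sampling distance `d`, and the number of bins are arbitrary demo choices.

```python
# Sketch: logistic-map signal -> rank transform to uniform ->
# bivariate (two-dimensional) chi-square independence statistic on
# pairs (u[i], u[i+d]). All parameter choices are illustrative.

def logistic_map(x0, n, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def rank_to_uniform(xs):
    """Map each value to its normalised rank in (0, 1)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    u = [0.0] * len(xs)
    for rank, i in enumerate(order):
        u[i] = (rank + 1) / (len(xs) + 1)
    return u

def chi2_independence(u, d, bins=4):
    """Pearson chi-square statistic for pairs sampled distance d apart."""
    obs = [[0] * bins for _ in range(bins)]
    pairs = [(u[i], u[i + d]) for i in range(len(u) - d)]
    for a, b in pairs:
        obs[min(int(a * bins), bins - 1)][min(int(b * bins), bins - 1)] += 1
    n = len(pairs)
    row = [sum(r) for r in obs]
    col = [sum(obs[i][j] for i in range(bins)) for j in range(bins)]
    chi2 = 0.0
    for i in range(bins):
        for j in range(bins):
            expected = row[i] * col[j] / n
            if expected > 0:
                chi2 += (obs[i][j] - expected) ** 2 / expected
    return chi2  # compare against a chi-square critical value, df = (bins-1)^2

u = rank_to_uniform(logistic_map(0.123, 2000))
print(round(chi2_independence(u, d=10), 2))
```

Sweeping `d` upward and watching where the statistic drops below the critical value is one simple way to estimate the minimum distance for independent sampling.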

Keywords: chaotic signals, logistic map, Pearson’s test, chi-square test, bivariate distribution, statistical independence

Procedia PDF Downloads 97
4146 Natural Convection between Two Parallel Wavy Plates

Authors: Si Abdallah Mayouf

Abstract:

In this work, the effects of the wavy surface on free convection heat transfer boundary layer flow between two parallel wavy plates have been studied numerically. The two plates are considered at a constant temperature. The equations and the boundary conditions are discretized by the finite difference scheme and solved numerically using the Gauss-Seidel algorithm. The important parameters in this problem are the amplitude of the wavy surfaces and the distance between the two wavy plates. Results are presented as velocity profiles, temperature profiles and local Nusselt number according to the important parameters.
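The finite-difference/Gauss-Seidel machinery can be illustrated on a drastically reduced version of the problem. The sketch below solves steady one-dimensional conduction between two flat (not wavy) plates held at constant temperatures, which converges to the expected linear profile; the grid size and tolerance are illustrative choices, and the authors' actual solver handles the full two-dimensional wavy-wall equations.

```python
# Minimal Gauss-Seidel sketch for the 1-D steady conduction equation
# d2T/dy2 = 0 between two flat plates at constant temperatures T0, T1.
# The exact solution is a linear temperature profile.

def gauss_seidel_conduction(T0, T1, n=21, tol=1e-10, max_iter=100000):
    T = [T0] + [0.0] * (n - 2) + [T1]   # boundary values fixed, interior at 0
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, n - 1):
            new = 0.5 * (T[i - 1] + T[i + 1])   # central-difference stencil
            max_change = max(max_change, abs(new - T[i]))
            T[i] = new                          # in-place update: Gauss-Seidel
        if max_change < tol:
            break
    return T

T = gauss_seidel_conduction(100.0, 0.0)
print(round(T[10], 3))  # midpoint of the linear profile → 50.0
```

Using freshly computed neighbour values in place (rather than the previous iterate, as Jacobi would) is what makes this Gauss-Seidel and speeds up convergence.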

Keywords: free convection, wavy surface, parallel plates, fluid dynamics

Procedia PDF Downloads 307
4145 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty

Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos

Abstract:

Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and the development threat index from the NASA Socioeconomic Data and Applications Center; this simulation is used to model uncertainty in the problem. This research leverages state-of-the-art techniques from the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers such as Gurobi or CPLEX. Numerical results based on real data on the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN, which is used in practice for biodiversity conservation. Our method may thus better guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
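To make the prioritization problem concrete, here is a deliberately simplified, deterministic stand-in: a greedy heuristic that purchases parcels under a budget, ranking them by biodiversity value at risk per unit cost. This is not the paper's robust adaptive MIP (which requires a solver such as Gurobi or CPLEX), and all parcel data are hypothetical.

```python
# Greedy budget-constrained parcel selection: a toy, deterministic
# simplification of the land-acquisition problem. Each parcel has a
# cost, a biodiversity value, and a development-threat score in [0, 1];
# we prioritise value-at-risk per dollar. All numbers are hypothetical.

def select_parcels(parcels, budget):
    """parcels: list of (name, cost, biodiversity_value, threat)."""
    ranked = sorted(parcels, key=lambda p: p[2] * p[3] / p[1], reverse=True)
    chosen, spent = [], 0.0
    for name, cost, value, threat in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

parcels = [
    ("A", 4.0, 10.0, 0.9),   # high value, high threat
    ("B", 2.0, 6.0, 0.5),
    ("C", 5.0, 9.0, 0.2),    # valuable but little threatened
    ("D", 1.0, 2.0, 0.8),
]
chosen, spent = select_parcels(parcels, budget=7.0)
print(chosen, spent)  # → ['A', 'D', 'B'] 7.0
```

The paper's contribution is precisely what this sketch omits: robustness to uncertain development trajectories and the ability to adapt later purchases to land use as it unfolds.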

Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning

Procedia PDF Downloads 210
4144 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will change drastically due to access to real-time data, real-time forecasting, and the tracking of physical items, in combination with Internet of Things developments that further automate farm operations. Dairy farms have embraced technological innovations and accumulated vast, continuous data streams over the past decade; however, a system that integrates this information to improve whole-farm management and decision-making does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real time to support practical and relevant environmental and economic actions. The systems developed, based on machine learning and artificial intelligence, need to be connected to produce useful output, a better understanding of the whole farming operation, and its environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of options and, ultimately, in strategy determination. The system of the future should be able to manage a dairy farm as well as an experienced farm manager supported by a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, while improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide an insight into the state of the art of big data applications and evolutionary computing in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.
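The remark that evolutionary computing can search for optimal combinations of options can be illustrated with a toy genetic algorithm. The binary genome (adopt or skip each management option) and the profit contributions below are entirely hypothetical; real farm objectives would be far richer.

```python
import random

# Toy genetic algorithm searching binary combinations of hypothetical
# management options (e.g. adopt a sensor, change a feeding plan).
# Fitness is a made-up additive profit; seeded for reproducibility.

random.seed(42)

PROFIT = [3.0, -1.0, 2.5, 4.0, -0.5, 1.5]   # per-option profit contribution

def fitness(genome):
    return sum(g * p for g, p in zip(genome, PROFIT))

def evolve(pop_size=20, generations=40, mut_rate=0.1):
    n = len(PROFIT)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randint(1, n - 1)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mut_rate else g
                     for g in child]                     # bit-flip mutation
            children.append(child)
        pop = parents + children                         # elitist replacement
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The global optimum here is simply the set of options with positive profit (fitness 11.0); the GA's value shows up when options interact and the search space is too large to enumerate.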

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 189
4143 Impact of Human Resources Accounting on Employees' Performance in Organization

Authors: Hamid Saremi, Shida Hanafi

Abstract:

In an age of technology and economics, human capital plays an important and axial role in the organization, and human resources accounting offers a broad view of an organization's key resource: its people. Human resources accounting is a young branch of accounting that deals with a range of policies and measures related to various aspects of human resources. It rests on the premise that an organization's most important asset is its human resources and that human resource management is the key to organizational success; accordingly, human resources data must be reviewed and evaluated with accounting knowledge, based on empirical studies and on methods for measuring and reporting human resources accounting information. Undoubtedly, human resource management cannot operate or make decisions without information, and human resources accounting is a practical way to inform decision-makers committed to harnessing human resources. It applies accounting principles within the organization and builds on basic research into the effect of human resources accounting information on employees' personal performance. In human resources accounting, the analysis, criteria, and valuation of costs treat manpower as the main resource of each institution. The protection of human resources is, from this perspective, a process undertaken for organizational profitability. In fact, this type of accounting can be regarded as a major source for measuring trends in the costs and valuation of human resources in each institution. What is the economic value of such assets? What expenditures on the education and training of professionals should be capitalized in an asset account? What amount of funds spent should be considered a lost opportunity cost? In this paper, drawing on the human resources accounting literature, we study human resources and their objectives, the importance of human resource valuation for employee performance, and methods of reporting human resources according to different models.

Keywords: human resources, human resources accounting, human capital, human resource management, valuation and cost of human resources, employee performance, organization

Procedia PDF Downloads 548
4142 Open Distance Learning and Curriculum Transformation: Linkages, Alignment, and Innovation

Authors: Devanandan Govender

Abstract:

Curriculum design and development in higher education is a complex and challenging process. Amongst others, the extent to which higher education curriculum responds to a country's imperatives, industry requirements, and societal demands are some important considerations. Added to this is the whole notion of sustainable development, climate change and in the South African context the issue of ‘Africanising the curriculum’ is also significant. In this paper, the author describes and analyses the various challenges related to curriculum transformation, design and development within an ODL context and how we at Unisa engage and address curriculum transformation in mainstream curriculum design and development both at course design level and programme/ qualification level.

Keywords: curriculum transformation, curriculum creep, curriculum drift, curriculum mapping

Procedia PDF Downloads 377
4141 The Optimization of TICSI in the Convergence Mechanism of Urban Water Management

Authors: M. Macchiaroli, L. Dolores, V. Pellecchia

Abstract:

With the recent Resolution n. 580/2019/R/idr, the Italian Regulatory Authority for Energy, Networks and Environment (ARERA) has introduced, for urban water managements characterized by persistent critical issues in the planning and organization of the service and in the implementation of the interventions necessary to improve infrastructure and management quality, a new mechanism for determining tariffs: the regulatory scheme of Convergence. The aim of this regulatory scheme is to overcome the fragmentation of the water service, in order to improve the stability of local institutional structures, technical quality, and contractual quality, as well as to guarantee elements of transparency for users of the service. The Convergence scheme presupposes the identification of the cost items to be considered in the tariff in parametric terms, distinguishing three possible cases according to the type of historical data available to the manager. The study, in particular, focuses on operators that have neither data on tariff revenues nor data on operating costs. In this case, the manager's constraint on revenues (VRG) is estimated on the basis of a reference benchmark and becomes the starting point for defining the structure of the tariff classes, in compliance with the TICSI provisions (Integrated Text for tariff classes, ARERA Resolution n. 665/2017/R/idr). The proposed model implements recent studies on optimization models for the definition of tariff classes in compliance with the constraints dictated by TICSI under the Convergence mechanism, offering itself as a support tool for managers and the local water regulatory authority in the decision-making process.

Keywords: decision-making process, economic evaluation of projects, optimizing tools, urban water management, water tariff

Procedia PDF Downloads 119
4140 Inter-Complex Dependence of Production Technique and Preforms Construction on the Failure Pattern of Multilayer Homo-Polymer Composites

Authors: Ashraf Nawaz Khan, R. Alagirusamy, Apurba Das, Puneet Mahajan

Abstract:

Thermoplastic-based fibre composites are capturing part of the market of conventional thermoset composites. However, replacing a thermoset with a thermoplastic composite has never been an easy task: the inherently high viscosity of thermoplastic resin leads to poor interface properties. In this work, a homo-polymer towpreg is produced through an electrostatic powder-spray coating methodology. The flexible towpreg produced offers a short melt-flow distance during consolidation of the laminate, which yields a homogeneous fibre/matrix distribution (and low void content) on consolidation. Composite laminates were fabricated with two manufacturing techniques, the conventional film-stack (FS) technique and the powder-coated (PC) technique. Because the laminates produced by the two techniques comprise the same constituent fibre and matrix (at constant fibre volume fraction), this helps in understanding their distinct responses under load. The changed behaviour is observed mainly due to the different fibre/matrix configurations within the laminate: interface adhesion influences load transfer between fibre and matrix and therefore the elastic, plastic, and failure behaviour of the laminates. Moreover, the effect of preform geometry (plain-weave and satin-weave structures) is also studied for the corresponding composite laminates in terms of various mechanical properties. Fracture analysis is carried out to study the effect of resin at the interlacement points through micro-CT analysis. The PC laminate reveals considerably smaller matrix-rich and matrix-deficient zones than the FS laminate. Different loads (tensile, shear, fracture toughness, and drop-weight impact tests) are applied to the laminates, and the corresponding damage behaviour is analysed at successive stages of failure. The PC composite shows superior mechanical properties in comparison to the FS composite. The damage that occurs in the laminate is captured through SEM analysis to identify the prominent modes of failure, such as matrix cracking, fibre breakage, delamination, debonding, and other phenomena.

Keywords: composite, damage, fibre, manufacturing

Procedia PDF Downloads 137
4139 A Literature Review on the Effect of Financial Knowledge toward Corporate Growth: The Important Role of Financial Risk Attitude

Authors: Risna Wijayanti, Sumiati, Hanif Iswari

Abstract:

This study aims to analyze the role of financial risk attitude as a mediator between financial knowledge and business growth. The ability of human resources to manage capital (financial literacy) can be a major milestone for a company's business to grow and build its competitive advantage. This study analyzed the important role of financial risk attitude in bringing financial knowledge to bear on corporate growth. Many discussions have argued that financial knowledge is one of the main abilities of corporate managers in determining the success of managing a company; however, other scholars have countered that financial knowledge does not have a significant influence on corporate growth. This study used a literature review to analyze whether another variable can mediate the effect of financial knowledge on corporate growth. Research mapping was conducted to analyze the concept of risk tolerance. This concept relates to the effects of people's risk aversion when making decisions under risk and to the role of financial knowledge in changes in financial income. Understanding and managing risks and investments is complicated, in particular for corporate managers, who are always expected to maintain their corporate growth. Substantial financial knowledge is needed to identify and obtain accurate information for corporate financial decision-making. By reviewing the literature, this study hypothesized that the financial knowledge of corporate managers would be meaningless without the managers' courage to bear risks in taking favorable business opportunities. Therefore, the level of risk aversion of corporate managers will determine corporate action, which is a reflection of corporate-level investment behavior, leading the company to succeed or fail in achieving its expected growth rate.

Keywords: financial knowledge, financial risk attitude, corporate growth, risk tolerance

Procedia PDF Downloads 129
4138 Jet Impingement Heat Transfer on a Rib-Roughened Flat Plate

Authors: A. H. Alenezi

Abstract:

Cooling by impingement jet is known to provide significantly high local and average heat transfer coefficients, which makes it widely used in industrial cooling systems. The heat transfer characteristics of a jet impinging on a rib-roughened flat plate have been investigated numerically. This paper set out to investigate the effect of rib height on the heat transfer rate. Since the flow needs enough spacing after passing the rib to allow reattachment, especially at high Reynolds numbers, this study focuses on finding the optimum rib height that maximizes the heat transfer rate downstream on the plate. This investigation employs a round nozzle with a hydraulic diameter (Dh) of 13.5 mm, a jet-to-target distance (H/D) of 4, a rib location of 1.5D, and jet angles of 45° and 90°, at Re = 10,000.

Keywords: jet impingement, CFD, turbulence model, heat transfer

Procedia PDF Downloads 351
4137 Vulnerability of People to Climate Change: Influence of Methods and Computation Approaches on Assessment Outcomes

Authors: Adandé Belarmain Fandohan

Abstract:

Climate change has become a major concern globally, particularly in rural communities that must find rapid coping solutions. Several vulnerability assessment approaches have been developed over the last decades. This comes with a higher risk that different methods will lead to different conclusions, making comparisons difficult and decision-making inconsistent across areas. The effect of methods and computational approaches on estimates of people's vulnerability was assessed using data collected in the Gambia. Twenty-four indicators reflecting the vulnerability components (exposure, sensitivity, and adaptive capacity) were selected for this purpose. Data were collected through household surveys and key informant interviews. One hundred and fifteen respondents were surveyed across six communities and two administrative districts. Results were compared over three computational approaches: maximum-value transformation normalization, z-score transformation normalization, and simple averaging. Regardless of the approach used, communities with high exposure to climate change and extreme events were the most vulnerable. Furthermore, vulnerability was strongly related to the socio-economic characteristics of farmers. The survey evidenced variability in vulnerability among communities and administrative districts. Comparing outputs across approaches, people in the study area were found to be highly vulnerable overall under simple averaging and the maximum-value transformation, whereas they were only moderately vulnerable under the z-score transformation approach. It is suggested that such method-induced discrepancies be accounted for in international debates, and that assessment approaches be harmonized or standardized so as to make outputs comparable across regions. This would also likely increase the relevance of decision-making for adaptation policies.
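The three computational approaches compared in the study are easy to state in code. The sketch below applies them to made-up indicator scores for one community; the indicator values are illustrative assumptions, and the real study aggregates twenty-four indicators across exposure, sensitivity, and adaptive capacity.

```python
import statistics

# Sketch of the three computational approaches: maximum-value
# normalisation, z-score normalisation, and simple averaging of raw
# scores. Indicator values are hypothetical.

def max_value_normalise(xs):
    m = max(xs)
    return [x / m for x in xs]

def z_score_normalise(xs):
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def simple_average(xs):
    return sum(xs) / len(xs)

indicators = [12.0, 7.0, 15.0, 4.0, 9.0]   # hypothetical raw indicator scores

idx_max = simple_average(max_value_normalise(indicators))
idx_z = simple_average(z_score_normalise(indicators))
idx_raw = simple_average(indicators)

# The three indices live on different scales, so fixed vulnerability
# cut-offs can classify the same community differently depending on
# the approach -- the discrepancy the paper reports.
print(round(idx_max, 3), round(idx_z, 3), round(idx_raw, 3))
```

Note in particular that z-scores of a single community's indicators average to zero by construction, which is one mechanical reason the z-score approach can yield systematically more moderate classifications.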

Keywords: maximum value transformation, simple averaging, vulnerability assessment, West Africa, z-score transformation

Procedia PDF Downloads 103
4136 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability

Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley

Abstract:

The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing different ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver quality food efficiently, reliably, and successfully at a low cost that the public wants to buy. They also want to make sure that their food offerings are never unsafe to eat or of poor quality. A good reputation (and profitable business) developed over the years can be gone in an instant if customers fall ill eating your food. Poor quality also results in food waste, and the cost of corrective actions is compounded by the reduction in revenue. Product compliance evaluation assesses whether a supplier's ingredients comply with the specifications of several attributes (physical, chemical, organoleptic) that a company tests to ensure that quality, safe-to-eat food is given to the consumer and will deliver the same eating experience in all parts of the country. The technical component of the evaluation includes the chemical and physical tests that produce numerical results relating to shelf life, food safety, and organoleptic qualities. The psychological component of the evaluation concerns the organoleptic, that is, acting on or involving the use of the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: meeting or exceeding all technical (physical and chemical), organoleptic, and psychological specifications. (2) Deviation from ideal, but no impact on quality: not meeting some technical and organoleptic/psychological specifications, without impact on consumer quality, while meeting all food safety requirements. (3) Acceptable: not meeting some technical and organoleptic/psychological specifications, resulting in a reduction of consumer quality, but not enough to lessen demand, while meeting all food safety requirements. (4) Unacceptable: not meeting food safety requirements, independent of meeting technical and organoleptic specifications; or meeting all food safety requirements, but with product quality resulting in consumer rejection of the food offering. Sampling of products and consumer tastings within the distribution network is a second critical element of the quality assurance process and provides the data sources for the statistical analyses. Each finding is not independently assessed with the rubric. For example, the chemical data will be used to back up or support any inferences on the sensory profiles of the ingredients. Certain flavor profiles may not be as apparent when mixed with other ingredients, which leads to weighing specifications differentially in the acceptability decision. Quality assurance processes are essential to achieving the balance of quality and profitability by making sure the food is safe and tastes good while identifying and remediating product quality issues before they hit the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for the systemic application of quality assurance processes to quick-service restaurant services. This case study reviews the complex decision rubric and evaluates processes to ensure the right balance of cost, quality, and safety is achieved.
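The four-level rubric can be expressed as a single decision function. The boolean inputs below are deliberate simplifications of the technical, organoleptic/psychological, and food-safety evaluations the abstract describes; in practice the inputs are weighed statistically rather than reduced to flags.

```python
# Sketch of the four-level product-compliance rubric as a decision
# function. Inputs are simplified boolean summaries of the evaluations.

def compliance_level(safety_ok, all_specs_ok, quality_reduced, consumer_rejects):
    """Return the rubric level for one product compliance evaluation."""
    if not safety_ok or consumer_rejects:
        return "Unacceptable"                              # level 4
    if all_specs_ok:
        return "Ideal"                                     # level 1
    if not quality_reduced:
        return "Deviation from ideal, no quality impact"   # level 2
    return "Acceptable"                                    # level 3

print(compliance_level(True, False, True, False))  # → Acceptable
```

Note that food safety dominates the rubric: a safety failure is Unacceptable regardless of how well the product scores on every other attribute.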

Keywords: decision making, food safety, organoleptics, product compliance, quality assurance

Procedia PDF Downloads 188
4135 Screening Maize for Compatibility with F. Oxysporum to Enhance Striga asiatica (L.) Kuntze Resistance

Authors: Admire Isaac Tichafa Shayanowako, Mark Laing, Hussein Shimelis

Abstract:

Striga asiatica is among the leading biotic constraints to maize production in small-holder farming communities in southern Africa. However, confirmed sources of resistance to the parasitic weed are still limited. Conventional breeding programmes have been progressing slowly due to the complex inheritance of Striga resistance; hence there is a need for more innovative approaches. This study aimed to achieve partial resistance as well as to breed for compatibility with Fusarium oxysporum f.sp. strigae, a soil fungus that is highly specific in its pathogenicity. Agar gel and paper roll assays, in conjunction with a glasshouse pot trial, were done to select genotypes based on their potential to stimulate germination of Striga and to test the efficacy of Fusarium oxysporum as a biocontrol agent. Results from the agar gel assays showed a moderate to high potential for the release of strigolactones among the 33 OPVs. Maximum Striga germination distances from the host root of 1.38 cm and up to 46% germination were observed in most of the populations. Considerable resistance was observed in a landrace, '8lines', which had the lowest Striga germination percentage (19%) at a maximum distance of 0.93 cm, compared to the resistant check Z-DPLO-DTC1, which had 23% germination at a distance of 1.4 cm. The number of Fusarium colony-forming units differed significantly (P < 0.05) amongst the genotypes grown between germination papers. The number of crown roots, the length of the primary root, and the fresh weights of shoot and roots were highly correlated with Fusarium macrospore counts. Pot trials showed significant differences between the Fusarium-coated and uncoated treatments in terms of plant height, leaf counts, anthesis-silking intervals, Striga counts, Striga damage rating, and Striga vigour. Striga emergence counts and Striga flowers were low in Fusarium-treated pots. Plants in Fusarium-treated pots showed non-significant differences in height from the control treatment, suggesting that Foxy 2 reduces the severity of Striga damage. Variability within Fusarium-treated genotypes with respect to the traits under evaluation indicates varying degrees of compatibility with the biocontrol agent.

Keywords: maize, Striga asiatica, resistance, compatibility, F. oxysporum

Procedia PDF Downloads 250