Search results for: cloud service models
7935 Performance Management in Serbian Banks: Balanced Scorecard Approach
Authors: Nela Milosevic, Sladjana Barjaktarovic Rakocevic, Sladjana Benkovic, Nemanja Milanovic
Abstract:
Nowadays, performance measurement systems play a key role in evaluating the strategic performance of an organization. At the same time, there has been a shift towards the Balanced Scorecard (BSC), which has been recognized as a valuable managerial approach. The main goal of this paper is to analyze the main performances of Serbian banks, measured at the branch level, through the use of the Balanced Scorecard framework. Although an extensive number of practitioners have an interest in the Balanced Scorecard approach, little empirical research has been conducted on the implementation of its concept in service sectors such as banking, especially within developing countries. From the beginning of August till the end of September 2015, the authors conducted in-depth interviews with a number of experts from the most successful banks in Serbia. The results show that non-financial measures, especially customer-oriented and product/service-oriented indicators, are very important factors for improving not only the financial situation within the bank but also overall business performance. Additionally, the findings show a cause-effect relationship between the non-financial and financial dimensions of the Balanced Scorecard. Bearing in mind that the banks are still using outdated performance evaluation systems, such as annual, quarterly and monthly reports, we hope that this paper will contribute to the knowledge of how banks in Serbia may apply the Balanced Scorecard approach to evaluate their performance in the most efficient and effective way.
Keywords: balanced scorecard approach, bank management, performance measurement systems, strategic performances
Procedia PDF Downloads 341
7934 Performance of Fiber-Reinforced Polymer as an Alternative Reinforcement
Authors: Salah E. El-Metwally, Marwan Abdo, Basem Abdel Wahed
Abstract:
Fiber-reinforced polymer (FRP) bars have been proposed as an alternative to conventional steel bars; hence, the use of these non-corrosive and nonmetallic reinforcing bars has increased in various concrete projects. The material is lightweight, has a long lifespan, and needs minor maintenance; however, its non-ductile nature and weak bond with the surrounding concrete create a significant challenge. The behavior of concrete elements reinforced with FRP bars has been the subject of several experimental investigations, even with their high cost. This study aims to numerically assess the viability of using FRP bars as longitudinal reinforcement, in comparison with traditional steel bars, and also as prestressing tendons instead of traditional prestressing steel. Nonlinear finite element analysis has been utilized to carry out the current study. Numerical models have been developed to examine the behavior of concrete beams reinforced with FRP bars or tendons against similar models reinforced with either conventional steel or prestressing steel. These numerical models were verified against experimental test results available in the literature. The obtained results revealed that concrete beams reinforced with FRP bars, as passive reinforcement, exhibited less ductility and less stiffness than similar beams reinforced with steel bars. On the other hand, when FRP tendons are employed in prestressing concrete beams, the results show that the performance of these beams is similar to that of beams prestressed by conventional active reinforcement, with a difference caused by the two tendon materials' moduli of elasticity.
Keywords: reinforced concrete, prestressed concrete, nonlinear finite element analysis, fiber-reinforced polymer, ductility
Procedia PDF Downloads 19
7933 Strengthening Functional Community-Provider Linkages: Lessons from the Challenge Initiative for Healthy Cities Program in Indore, India
Authors: Sabyasachi Behera, Shiv Kumar, Pramod Gautam, Anisur Rahman, Pawan Pathak, Rahul Bhadouria
Abstract:
Background: A growing share of the population, especially the urban poor, vulnerable groups and groups with specific needs, has health indicators worse than its rural counterparts in India and faces various issues with the availability and quality of health care. The reasons are myriad, starting with community information and awareness, especially in a scenario wherein the needs and challenges of floating and migrant urban populations remain poorly understood. Weak linkages between health care facilities and slum dwellers and vulnerable populations hinder the improvement of health services for the urban poor. Method: To address this issue, the TCIHC program is helping the health department of Indore city of Madhya Pradesh to establish a referral mechanism with a dual approach, at both the community and the facility level. The former is based on the premise of 'building social capital', i.e. norms and networks within a community facilitating collective action, which helps improve the demand and supply of health services at appropriate levels of care (Minus 2: Accredited Social Health Activist and Community Health Groups; Minus 1: Urban Health Nutrition Days; Zero: Urban Primary Health Center; Plus 1: secondary facility with BEmONC services; Plus 2: secondary facilities with CEmONC services; Plus 3: tertiary level facility) for the urban poor. The latter focuses on encouraging the provision of all services at the various levels of service delivery points, with stakeholders functioning in a coordinated manner to ensure better health service availability and coverage in underserved slum areas. Results: This initiative has enhanced the utilization of community-based, primary and secondary level services through defined referral pathways that are clearly known to community dwellers. Conclusion: An ideal referral mechanism should begin with referral at the community level, wherein community members access the services of a frontline health care provider at their doorstep, avoiding delays in both understanding and deciding on the health issues they face.
Keywords: levels of care, linkages, referral mechanism, service delivery
Procedia PDF Downloads 146
7932 Facilitated Massive Open Online Course (MOOC) Based Teacher Professional Development in Kazakhstan: Connectivism-Oriented Practices
Authors: A. Kalizhanova, T. Shelestova
Abstract:
Teacher professional development (TPD) in Kazakhstan has long followed a fairly standard format, with teachers learning new information from a lecturer and being tested using multiple-choice questions. In the online world, self-access courses have become increasingly popular. Due to their extensive multimedia content, peer-reviewed assignments, adaptable class times, and instruction from top university faculty from across the world, massive open online courses (MOOCs) have found a home in Kazakhstan's system for lifelong learning. Recent studies indicate the limited use of connectivism-based tools, such as discussion forums, by Kazakhstani pre-service and in-service English teachers, whose professional interests are limited to obtaining certificates rather than enhancing their teaching abilities and exchanging knowledge with colleagues. This paper highlights the significance of connectivism-based tools and instruments, such as MOOCs, for the continuous professional development of pre- and in-service English teachers, facilitators' roles, and their strategies for enhancing trainees' conceptual knowledge within the MOOCs' curriculum and online learning skills. A code extraction method was utilized, reviewing the most pertinent papers on connectivism theory, facilitators' function in TPD, and connectivism-based tools such as MOOCs. Three experts, former active participants in a series of projects initiated across Kazakhstan to improve the efficacy of MOOCs, evaluated the excerpts and selected the most appropriate ones to propose a matrix of teacher professional competencies that can be acquired through MOOCs. In this paper, we examine some of the strategies employed by course instructors to boost their students' English skills and knowledge of course material, both inside and outside of the MOOC platform. Participants' interactive learning contributed to their language and subject conceptual knowledge and prepared them for peer-reviewed assignments in the MOOCs; this small-group interaction is highlighted to illustrate those outcomes. Both formal and informal continuing education institutions can use the findings of this study to support teachers in gaining experience with MOOCs and creating their own online courses.
Keywords: connectivism-based tools, teacher professional development, massive open online courses, facilitators, Kazakhstani context
Procedia PDF Downloads 83
7931 Annual Water Level Simulation Using Support Vector Machine
Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury
Abstract:
In this paper, yearly input data on rainfall, temperature and inflow to Lake Urmia are used to simulate water level fluctuation by means of three models. In the context of climate change investigations, the fluctuation of lake water levels is of high interest. This study investigates data-driven models: the support vector machine (SVM), a relatively new regression procedure in water resources, is applied to the yearly level data of Lake Urmia, the biggest and a hypersaline lake in Iran. The evaluated lake levels are found to be in good correlation with the observed values, and the SVM simulation shows better accuracy and ease of implementation. The mean square error, mean absolute relative error and determination coefficient statistics are used as comparison criteria.
Keywords: simulation, water level fluctuation, Urmia Lake, support vector machine
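As a hedged illustration of the regression setup this abstract describes (a sketch with assumed toy data and hyperparameters, not the authors' code), an SVM regression over the three yearly inputs might look like this:

```python
# Minimal sketch of SVM regression for yearly lake-level data (illustrative only;
# the inputs, units and hyperparameters below are assumptions, not the study's).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical yearly inputs: rainfall (mm), temperature (C), inflow -> lake level
X = np.array([[310.0, 11.2, 4.1], [295.0, 11.8, 3.6], [330.0, 10.9, 4.8],
              [280.0, 12.1, 3.2], [305.0, 11.5, 4.0]])
y = np.array([1274.1, 1273.6, 1274.5, 1273.2, 1273.9])  # water level (m a.s.l.)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, y)
pred = model.predict(X)

# The comparison criteria named in the abstract: MSE, MARE and R^2
print("MSE:", mean_squared_error(y, pred))
print("MARE:", np.mean(np.abs((y - pred) / y)))
print("R^2:", r2_score(y, pred))
```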
Procedia PDF Downloads 369
7930 Academic Identities in Transition
Authors: Caroline Selai, Sushrut Jadhav
Abstract:
Background: University College London (UCL), the first secular university in England to admit students regardless of their religion and gender, has nearly 29,000 students, of which approximately 30% are international students. The UCL Cultural Consultation Service (CCS) for staff and students is a unique service that provides assistance to staff and students experiencing challenges in their teaching, enabling, support work or studies which they believe may have a cultural component. The service provides one-to-one and group consultations, lectures, seminars, 'grand rounds', interactive workshops and bespoke interventions. Data: This paper presents a content analysis of CCS referrals over the last 36 months. We focus on the experience of international students, many of whom experience not only a challenge to their academic identity but also a profound challenge to their personal cultural identity. We also present three vignettes to illustrate how students interpret, accept, contest and resist changes in their cultural and academic identity. Discussion: This paper highlights (i) how students from collectivist cultures attempt to assimilate within an individualistic, highly competitive western university that is bound by its own institutional norms; (ii) problems in negotiating challenges at the interface of culture and gender; (iii) the impact of culturally different hierarchies of power, discrimination and authority; and (iv) the significance of earlier traumatic and kinship conflicts. Many international students' social identities are shaped by their cultural and family scripts. A large number have been taught that their teachers are to be revered and their teachings unchallenged. This is at odds with the quintessential goal of the western university to encourage healthy scepticism and hone students' critical thinking skills. Conclusions: Pupil-teacher 'cultural transference' and shifts in the cultural academic identities of students underscore critical aspects of developmental and learning challenges for students. Staff-student cultural conflict requires a broader, systemic analysis of students, staff and the wider organisation. Our findings challenge Eurocentric psychodynamic concepts such as the nature of the parent-child relationship in Western Europe. We argue for a broader, more inclusive approach to develop both effective pedagogic skills in Euro-American academic institutions and culturally-appropriate psychodynamic theory to underpin the counselling of international students.
Keywords: academic identity, cultural transference, cultural consultation in higher education, cultural formulation, cultural identity
Procedia PDF Downloads 461
7929 A Goal-Oriented Approach for Supporting Input/Output Factor Determination in the Regulation of Brazilian Electricity Transmission
Authors: Bruno de Almeida Vilela, Heinz Ahn, Ana Lúcia Miranda Lopes, Marcelo Azevedo Costa
Abstract:
Benchmarking public utilities such as transmission system operators (TSOs) is one of the main strategies employed by regulators in order to fix monopolistic companies' revenues. Since 2007, the Brazilian regulator has been utilizing Data Envelopment Analysis (DEA) to benchmark TSOs. Despite the application of DEA to improve the transmission sector's efficiency, some problems can be pointed out, such as the high price of electricity in Brazil; the limitation of the benchmarking to operational expenses (OPEX) only; the absence of variables that represent the outcomes of the transmission service; and the presence of extremely low and high efficiencies. As an alternative to the benchmarking concept the Brazilian regulator currently uses, we propose a goal-oriented approach. Our proposal supports input/output selection by taking traditional organizational goals and measures as a basis for the selection of factors for benchmarking purposes. As its main advantage, it resolves the classical DEA problems of input/output selection and of undesirable and dual-role factors. We also provide a demonstration of our goal-oriented concept regarding service quality. As a result, most TSOs' efficiencies in Brazil might improve when quality is treated as important in their efficiency estimation.
Keywords: decision making, goal-oriented benchmarking, input/output factor determination, TSO regulation
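For readers unfamiliar with the DEA machinery behind this kind of benchmarking, the sketch below solves the standard input-oriented CCR efficiency for a handful of hypothetical TSOs (assumed toy data with a single input and output; this is not the regulator's model):

```python
# Input-oriented CCR DEA sketch via linear programming (illustrative assumptions).
import numpy as np
from scipy.optimize import linprog

# Hypothetical TSOs: one input (OPEX) and one output (network length served)
X = np.array([[100.0], [80.0], [120.0], [90.0]])    # inputs, shape (n, m)
Y = np.array([[500.0], [450.0], [520.0], [480.0]])  # outputs, shape (n, s)
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(k):
    # Decision vector: [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.zeros(n + 1); c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # Outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```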
Procedia PDF Downloads 198
7928 A Critical Discourse Analysis of Jamaican and Trinidadian News Articles about D/Deafness
Authors: Melissa Angus Baboun
Abstract:
Utilizing a Critical Discourse Analysis (CDA) methodology and a theoretical framework based on disability studies, the way Jamaican and Trinidadian newspapers discuss issues relating to the Deaf community was examined. The term deaf was entered into the search engine tool of the online websites of the Jamaica Observer and the Trinidad & Tobago Guardian. All 27 articles that contained the term deaf in their content and were written between August 1, 2017 and November 15, 2017 were chosen for the study. The data analysis was divided into three steps: (1) listing and analysing instances of metaphorical deafness (e.g. fall on deaf ears), (2) categorizing the content of the articles into models of disability discourse (the medical, socio-cultural, and supercrip models of disability narratives), and (3) analysing any additional data found. A total of 42% of the articles pulled for this study did not deal with the Deaf community in any capacity, but rather contained idiomatic expressions that use deafness as a metaphor for a non-physical, undesirable trait. The most common idiomatic expression found was fall on deaf ears. Regarding the models of disability discourse, eight articles were found to follow the socio-cultural model, two the medical model, and two the supercrip model. The additional data found in these articles include two instances of the term deaf and mute, an overwhelming use of lower case d for the term deaf, and the misuse of the term translator (to mean interpreter).
Keywords: deafness, disability, news coverage, Caribbean newspapers
Procedia PDF Downloads 235
7927 LACGC: Business Sustainability Research Model for Generations Consumption, Creation, and Implementation of Knowledge: Academic and Non-Academic
Authors: Satpreet Singh
Abstract:
This paper introduces the new LACGC model to sustain academic and non-academic businesses into future educational and organizational generations. The consumption of knowledge and the creation of new knowledge are a strength and focal interest of all academic and non-academic organizations. Implementing newly created knowledge sustains a business into the next generation, with growth and without detriment. Existing models, like the scholar-practitioner model and organizational knowledge creation models, focus specifically on either the academic or the non-academic side, not both; the LACGC model can be used for both, at the domestic or international level. Researchers and scholars play a substantial role in finding literature and practice gaps in academic and non-academic disciplines. The LACGC model places no restriction on the number of recurrences, because the consumption, creation, and implementation of new ideas, disciplines, systems, and knowledge is a never-ending process and must continue from one generation to the next.
Keywords: academics, consumption, creation, generations, non-academics, research, sustainability
Procedia PDF Downloads 198
7926 Chemical Degradation of a Polyester Nonwoven Membrane Used in Aerosol and Drainage Filter
Authors: Rachid El Aidani, Phuong Nguyen-Tri, Toan Vu-Khanh
Abstract:
Filter media made of synthetic fibre are the geotextile materials most used in aerosol and drainage filtration, particularly for building soil reinforcement in civil engineering, owing to their appropriate properties and low cost. However, the current understanding of the durability and stability of this material in real service conditions, especially under severe long-term conditions, is very limited. This study examined the effects of chemical aging of a polyester nonwoven filter medium at different temperatures (50, 70 and 80˚C) and pH values (2, 7 and 12). The effect of aging conditions on mechanical properties, morphology, permeability, thermal stability and molar weight changes is investigated. The results showed a significant reduction of mechanical properties, in terms of the tensile strength, puncture force and tearing force of the filter media, after chemical aging due to chemical degradation. The changes in molar mass and mechanical properties at different temperatures and pH values showed a complex dependence of material properties on environmental conditions. The SEM and AFM characterizations showed a significant impact of the thermal aging on the morphological properties of the fibres. Based on the obtained results, the lifetime of the material at different temperatures was determined by the use of the Arrhenius model. These results provide useful information to better understand the phenomena occurring during chemical aging of the filter media and may help to predict the service lifetime of this material under real use conditions.
Keywords: nonwoven membrane, chemical aging, mechanical properties, lifetime, filter media
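The Arrhenius extrapolation mentioned at the end works roughly as sketched below; the failure times and resulting activation energy here are assumed for illustration and are not the paper's measured values:

```python
# Illustrative Arrhenius lifetime extrapolation from accelerated aging data.
import numpy as np

R = 8.314  # gas constant, J/(mol*K)
# Hypothetical times (days) for a property to reach a failure criterion
T = np.array([80.0, 70.0, 50.0]) + 273.15  # aging temperatures (K)
t_fail = np.array([30.0, 75.0, 400.0])     # assumed observed failure times

# ln(t) = Ea/(R*T) + ln(A): a line in 1/T whose slope gives Ea/R
slope, intercept = np.polyfit(1.0 / T, np.log(t_fail), 1)
Ea = slope * R
print(f"Apparent activation energy: {Ea / 1000:.1f} kJ/mol")

# Extrapolate the service lifetime at a use temperature of 20 C
T_use = 20.0 + 273.15
t_use = np.exp(slope / T_use + intercept)
print(f"Predicted lifetime at 20 C: {t_use:.0f} days")
```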
Procedia PDF Downloads 349
7925 Soap Film Enneper Minimal Surface Model
Authors: Yee Hooi Min, Mohdnasir Abdul Hadi
Abstract:
A tensioned membrane structure in the form of an Enneper minimal surface can be considered a sustainable development for green environment and technology; it can also support the effective use of energy and of the structure. A soap film model in the form of an Enneper minimal surface has been studied. The combination of shape and internal forces for the purpose of stiffness and strength is an important feature of a membrane surface. For this purpose, form-finding using the soap film model has been carried out for Enneper minimal surface models with variables u=v=0.6 and u=v=1.0. Enneper soap film models with variables u=v=0.6 and u=v=1.0 provide an alternative choice for structural engineers considering a tensioned membrane structure in the form of an Enneper minimal surface in the building industry. It is expected to become an alternative building material to be considered by the designer.
Keywords: Enneper, minimal surface, soap film, tensioned membrane structure
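For reference, the standard Enneper parametrisation can be evaluated over the two parameter ranges named in the abstract as sketched below (sign and scale conventions vary across references, so take this as one common form rather than the authors' exact model):

```python
# Evaluate the Enneper minimal surface over the parameter ranges in the abstract.
import numpy as np

def enneper(u, v):
    """One standard Enneper parametrisation (conventions differ by sign/scale)."""
    x = u - u**3 / 3.0 + u * v**2
    y = -(v - v**3 / 3.0 + v * u**2)
    z = u**2 - v**2
    return x, y, z

for bound in (0.6, 1.0):
    u, v = np.meshgrid(np.linspace(-bound, bound, 50),
                       np.linspace(-bound, bound, 50))
    x, y, z = enneper(u, v)
    print(f"u=v={bound}: surface spans z in [{z.min():.2f}, {z.max():.2f}]")
```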
Procedia PDF Downloads 558
7924 Employees and Their Perception of Soft Skills on Their Employability
Authors: Sukrita Mukherjee, Anindita Chaudhuri
Abstract:
Soft skills are a crucial aspect for employees, and these skills are not confined to any particular field; rather, they support further career growth and job opportunities for employees who are seeking growth. Soft skills are also regarded as personality-specific skills that are observable and qualitative in nature, and that determine an employee's strengths as a leader. When employees intend to hold their jobs, they must make effective use of their personal resources, which in turn positively impacts their employability. Employees at the workplace are expected to make effective use of these personal resources, which are generally of two types. The first type is occupation-related, tied to the educational background of the employee; the second type comprises the psychological resources of the employee, such as self-knowledge, career orientation awareness, sense of purpose and emotional literacy, which are considered crucial for an employee in the workplace. The present study is a qualitative study that includes 10 individuals working in the IT sector and the service industry, respectively. For the IT sector, graduates are considered; for the service industry, individuals who have completed a professional course in order to get into the industry are considered. The themes emerging from the thematic analysis reveal that different aspects of soft skills, such as communication, decision making, constant learning, keeping oneself updated with the latest technological advancements, and emotional intelligence, are important factors that help an employee not only to sustain his or her job but also to grow in the workplace.
Keywords: employability, soft skills, employees, resources, workplace
Procedia PDF Downloads 64
7923 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to the evolving patterns of fraud in recent times. Against such a background, the present study probes into distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, like gradient boosting, random forests, and neural networks. To this end, we recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify suspicious activities. Using actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability against conventional models. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain. They also highlight the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
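A minimal sketch of the kind of baseline comparison described here, using two of the named techniques (random forest and gradient boosting) on synthetic imbalanced transactions; the GAN/GNN components and the real financial data are beyond this illustration:

```python
# Compare two of the named baselines on synthetic, heavily imbalanced data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic "transactions": roughly 1% of samples are fraudulent
X, y = make_classification(n_samples=20000, n_features=20, weights=[0.99],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for model in (RandomForestClassifier(n_estimators=200, random_state=0),
              GradientBoostingClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    score = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{type(model).__name__}: ROC-AUC = {score:.3f}")
```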
Procedia PDF Downloads 44
7922 Modeling Biomass and Biodiversity across Environmental and Management Gradients in Temperate Grasslands with Deep Learning and Sentinel-1 and -2
Authors: Javier Muro, Anja Linstadter, Florian Manner, Lisa Schwarz, Stephan Wollauer, Paul Magdon, Gohar Ghazaryan, Olena Dubovyk
Abstract:
Monitoring the trade-off between biomass production and biodiversity in grasslands is critical to evaluate the effects of management practices across environmental gradients. New generations of remote sensing sensors and machine learning approaches can model grasslands' characteristics with varying accuracies. However, studies often fail to cover a sufficiently broad range of environmental conditions, and evidence suggests that prediction models might be case-specific. In this study, biomass production and biodiversity indices (species richness and Fisher's α) are modeled in 150 grassland plots for three sites across Germany. These sites represent a North-South gradient and are characterized by distinct soil types, topographic properties, climatic conditions, and management intensities. The predictors used are derived from Sentinel-1 & 2 and a set of topoedaphic variables. The transferability of the models is tested by training and validating at different sites. The performance of feed-forward deep neural networks (DNN) is compared to a random forest algorithm. While biomass predictions across gradients and sites were acceptable (r² = 0.5), predictions of biodiversity indices were poor (r² = 0.14). DNN showed higher generalization capacity than random forest when predicting biomass across gradients and sites (relative root mean squared error of 0.5 for DNN vs. 0.85 for random forest). DNN also achieved high performance when using the Sentinel-2 surface reflectance data rather than different combinations of spectral indices, Sentinel-1 data, or topoedaphic variables, simplifying dimensionality. This study demonstrates the necessity of training biomass and biodiversity models using a broad range of environmental conditions and ensuring spatial independence to obtain realistic and transferable models in which plot-level information can be upscaled to the landscape scale.
Keywords: ecosystem services, grassland management, machine learning, remote sensing
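To make the model comparison concrete, here is a hedged sketch of a feed-forward network versus a random forest scored with a relative RMSE on synthetic regression data (the study's Sentinel predictors and plot data are not reproduced here):

```python
# Feed-forward network vs. random forest, scored with relative RMSE (toy data).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=600, n_features=12, noise=20.0, random_state=0)
y = y + 500.0  # shift to a positive "biomass-like" range so rRMSE is meaningful
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def rel_rmse(y_true, y_pred):
    # Relative RMSE: RMSE normalised by the mean of the observations
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(y_true)

models = (MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
          RandomForestRegressor(n_estimators=300, random_state=0))
for model in models:
    model.fit(X_tr, y_tr)
    print(f"{type(model).__name__}: rRMSE = {rel_rmse(y_te, model.predict(X_te)):.3f}")
```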
Procedia PDF Downloads 219
7921 Of an 80 Gbps Passive Optical Network Using Time and Wavelength Division Multiplexing
Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Faizan Khan, Xiaodong Yang
Abstract:
Internet Service Providers face endless demands for higher bandwidth and data throughput as new services and applications require higher bandwidth. Users want immediate and accurate data delivery. This article focuses on converting old conventional networks into passive optical networks based on time division and wavelength division multiplexing. The main focus of this research is to use a hybrid of time-division multiplexing and wavelength-division multiplexing to improve network efficiency and performance. In this paper, we design an 80 Gbps Passive Optical Network (PON) that meets the needs of the Next Generation PON Stage 2 (NGPON2) proposal. The hybrid of time and wavelength division multiplexing (TWDM) is said to be the best solution for the implementation of NGPON2, according to the Full-Service Access Network (FSAN) group. To co-exist with or replace the current PON technologies, many wavelengths of the TWDM can be implemented simultaneously. Eight pairs of wavelengths are multiplexed and transmitted over 40 km of optical fiber and, on the receiving side, distributed among 256 users, which shows that the solution is reliable for implementation with an acceptable data rate. From the results, it can be concluded that the overall performance, quality factor, and bandwidth of the network are increased, and the bit error rate is minimized by the integration of this approach.
Keywords: bit error rate, fiber to the home, passive optical network, time and wavelength division multiplexing
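A quick back-of-envelope capacity check for the design described above (assuming 10 Gbps per wavelength pair, a common TWDM-PON line rate but an assumption here, not a figure from the article):

```python
# Capacity arithmetic for the 80 Gbps TWDM-PON described in the abstract.
wavelength_pairs = 8
rate_per_pair_gbps = 10.0  # assumed line rate per wavelength pair
users = 256
reach_km = 40

aggregate_gbps = wavelength_pairs * rate_per_pair_gbps  # 8 x 10 = 80 Gbps
per_user_mbps = aggregate_gbps * 1000 / users           # shared among 256 ONUs
print(f"Aggregate: {aggregate_gbps:.0f} Gbps over {reach_km} km")
print(f"Average per user across {users} ONUs: {per_user_mbps:.0f} Mbps")
```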
Procedia PDF Downloads 72
7920 Advancing Healthcare Access and Efficiency: Objectives of Telecare and Telehealth
Authors: Munachiso A. Muoneke
Abstract:
Telecare and telehealth are transformative innovations that expand healthcare access, improve service efficiency and enhance patient outcomes. This study explores the core objectives of telecare and telehealth, focusing on their role in overcoming geographical and logistical barriers to healthcare delivery. By leveraging digital platforms, remote monitoring tools and virtual consultations, these technologies address the pressing challenges of modern healthcare systems, including provider shortages and escalating costs. The research begins with an overview of telecare and telehealth objectives, highlighting their potential to bridge care gaps in underserved regions. It examines key methodologies, such as the integration of wearable devices, video conferencing and secure patient data platforms, to enhance real-time patient monitoring and personalized care. The study employs case studies and statistical analyses to compare patient outcomes in traditional care versus telehealth-enabled models. Preliminary findings demonstrate that telecare and telehealth significantly reduce hospital readmissions, improve chronic disease management and foster proactive health engagement. These outcomes emphasize the importance of aligning telehealth initiatives with healthcare policies and infrastructural investments. To conclude, the study lays emphasis on telecare and telehealth as indispensable tools for achieving universal health coverage. By fostering collaboration among healthcare providers, policymakers and technology developers, these innovations hold the potential to create a more accessible, efficient and patient-centered healthcare ecosystem.
Keywords: healthcare access and efficiency, remote monitoring, virtual consultations, telecare and telehealth innovations
Procedia PDF Downloads 7
7919 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks; for instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; this requires identifying the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak level of surge during a storm, as well as the precise time at which this peak takes place, is crucial; thus, we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual time and peak. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
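The weighting schemes the abstract mentions can be sketched as follows: member forecasts are combined with correlation-based and inverse-standard-deviation weights and compared against the simple average benchmark (synthetic data standing in for the NYHOPS forecasts):

```python
# Correlation- and sd-weighted ensembles vs. a simple average (toy surge data).
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 6, 200)) * 1.5  # stand-in "observed" surge (m)
members = np.stack([truth + rng.normal(0, s, truth.size)
                    for s in (0.1, 0.25, 0.5)])  # three member forecasts

def rmse(forecast):
    return np.sqrt(np.mean((forecast - truth) ** 2))

# Weights from each member's correlation with observations
corr = np.array([np.corrcoef(m, truth)[0, 1] for m in members])
w_corr = corr / corr.sum()

# Weights from inverse error standard deviation
inv_sd = 1.0 / np.array([np.std(m - truth) for m in members])
w_sd = inv_sd / inv_sd.sum()

print("simple average RMSE:", rmse(members.mean(axis=0)))
print("corr-weighted  RMSE:", rmse(w_corr @ members))
print("sd-weighted    RMSE:", rmse(w_sd @ members))
```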
Procedia PDF Downloads 309
7918 Anti-Inflammatory, Analgesic and Antipyretic Activity of Terminalia arjuna Roxb. Extract in Animal Models
Authors: Linda Chularojmontri, Seewaboon Sireeratawong, Suvara Wattanapitayakul
Abstract:
Terminalia arjuna Roxb. (family Combretaceae) is commonly known as 'Sa maw thet' in Thai. The fruit is used in traditional medicine as a natural mild laxative, carminative and expectorant. Aim of the study: This research aims to study the anti-inflammatory, analgesic and antipyretic activities of Terminalia arjuna extract using animal models, in comparison to reference drugs. Materials and Methods: The anti-inflammatory study was conducted in two experimental animal models, namely ethyl phenylpropionate (EPP)-induced ear edema and carrageenan-induced paw edema. The study of analgesic activity used two methods of pain induction: acetic acid-induced and heat-induced pain. In addition, the antipyretic activity study was performed by inducing hyperthermia with yeast. Results: The results showed that oral administration of Terminalia arjuna extract had an acute anti-inflammatory effect in carrageenan-induced paw edema. Terminalia arjuna extract showed analgesic activity in the acetic acid-induced writhing response and in heat-induced pain. This indicates a peripheral effect, through inhibition of the biosynthesis and/or release of some pain mediators, and some mechanism acting through the central nervous system. Moreover, Terminalia arjuna extract at doses of 1000 and 1500 mg/kg body weight showed antipyretic activity, which might be due to the inhibition of prostaglandins. Conclusion: The findings of this study indicate that Terminalia arjuna extract possesses anti-inflammatory, analgesic and antipyretic activities in animals.
Keywords: analgesic activity, anti-inflammatory activity, antipyretic activity, Terminalia arjuna extract
Procedia PDF Downloads 266
7917 Utilizing Federated Learning for Accurate Prediction of COVID-19 from CT Scan Images
Authors: Jinil Patel, Sarthak Patel, Sarthak Thakkar, Deepti Saraswat
Abstract:
Recently, the COVID-19 outbreak spread across the world, leading the World Health Organization to classify it as a global pandemic. To save patients' lives, COVID-19 symptoms have to be identified, but using an AI (Artificial Intelligence) model to identify COVID-19 symptoms within the allotted time is challenging. The RT-PCR test was found to be inadequate in determining the COVID status of a patient; to determine whether a patient has COVID-19, a computed tomography (CT) scan of the patient is a better alternative. Compiling and storing all the data from various hospitals on a central server, however, is challenging; federated learning therefore aids in resolving this problem. Certain deep learning models help to classify COVID-19. This paper details work with deep learning models such as VGG19, ResNet50, MobileNetv2, and Deep Learning Aggregation (DLA), along with maintaining privacy through encryption.
Keywords: federated learning, COVID-19, CT-scan, homomorphic encryption, ResNet50, VGG-19, MobileNetv2, DLA
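The federated step itself is essentially weight averaging across hospitals. Below is a minimal federated-averaging (FedAvg) sketch of that aggregation, with toy numpy arrays standing in for CNN weights; the encryption layer and the named backbones (VGG19, ResNet50, MobileNetv2) are omitted, and nothing here is the paper's code:

```python
# FedAvg aggregation: average client models weighted by local dataset size.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Average per-layer parameters, weighting each hospital by its data size."""
    total = sum(client_sizes)
    layers = len(client_weights[0])
    return [sum(w[l] * (n / total) for w, n in zip(client_weights, client_sizes))
            for l in range(layers)]

# Three hypothetical hospitals with different amounts of CT data
rng = np.random.default_rng(42)
clients = [[rng.normal(size=(4, 4)), rng.normal(size=4)] for _ in range(3)]
sizes = [120, 450, 80]

global_model = fed_avg(clients, sizes)
print("Aggregated layer shapes:", [w.shape for w in global_model])
```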
Procedia PDF Downloads 74
7916 Generation of High-Quality Synthetic CT Images from Cone Beam CT Images Using A.I. Based Generative Networks
Authors: Heeba A. Gurku
Abstract:
Introduction: Cone Beam CT (CBCT) images play an integral part in the proper positioning of cancer patients undergoing radiation therapy treatment, but these images are low in quality. The purpose of this study is to generate high-quality synthetic CT images from CBCT using generative models. Material and Methods: This study utilized two datasets from The Cancer Imaging Archive (TCIA): 1) a lung cancer dataset of 20 patients (with full-view CBCT images) and 2) a pancreatic cancer dataset of 40 patients (only the 27 patients with limited-view images were included in the study). Cycle Generative Adversarial Networks (GAN) and its variant, Attention Guided Generative Adversarial Networks (AGGAN), were used to generate the synthetic CTs. Models were evaluated visually and on four metrics, the Structural Similarity Index Measure (SSIM), Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE), to compare the synthetic CT and original CT images. Results: For the pancreatic dataset with limited-view CBCT images, our study showed that with the Cycle GAN model, MAE, RMSE and PSNR improved from 12.57 to 8.49, 20.94 to 15.29 and 21.85 to 24.63, respectively, but structural similarity only marginally increased, from 0.78 to 0.79. Similar results were achieved with AGGAN, with no improvement over Cycle GAN. However, for the lung dataset with full-view CBCT images, Cycle GAN reduced MAE significantly, from 89.44 to 15.11, and AGGAN reduced it to 19.77. Similarly, RMSE decreased from 92.68 to 23.50 in Cycle GAN and to 29.02 in AGGAN. SSIM and PSNR also improved significantly, from 0.17 to 0.59 and from 8.81 to 21.06 in Cycle GAN, respectively, while in AGGAN SSIM increased to 0.52 and PSNR to 19.31. In both datasets, the GAN models were able to reduce artifacts and noise and to deliver better resolution and contrast enhancement. Conclusion and Recommendation: Both Cycle GAN and AGGAN significantly improved MAE, RMSE and PSNR in both datasets. However, the full-view lung dataset showed more improvement in SSIM and image quality than the limited-view pancreatic dataset.
Keywords: CT images, CBCT images, cycle GAN, AGGAN
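The four evaluation metrics reported above can be computed as in the sketch below, here on toy image arrays rather than CT data (a hedged illustration; real CT evaluation needs care with Hounsfield-unit data ranges):

```python
# MAE, RMSE, PSNR and SSIM between a reference image and a "synthetic" one.
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

rng = np.random.default_rng(0)
ct = rng.uniform(0, 1, (128, 128))                  # stand-in "planning CT"
synthetic_ct = ct + rng.normal(0, 0.05, ct.shape)   # stand-in "synthetic CT"

mae = np.mean(np.abs(ct - synthetic_ct))
rmse = np.sqrt(np.mean((ct - synthetic_ct) ** 2))
psnr = peak_signal_noise_ratio(ct, synthetic_ct, data_range=1.0)
ssim = structural_similarity(ct, synthetic_ct, data_range=1.0)
print(f"MAE={mae:.4f} RMSE={rmse:.4f} PSNR={psnr:.2f} dB SSIM={ssim:.3f}")
```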
Procedia PDF Downloads 84
7915 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnic, religious, linguistic, caste, communal, and tribal ones played a part in the development of constitutions and the legal systems of victim and criminal justice. Acute issues with extremism, poverty, environmental degradation, cybercrime, human rights violations, and crimes against, and victimization of, both individuals and groups have recently plagued South Asian nations. Every day a massive number of crimes take place, and these frequent crimes have made the lives of common citizens restless. Crime is one of the major threats to society and to civilization, and old-style crime-solving practices are unable to live up to the requirements of current crime situations. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. South Asia lacks a regional coordination mechanism, unlike the Central Asia and Asia-Pacific regions, to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism; the Data Detective program from the software company Sentient, for example, uses data mining techniques to support the police (Sentient, 2017). The goals of this study are to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset; moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models; a comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records of 2005-2019 were collected from the Nepal Police headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could benefit the police in predicting crime; hence, time series analysis using GRUs could be a prospective additional feature in Data Detective.
Keywords: time series analysis, forecasting, ARIMA, machine learning
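As a hedged sketch of the GRU forecasting setup (Keras on a synthetic daily series; the window length, layer size and data are assumptions, not the study's configuration):

```python
# GRU forecasting a synthetic daily crime-count series over a 7-day horizon.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(1)
series = 50 + 10 * np.sin(np.arange(1000) * 2 * np.pi / 365) \
         + rng.normal(0, 3, 1000)

window, horizon = 30, 7  # 30 days of history -> forecast the next 7 days
n = len(series) - window - horizon
X = np.stack([series[i:i + window] for i in range(n)])[..., None]
y = np.stack([series[i + window:i + window + horizon] for i in range(n)])

model = keras.Sequential([
    keras.layers.Input(shape=(window, 1)),
    keras.layers.GRU(32),
    keras.layers.Dense(horizon),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("7-day forecast:", model.predict(X[-1:], verbose=0).round(1))
```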
Procedia PDF Downloads 166
7914 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining
Authors: Mohsen Farhadloo, Majid Farhadloo
Abstract:
Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: that they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model, called focus-LDA, to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring an unstructured collection of textual data.
Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis
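focus-LDA itself is a bespoke model with no standard implementation, so as a hedged point of reference the sketch below fits the plain LDA it extends, using gensim on a few toy reviews:

```python
# Plain LDA baseline (the model focus-LDA extends), fitted with gensim.
from gensim import corpora, models

reviews = [
    ["room", "clean", "staff", "friendly"],
    ["food", "cold", "service", "slow"],
    ["staff", "helpful", "service", "excellent"],
    ["room", "small", "food", "tasty"],
]
dictionary = corpora.Dictionary(reviews)
corpus = [dictionary.doc2bow(doc) for doc in reviews]

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=50, random_state=0)
for topic_id, words in lda.print_topics():
    print(topic_id, words)  # each topic is a candidate "aspect"
```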
Procedia PDF Downloads 96
7913 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates
Authors: Bongs Lainjo
Abstract:
Educational institutions and authorities that are mandated to run education systems in various countries need to implement a curriculum that considers the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate various predictive models carried out globally on selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia and Europe. Some of the reasons put forward for children dropping out include the notion of being successful in life without necessarily going through the education process. Such a mentality is coupled with a tough curriculum that does not take care of all students. The result is poor school attendance - truancy - which continuously leads to dropouts. In this study, the focus is on developing a model that can systematically be implemented by school administrations to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can be easily changed so that they focus on the better future that their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models installed in every educational system with a view to helping prevent an imminent school dropout just before it happens. In the competency-based curriculum that most advanced nations are trying to implement, education systems have wholesome ideas of learning that reduce the rate of dropout.
Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum
Procedia PDF Downloads 176
7912 The Impact of an Improved Strategic Partnership Programme on Organisational Performance and Growth of Firms in the Internet Protocol Television and Hybrid Fibre-Coaxial Broadband Industry
Authors: Collen T. Masilo, Brane Semolic, Pieter Steyn
Abstract:
The Internet Protocol Television (IPTV) and Hybrid Fibre-Coaxial (HFC) broadband industrial sector landscape is rapidly changing, and organisations within the industry need to stay competitive by exploring new business models so that they can offer new services and products to customers. The business challenge in this industrial sector is meeting or exceeding high customer expectations across multiple content delivery modes. The increasing challenges in the IPTV and HFC broadband industrial sector encourage service providers to form strategic partnerships with key suppliers, marketing partners, advertisers, and technology partners. The need to form enterprise collaborative networks poses a challenge for any organisation in this sector: selecting the right strategic partners who will ensure that the organisation's services and products are marketed in new markets, that customers are efficiently supported by meeting and exceeding their expectations, and that the partners represent the organisation in a positive manner and contribute to improving its performance. Companies in the IPTV and HFC broadband industrial sector tend to form informal partnerships with suppliers, vendors, system integrators and technology partners. Generally, partnerships are formed without a thorough analysis of the real reason a company is forming the collaboration, without proper evaluation of prospective partners using specific selection criteria, and with ineffective performance monitoring of partners to ensure that a firm gains real long-term benefits from its partners and gains competitive advantage. Similar tendencies are illustrated in the research case study, which is based on Skyline Communications, a global leader in end-to-end, multi-vendor network management and operational support systems (OSS) solutions. The organisation's flagship product is the DataMiner network management platform, used by many operators across multiple industries; it can be described as a smart system that intelligently manages complex technology ecosystems for its customers in the IPTV and HFC broadband industry. The approach of the research is to develop the most efficient business model that can be deployed to improve a strategic partnership programme in order to significantly improve the performance and growth of organisations participating in a collaborative network in the IPTV and HFC broadband industrial sector. This involves proposing and implementing a new strategic partnership model and its main features within the industry, which should bring about significant benefits for all involved companies and help them achieve added value and an optimal growth strategy. The proposed business model has been developed based on research into existing relationships, value chains and business requirements in this industrial sector, and validated at 'Skyline Communications'. The outputs of the business model have been demonstrated and evaluated in the research business case study of the IPTV and HFC broadband service provider 'Skyline Communications'.
Keywords: growth, partnership, selection criteria, value chain
Procedia PDF Downloads 134
7911 Patient Care Needs Assessment: An Evidence-Based Process to Inform Quality Care and Decision Making
Authors: Wynne De Jong, Robert Miller, Ross Riggs
Abstract:
Beyond the number of nurses providing care for patients, having nurses with the right skills, experience and education is essential to ensure the best possible outcomes for patients. Research studies continue to link nurse staffing and skill mix with nurse-sensitive patient outcomes; numerous studies clearly show that superior patient outcomes are associated with higher levels of regulated staff. Due to the limited number of tools and processes available to assist nurse leaders with staffing models of care, nurse leaders are constantly faced with the ongoing challenge of ensuring that their staffing models of care best suit their patient population. In 2009, several hospitals in Ontario, Canada participated in a research study to develop and evaluate an RN/RPN utilization toolkit. The purpose of that study was to develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse staff-mix decision-making based on the College of Nurses of Ontario, Canada practice standards for the utilization of RNs and RPNs. This paper will highlight how an organization has further developed the Patient Care Needs Assessment (PCNA) questionnaire, a major component of the toolkit. Moreover, it will demonstrate how the organization has utilized the information from the PCNA to clearly identify patient and family care needs, thus providing evidence-based results to assist leaders in matching the best staffing skill mix to their patients.
Keywords: nurse staffing models of care, skill mix, nursing health human resources, patient safety
Procedia PDF Downloads 317
7910 Improvements of the Difficulty in Hospital Acceptance at the Scene by the Introduction of Smartphone Application for Emergency-Medical-Service System: A Population-Based Before-And-After Observation Study in Osaka City, Japan
Authors: Yusuke Katayama, Tetsuhisa Kitamura, Kosuke Kiyohara, Sumito Hayashida, Taku Iwami, Takashi Kawamura, Takeshi Shimazu
Abstract:
Background: Recently, the number of ambulance dispatches has been increasing in Japan, and it is therefore difficult to accept emergency patients into hospitals smoothly and appropriately because of limited hospital capacity. To facilitate requests for patient transport by ambulances and hospital acceptance, emergency information systems using information technology have been built and introduced in various communities; however, their effectiveness has been insufficiently evaluated in Japan. In 2013, we developed a smartphone application system that enables emergency-medical-service (EMS) personnel to share information about the on-scene ambulance and hospital situation. The aim of this study was to assess the effect of introducing this application into the EMS system of Osaka City, Japan. Methods: This was a retrospective study using population-based ambulance records of the Osaka Municipal Fire Department, covering the six years from January 1, 2010 to December 31, 2015. We enrolled emergency patients for whom on-scene EMS personnel conducted hospital selection. The main endpoint was difficulty in hospital acceptance at the scene, defined as EMS personnel at the scene making five or more phone calls to individual hospitals before a decision to transport was reached. The smartphone application group was defined as emergency patients transported in 2013-2015, after the introduction of the application, and we assessed the effect of the introduction with a multivariable logistic regression model. Results: A total of 600,526 emergency patients for whom EMS personnel selected hospitals were eligible for our analysis: 300,131 patients (50.0%) in 2010-2012 before the introduction (non-smartphone application group) and 300,395 patients (50.0%) in 2013-2015 (smartphone application group). The proportion of difficulty in hospital acceptance was 14.2% (42,585/300,131) in the non-smartphone application group and 10.9% (32,819/300,395) in the smartphone application group, and difficulty in hospital acceptance significantly decreased with the introduction of the smartphone application (adjusted odds ratio 0.730, 95% confidence interval 0.718-0.741, P<0.001). Conclusions: Sharing information between ambulance and hospital by introducing a smartphone application at the scene was associated with a decrease in the difficulty of hospital acceptance. Our findings may be considerably useful for developing emergency medical information systems using IT in other areas of the world.
Keywords: difficulty in hospital acceptance, emergency medical service, information technology, smartphone application
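For readers outside epidemiology, an adjusted odds ratio like 0.730 comes from exponentiating a coefficient of a multivariable logistic model. A hedged sketch with synthetic data (not the Osaka records; the covariate and effect sizes are assumptions):

```python
# Adjusted odds ratio from a multivariable logistic regression (toy data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
app_period = rng.integers(0, 2, n)  # 1 = transported after app introduction
age = rng.normal(60, 15, n)         # an assumed covariate to adjust for
logit = -2.0 - 0.3 * app_period + 0.01 * (age - 60)
difficulty = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([app_period, age]))
fit = sm.Logit(difficulty, X).fit(disp=0)
# exp(beta) for the app-period term is the adjusted odds ratio
print("Adjusted OR for the app period:", np.exp(fit.params[1]).round(3))
```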
Procedia PDF Downloads 277
7909 Revisionism in Literature: Deconstructing Patriarchal Ideals in Margaret Atwood's The Penelopiad
Authors: Essam Abdelhamid Hegazy
Abstract:
This paper aims to read Margaret Atwood's The Penelopiad (2005) via a revisionist and deconstructive approach. The novel is a postmodernist exploration of the grand-narrative myth The Odyssey (800 BC) by Homer, who portrayed the heroic warrior and the faithful wife as the epitome of perfect male and female models - examples whom all must follow and mimic. In Atwood's narrative, the same two hero models are two great tricksters who are willing to perform any sort of obnoxious act to achieve their goals. This research examines how Atwood synthesizes the change in the characters' narratives, leading to the humanization of the perfect hero and the ideal wife. The researcher has used a multidisciplinary approach in which feminist, revisionist and deconstructive theories were employed to identify new interpretations of the myths that center the experiences and perspectives of women. The research finds that the revisionist approach gives the victimized and the voiceless an opportunity to speak out and retaliate against their persecution.
Keywords: margaret atwood, patriarchal, penelopiad, revisionism
Procedia PDF Downloads 84
7908 The Principle of Methodological Rationality and Security of Organisations
Authors: Jan Franciszek Jacko
Abstract:
This investigation presents the principle of methodological rationality of decision making and discusses the impact of the methodologically rational or irrational decisions of an organisation's members on its security. The study formulates and partially justifies some research hypotheses regarding this impact. A thought experiment is used, following Max Weber's ideal types method: two idealised situations ("models") are compared - Model A, where all decision-makers follow methodologically rational decision-making procedures, and Model B, in which these agents follow methodologically irrational decision-making practices. Analysing and comparing the two models allows the formulation of some research hypotheses regarding the impact of methodologically rational and irrational attitudes of members of an organisation on its security. In addition to this method, phenomenological analyses of rationality and irrationality are applied.
Keywords: methodological rationality, rational decisions, security of organisations, philosophy of economics
Procedia PDF Downloads 143
7907 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation
Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner
Abstract:
This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is at odds with a valuation approach that relies on predictable cash flows, fixed capital structures and a steady state. Digitisation does not make company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - In the future, company valuation will be neither art nor science, but craft. This requires not intuition, but experience and good tools. Digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline - At present, company valuations are always carried out on a case-by-case basis and on a specific key date. This will change with digitisation and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently, but can also be offered more frequently: instead of calculating the value for a past key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - Past data will still be needed in the future, but its use will not be limited to monovalent time series or key-figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points from the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value - Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
Keywords: business valuation, corporate finance, digitisation, disruption
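Point (4) can be made concrete with a small, purely illustrative discounted-cash-flow contrast between a perpetuity residual value and a finite business-model life (all figures are assumed, not the authors'):

```python
# DCF with a Gordon-growth terminal value vs. a finite business-model life.
cash_flow = 100.0  # assumed steady annual free cash flow
r = 0.08           # assumed discount rate
g = 0.01           # perpetual growth assumed by the residual-value approach

# Traditional: explicit 5-year phase plus a perpetuity terminal value
pv_explicit = sum(cash_flow / (1 + r) ** t for t in range(1, 6))
terminal = cash_flow * (1 + g) / (r - g) / (1 + r) ** 5
print(f"Perpetuity model: {pv_explicit + terminal:,.0f}")

# Convergence view: the business model survives 10 years, then no residual
pv_finite = sum(cash_flow / (1 + r) ** t for t in range(1, 11))
print(f"10-year finite-life model: {pv_finite:,.0f}")
```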
Procedia PDF Downloads 135
7906 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders
Authors: Emilie Zimet
Abstract:
During networking and NCCD moderation meetings, best practices for identifying students who require Literacy Intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those who have the greatest need for Literacy Support, the allocation of resources, the tracking of intervention effectiveness, and communication with teachers, external providers and parents. Through a workshop, the group will investigate best practices for identifying students who require literacy support and strategies for communicating and tracking their progress. In groups, participants will examine what they do in their own settings and then compare this with other models, including the researcher's model, to decide the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to consider their current approaches in depth. They will be asked to critically analyse their own identification processes for Literacy Intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered, so as not to place strain on already stretched resources, along with the most effective allocation of those resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication used to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practice in the identification and communication of Literacy Interventions. The proposed outcome of this research is to develop a model for identifying students requiring Literacy Intervention that incorporates the allocation of resources and communication with key stakeholders. This will be done by pooling information and discussing a variety of models used in the participants' school settings.
Keywords: identification, student selection, communication, special education, school policy, planning for intervention
Procedia PDF Downloads 48