Search results for: panel data analysis
39210 Sampling Error and Its Implication for Capture Fisheries Management in Ghana
Authors: Temiloluwa J. Akinyemi, Denis W. Aheto, Wisdom Akpalu
Abstract:
Capture fisheries in developing countries provide significant animal protein and directly support the livelihoods of several communities. However, the misperception of biophysical dynamics owing to a lack of adequate scientific data has contributed to suboptimal management of marine capture fisheries. This is because yield and catch potentials are sensitive to the quality of catch and effort data. Yet, studies on fisheries data collection practices in developing countries are hard to find. This study investigates the data collection methods utilized by fisheries technical officers within the four fishing regions of Ghana. We found that the officers employed data collection and sampling procedures which were not consistent with the technical guidelines curated by FAO. For example, 50 instead of 166 landing sites were sampled, while 290 instead of 372 canoes were sampled. We argue that such sampling errors could result in the over-capitalization of capture fish stocks and significant losses in resource rents.
Keywords: fisheries data quality, fisheries management, Ghana, sustainable fisheries
Procedia PDF Downloads 94
39209 Changing New York Financial Clusters in the 2000s: Modeling the Impact and Policy Implication of the Global Financial Crisis
Authors: Silvia Lorenzo, Hongmian Gong
Abstract:
With the influx of research assessing the economic impact of the global financial crisis of 2007-8, a spatial analysis based on empirical data is needed to better understand the spatial significance of the financial crisis in New York, a key international financial center also considered the origin of the crisis. Using spatial statistics, financial clusters specializing in credit and securities throughout the New York metropolitan area are identified for 2000 and 2010, the periods before and after the height of the global financial crisis. Geographically weighted regressions are then used to examine the processes underlying the formation and movement of financial geographies across the states, counties and ZIP codes of the New York metropolitan area throughout the 2000s, with specific attention to tax regimes, employment, household income, technology, and transportation hubs. This analysis provides useful inputs for financial risk management and public policy initiatives aimed at addressing regional economic sustainability across state boundaries, while also developing the groundwork for further research on a spatial analysis of the global financial crisis.
Keywords: financial clusters, New York, global financial crisis, geographically weighted regression
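The core of a geographically weighted regression can be sketched in a few lines: each target location gets its own weighted least-squares fit, with observations down-weighted by distance through a kernel. The coordinates, predictor, and spatially varying coefficient below are synthetic stand-ins for illustration, not the study's New York metropolitan data:

```python
import numpy as np

# Minimal GWR at a single target location: observations closer to the
# target receive larger weights via a Gaussian distance kernel.
rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, (n, 2))          # hypothetical ZIP-code centroids
x = rng.normal(size=n)                        # predictor, e.g. a tax-rate index
beta_true = 0.5 + 0.2 * coords[:, 0]          # effect of x grows west to east
y = beta_true * x + rng.normal(scale=0.1, size=n)

def gwr_slope_at(target, coords, x, y, bandwidth=2.0):
    d2 = np.sum((coords - target) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))    # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x])
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)  # local weighted least squares
    return beta[1]                               # local slope on x

west = gwr_slope_at(np.array([1.0, 5.0]), coords, x, y)
east = gwr_slope_at(np.array([9.0, 5.0]), coords, x, y)
```

Repeating the fit over a grid of targets recovers how the coefficient surface varies across space, which is the quantity GWR maps.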
Procedia PDF Downloads 309
39208 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Predication of Future Data
Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill
Abstract:
Health-care management systems are of great interest because they offer simple and fast management of all aspects relating to a patient, not only medical ones. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from the fields of statistics, machine learning and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted by forecasting the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences regarding a new product; these preference models provide a platform on which product developers can choose the engineering characteristics that will satisfy consumer preferences before developing the product. Recent research shows that such fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model.
Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function
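The proposed comparison of an exponential regression model against a linear one can be sketched by fitting both to the same data and comparing goodness of fit; the exponential model is fitted by ordinary least squares in log space. The data here are synthetic, not from the health-care system described:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = 2.0 * np.exp(0.6 * x) * rng.normal(1.0, 0.02, x.size)  # noisy exponential

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Linear model: y = m*x + c
m, c = np.polyfit(x, y, 1)
r2_lin = r_squared(y, m * x + c)

# Exponential model y = a*exp(b*x), fitted in log space: ln(y) = ln(a) + b*x
b, ln_a = np.polyfit(x, np.log(y), 1)
r2_exp = r_squared(y, np.exp(ln_a) * np.exp(b * x))
```

When the data-generating process is genuinely exponential, the log-space fit recovers the growth rate and beats the linear model on R², which is exactly the kind of strength test the abstract proposes.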
Procedia PDF Downloads 279
39207 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform
Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail
Abstract:
The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important source of these data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, it is described by parameters of position, speed and acceleration, and constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. Because the human factor is predominant in deciding the trajectory parameters of the vehicle on the road, monitoring it through the events reported by the DiaMOTO device over time generates a guide to target potentially high-risk driving behavior and reward those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure between the DiaMOTO device and the traffic data collection server, the infrastructure through which the database to be used for complex AI/DLM analysis is built. The central element of this description is the data string in Codec-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental-model stage by installing DiaMOTO devices with unique codes, integrating ADAS and GPS functions, on 50 vehicles, through which vehicle trajectories can be monitored 24 hours a day.
Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring
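As an illustration of the kind of binary record such a device emits, here is a minimal parser for a single Codec-8-style AVL record. The field layout is assumed from publicly documented Codec 8 framing (big-endian fields: 8-byte millisecond timestamp, 1-byte priority, longitude and latitude as int32 scaled by 1e7, then altitude, angle, satellite count, and speed); it is a sketch, not the SiaMOTO implementation:

```python
import struct

# Assumed layout of one Codec-8-style AVL record (big-endian):
# timestamp (ms, 8 bytes), priority (1), longitude (int32 * 1e7),
# latitude (int32 * 1e7), altitude (2), angle (2), satellites (1),
# speed in km/h (2).
RECORD = struct.Struct(">QBiiHHBH")

def parse_avl_record(payload: bytes) -> dict:
    ts_ms, priority, lon, lat, alt, angle, sats, speed = RECORD.unpack(payload)
    return {
        "timestamp_ms": ts_ms,
        "priority": priority,
        "longitude": lon / 1e7,
        "latitude": lat / 1e7,
        "altitude_m": alt,
        "angle_deg": angle,
        "satellites": sats,
        "speed_kmh": speed,
    }

# Build a sample record as a device would encode it, then decode it back.
sample = RECORD.pack(1700000000000, 1, 264000000, 448000000, 120, 90, 7, 54)
decoded = parse_avl_record(sample)
```

A full packet adds framing around such records (preamble, data-field length, codec ID, record count and CRC), which the collection server would validate before inserting decoded records into the database.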
Procedia PDF Downloads 78
39206 Predictive Modeling of Bridge Conditions Using Random Forest
Authors: Miral Selim, May Haggag, Ibrahim Abotaleb
Abstract:
The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.
Keywords: data analysis, random forest, predictive modeling, bridge management
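A minimal version of the workflow described (train a Random Forest on bridge attributes, hold out a test set, and inspect feature importances) might look like the sketch below. The features and the deterioration rule are synthetic stand-ins for NBI variables, not the study's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
# Hypothetical stand-ins for NBI-style predictors
age = rng.uniform(0, 100, n)        # bridge age, years
adt = rng.uniform(100, 50000, n)    # average daily traffic
material = rng.integers(0, 3, n)    # encoded material class (irrelevant here)
# Synthetic rule: old, heavily used bridges tend to rate "poor" (label 1)
condition = ((age > 60) & (adt > 20000)).astype(int)

X = np.column_stack([age, adt, material])
X_tr, X_te, y_tr, y_te = train_test_split(
    X, condition, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
importances = dict(zip(["age", "adt", "material"], model.feature_importances_))
```

On real NBI data the same `feature_importances_` inspection is what surfaces which variables drive the predicted condition ratings.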
Procedia PDF Downloads 22
39205 Hepatitis B, Hepatitis C and HIV Infections and Associated Risk Factors among Substance Abusers in Mekelle Substance Users Treatment and Rehabilitation Centers, Tigrai, Northern Ethiopia
Authors: Tadele Araya, Tsehaye Asmelash, Girmatsion Fiseha
Abstract:
Background: Hepatitis B virus (HBV), hepatitis C virus (HCV) and human immunodeficiency virus (HIV) constitute serious healthcare problems worldwide. The blood-borne pathogens HBV, HCV and HIV are commonly associated with infections among substance users and injection drug users (IDUs). The objective of this study was to determine the prevalence of HBV, HCV, and HIV infections among substance users in Mekelle Substance Users Treatment and Rehabilitation Centers. Methods: A cross-sectional study design was used from December 2020 to September 2021. A total of 600 substance users were included. Data regarding the socio-demographic, clinical and sexual behaviors of the substance users were collected using a structured questionnaire. For laboratory analysis, 5-10 ml of venous blood was taken from each substance user. The laboratory analysis was performed by enzyme-linked immunosorbent assay (ELISA) at Mekelle University, Department of Medical Microbiology and Immunology Research Laboratory. The data were analyzed using SPSS and Epi-data. The association of variables with HBV, HCV and HIV infections was determined using multivariate analysis, and a P value < 0.05 was considered statistically significant. Result: The overall prevalence rates of HBV, HCV and HIV infections were 10%, 6.6%, and 7.5%, respectively. The mean age of the study participants was 28.12 ± 6.9 years. A higher prevalence of HBV infection was seen in participants who were injection drug users and in those who were infected with HIV. HCV was comparatively higher in those who had a previous history of unsafe surgical procedures than in their counterparts. Homeless participants were more exposed to HCV and HIV infections than their counterparts. The HBV/HIV co-infection prevalence was 3.5%.
Unprotected sexual practices (P = 0.03), injection drug use (P = 0.03), having an HBV-infected person in the family (P = 0.02) and HIV infection (P = 0.025) were statistically associated with HBV infection. HCV was significantly associated with substance use and a previous history of unsafe surgical procedures (p = 0.03 and p = 0.04, respectively). HIV was significantly associated with unprotected sexual practices and being homeless (p = 0.045 and p = 0.05, respectively). Conclusion: HBV was the most prevalent of the three viral infections, and the prevalence of HBV/HIV co-infection was high. The presence of HBV-infected persons in a family, unprotected sexual practices and sharing of needles for drug injection were the risk factors associated with HBV, HIV, and HCV. Continuous health education and screening for these viral infections, coupled with medical and psychological treatment, are mandatory for their prevention and control.
Keywords: hepatitis B virus, hepatitis C virus, HIV, substance users
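Associations like those reported (e.g., HBV status versus injection drug use) are typically tested with a 2x2 chi-square. A self-contained sketch with hypothetical counts, not the study's data; the one-degree-of-freedom p-value uses the identity P(χ²₁ > x) = erfc(√(x/2)):

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]] (1 df),
    returning (statistic, p_value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = erfc(sqrt(stat / 2))  # survival function of chi-square with 1 df
    return stat, p

# Hypothetical counts: HBV status vs injection drug use among 600 users
#            HBV+   HBV-
# IDU          25    125
# non-IDU      35    415
stat, p = chi2_2x2(25, 125, 35, 415)
significant = p < 0.05
```

The same table-based test extends to each exposure in the abstract (homelessness, unsafe surgical history, and so on) against each infection outcome.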
Procedia PDF Downloads 85
39204 Pterygium Recurrence Rate and Influencing Factors for Recurrence of Pterygium after Pterygium Surgery at an Eastern Thai University Hospital
Authors: Luksanaporn Krungkraipetch
Abstract:
Pterygium is a frequent ocular surface lesion that begins in the limbal conjunctiva within the palpebral fissure and spreads to the cornea. The lesion is more common in the nasal limbus than in the temporal, and it has a wing-like aspect. Indications for surgery, in decreasing order of significance, are growth over the corneal center, decreased vision due to corneal deformation, documented growth, sensations of discomfort, and esthetic concerns. The aim of this study is twofold: first, to determine the frequency of pterygium recurrence after surgery at the mentioned hospital, and second, to identify the factors that influence the recurrence of pterygium. The research design is a retrospective examination of 164 patient samples in an eastern Thai university hospital (Code 13766). Data analysis consists of descriptive statistics, i.e., basic details about pterygium surgery and the risk of recurrent pterygium, and, for the factor analysis, the inferential statistics chi-square and ANOVA. Twenty-four of the 164 patients who underwent surgery exhibited recurrent pterygium; the incidence of recurrent pterygium after surgery was therefore 14.6%. Among these recurrent cases, men and women were equally represented, and ages ranged from 41 to 60 years (62.8 percent). Overall, the majority of patients were female (60.4%), over the age of 60 (51.2%), did not live near the beach (83.5%), did not have an underlying disease (92.1%), and 95.7% did not have any other eye problems. Gender (X² = 1.26, p = .289), age (X² = 5.86, p = .119), an address near the sea (X² = 3.30, p = .081), underlying disease (X² = 0.54, p = .694), and eye disease (X² = 0.00, p = 1.00) had no effect on pterygium recurrence. The bare sclera technique accounted for 79.1% of all surgical procedures, with recurrences in 11.6% of all patients; conjunctival autografts accounted for 20.9% of all procedures, with recurrences in 3.0% of all participants.
The mitomycin-C and amniotic membrane transplant techniques had no recurrence following surgery. A comparison of the surgical techniques performed on patients with recurrent pterygium showed no significant difference (F = 1.13, p = .339). In conclusion, the prevalence of pterygium recurrence following pterygium surgery, 14.6%, does not differ from earlier research, and recurrence is unaffected by underlying disease, other eye conditions, or the surgical technique used.
Keywords: pterygium, recurrence pterygium, pterygium surgery, excision pterygium
Procedia PDF Downloads 73
39203 The Relationships between the Feelings of Bullying, Self- Esteem, Employee Silence, Anger, Self- Blame and Shame
Authors: Şebnem Aslan, Demet Akarçay
Abstract:
The objective of this study is to investigate the feelings aroused in health employees by bullying and the relationships between these feelings at the workplace, in particular the relationships between bullying and the feelings of self-esteem, employee silence, anger, self-blame and shame. The study was conducted among 512 health employees in three hospitals in Konya by using the survey method and simple random sampling. The scales of bullying, self-esteem, employee silence, anger, self-blame, and shame were administered within the study. The obtained data were analyzed with descriptive analysis, correlation, confirmatory factor analysis, structural equation modeling and path analysis. The results of the study showed that bullying had a positive effect on self-esteem (.61), employee silence (.41) and anger (.18), and a negative effect on self-blame and shame (-.26). Employee silence positively affected self-blame and shame (.83). Besides, self-esteem positively impacted self-blame and shame (.18) and employee silence (.62), while self-blame and shame negatively affected anger (-.20). Similarly, self-esteem was found to negatively affect anger (-.13).
Keywords: bullying, self-esteem, employee silence, anger, shame and guilt, healthcare employee
Procedia PDF Downloads 297
39202 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)
Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri
Abstract:
This paper presents an algorithm designed to improve the transfer of data over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP to exchange XML messages has any added advantages or not. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.
Keywords: JAX-WS, SMTP, SOAP, web service, XML
Procedia PDF Downloads 495
39201 Quantitative Analysis of the Functional Characteristics of Urban Complexes Based on Station-City Integration: Fifteen Case Studies of European, North American, and East Asian Railway Stations
Authors: Dai Yizheng, Chen-Yang Zhang
Abstract:
As station-city integration has been widely accepted as a strategy for mixed-use development, a quantitative analysis of the functional characteristics of urban complexes based on station-city integration is urgently needed. Taking 15 railway stations in European, North American, and East Asian cities as the research objects, this study analyzes their functional proportion, functional positioning, and functional correlation with respect to four categories of functional facilities for both railway passenger flow and subway passenger flow. We found that (1) the functional proportions of the urban complexes were concentrated in three models: complementary, dominant, and equilibrium. (2) A mathematical model based on the functional proportion was created to evaluate the functional positioning of an urban complex at three scales: station area, city, and region. (3) The strength of the correlation between functional area and passenger flow was revealed via data analysis using Pearson's correlation coefficient. Finally, the findings of this study provide a valuable reference for research on similar topics in other countries that are developing station-city integration.
Keywords: urban complex, station-city integration, mixed-use, function, quantitative analysis
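Step (3), correlating functional area with passenger flow via Pearson's coefficient, reduces to a one-line computation once the paired figures are tabulated. The numbers below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical figures: retail floor area (1000 m^2) and daily rail
# passenger flow (1000s of passengers) for eight station complexes.
area = np.array([12.0, 18.5, 25.0, 31.2, 40.0, 47.5, 55.0, 62.3])
flow = np.array([35.0, 48.0, 60.0, 71.0, 90.0, 99.0, 112.0, 120.0])

r = np.corrcoef(area, flow)[0, 1]   # Pearson's correlation coefficient

# A common rule of thumb: |r| >= 0.8 indicates a strong linear association.
strong = abs(r) >= 0.8
```

Computing `r` separately for each of the four facility categories, against both railway and subway flow, yields the correlation matrix the study reports on.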
Procedia PDF Downloads 115
39200 Does Pakistan Stock Exchange Offer Diversification Benefits to Regional and International Investors: A Time-Frequency (Wavelets) Analysis
Authors: Syed Jawad Hussain Shahzad, Muhammad Zakaria, Mobeen Ur Rehman, Saniya Khaild
Abstract:
This study examines the co-movement between the Pakistan, Indian, S&P 500 and Nikkei 225 stock markets using weekly data from 1998 to 2013. The time-frequency relationship between the selected stock markets is analyzed by using measures of the continuous wavelet power spectrum, the cross-wavelet transform and cross (squared) wavelet coherency. The empirical evidence suggests strong dependence between the Pakistan and Indian stock markets. The co-movement of the Pakistani index with the U.S. and Japanese developed markets varies over time and frequency, with the long-run relationship dominant. The results of the cross-wavelet and wavelet coherence analyses indicate moderate covariance and correlation between the stock indexes, and the markets are in phase (i.e., cyclical in nature) over varying durations. The Pakistan stock market lagged the Indian stock market during the entire period, corresponding to the 8~32 and then 64~256 week scales. Similar findings are evident for the S&P 500 and Nikkei 225 indexes; however, the relationship occurs during the later period of the study. All three wavelet indicators suggest strong evidence of higher co-movement during the 2008-09 global financial crisis. The empirical analysis reveals strong evidence that portfolio diversification benefits vary across frequencies and time. This analysis is unique and has several practical implications for regional and international investors when assigning the optimal weightage of different assets in portfolio formulation.
Keywords: co-movement, Pakistan stock exchange, S&P 500, Nikkei 225, wavelet analysis
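The headline time-frequency result, weak short-horizon but strong long-horizon co-movement, can be illustrated without the wavelet machinery by correlating returns aggregated over different horizons. This is a simpler scale-by-scale proxy, not the paper's wavelet coherency, and the series are synthetic (a shared slow factor plus independent noise), not the actual indexes:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 780                                    # ~15 years of weekly returns
common = rng.normal(size=n)                # shared factor (long-run driver)
# Independent noise built as first differences, so it cancels out when
# returns are summed over longer horizons.
noise_a = np.diff(np.concatenate([[0.0], rng.normal(scale=2.8, size=n)]))
noise_b = np.diff(np.concatenate([[0.0], rng.normal(scale=2.8, size=n)]))
ret_a = common + noise_a                   # e.g. one market's weekly returns
ret_b = common + noise_b                   # e.g. another market's returns

def corr_at_scale(a, b, k):
    # Correlation of non-overlapping k-week cumulative returns.
    m = len(a) // k
    sa = a[:m * k].reshape(m, k).sum(axis=1)
    sb = b[:m * k].reshape(m, k).sum(axis=1)
    return float(np.corrcoef(sa, sb)[0, 1])

short_run = corr_at_scale(ret_a, ret_b, 4)    # ~monthly horizon
long_run = corr_at_scale(ret_a, ret_b, 52)    # ~yearly horizon
```

A rising correlation with horizon is precisely the pattern that motivates frequency-dependent portfolio weights for short-term versus long-term investors.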
Procedia PDF Downloads 358
39199 Enhancing Healthcare Data Protection and Security
Authors: Joseph Udofia, Isaac Olufadewa
Abstract:
Every day, the size of electronic health records data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019) and $20 billion (2021). These attacks usually involve the exposure of personally identifiable information (PII). Health data is considered PII, and its exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially among patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data are used, it is expedient to review and revamp policies like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR) standard, all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data is shared across different systems from customers to health insurance and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to data exfiltration and data loss. Though HIPAA hones in on the security and accuracy of protected health information (PHI) and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards.
With advancements in technology and the emergence of new technologies, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and ever-increasing threats in cyberspace. Healthcare data in the wrong hands is a recipe for disaster, and we must enhance its protection and security to protect the mental health of current and future generations.
Keywords: cloud security, healthcare, cybersecurity, policy and standard
Procedia PDF Downloads 92
39198 Three-Dimensional Numerical Investigation for Reinforced Concrete Slabs with Opening
Authors: Abdelrahman Elsehsah, Hany Madkour, Khalid Farah
Abstract:
This article presents a 3-D modified non-linear elastic model in the strain space. The Helmholtz free energy function is introduced, with the existence of a dissipation potential surface in the space of thermodynamic conjugate forces. The constitutive equation and the damage evolution law are derived as well. The modified damage model has been used to describe the nonlinear behavior of reinforced concrete (RC) slabs with an opening. A parametric study was carried out to investigate the impact of different factors on the behavior of RC slabs: the opening area, the opening shape, the location of the opening, and the thickness of the slab. The numerical results have been compared with experimental data from the literature. Finally, the model showed its ability to be applied to the structural analysis of RC slabs.
Keywords: damage mechanics, 3-D numerical analysis, RC, slab with opening
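The ingredients named in the abstract (Helmholtz free energy, conjugate forces, dissipation potential) typically take the following form in a generic isotropic scalar-damage sketch; this is the standard textbook structure, not the authors' specific modified model:

```latex
% Helmholtz free energy with a scalar damage variable d \in [0, 1]
\psi(\boldsymbol{\varepsilon}, d) = (1 - d)\,\psi_0(\boldsymbol{\varepsilon}),
\qquad
\psi_0(\boldsymbol{\varepsilon}) = \tfrac{1}{2}\,\boldsymbol{\varepsilon} : \mathbb{C} : \boldsymbol{\varepsilon}

% Stress and the thermodynamic conjugate force (damage energy release rate)
\boldsymbol{\sigma} = \frac{\partial \psi}{\partial \boldsymbol{\varepsilon}}
                    = (1 - d)\,\mathbb{C} : \boldsymbol{\varepsilon},
\qquad
Y = -\frac{\partial \psi}{\partial d} = \psi_0(\boldsymbol{\varepsilon})

% Damage evolves normal to a dissipation potential F(Y) in conjugate-force space
\dot{d} = \dot{\lambda}\,\frac{\partial F}{\partial Y},
\qquad \dot{\lambda} \ge 0
```

The choice of $F$ and of the damage variable's coupling to $\mathbb{C}$ is where a modified model like the one described departs from this baseline.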
Procedia PDF Downloads 175
39197 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology
Authors: Peristera Baziana
Abstract:
In this paper, we present a network configuration for WDM LANs of passive star topology that assumes that the set of data WDM channels is split into two separate sets of channels, with different access rights over them. In particular, a synchronous-transmission WDMA access algorithm is adopted in order to increase the probability of successful transmission over the data channels and, consequently, to reduce the probability of data packet transmission cancellation by avoiding data channel collisions. To this end, a control pre-transmission access scheme is followed over a separate control channel. An analytical Markovian model is studied and the average throughput is mathematically derived. The performance is studied for several numbers of data channels and various values of the control phase duration.
Keywords: access algorithm, channels division, collisions avoidance, wavelength division multiplexing
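The collision-avoidance intuition, that the number of data channels available to contending stations directly sets the collision probability, can be sketched with a simple random-access approximation (uniform channel choice per slot). This is a toy calculation, not the paper's Markovian model:

```python
def p_no_collision(k: int, c: int) -> float:
    # Probability that a tagged transmitter's channel is chosen by none of
    # the other k-1 stations when each picks uniformly among c data channels.
    return (1 - 1 / c) ** (k - 1)

def expected_successes(k: int, c: int) -> float:
    # Mean number of collision-free transmissions per slot.
    return k * p_no_collision(k, c)

# Splitting a fixed pool of channels changes the trade-off: with the same
# offered load, fewer data channels means higher collision odds.
full = expected_successes(k=8, c=16)    # all 16 channels carry data
split = expected_successes(k=8, c=8)    # half the pool reserved elsewhere
```

The paper's contribution is precisely to avoid paying this collision penalty: the control-channel pre-transmission phase resolves contention before stations touch the data channels.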
Procedia PDF Downloads 296
39196 An Analytical Formulation of Pure Shear Boundary Condition for Assessing the Response of Some Typical Sites in Mumbai
Authors: Raj Banerjee, Aniruddha Sengupta
Abstract:
An earthquake event, associated with a typical fault rupture, initiates at the source, propagates through a rock or soil medium and finally daylights at a surface, which might be a populous city. The detrimental effects of an earthquake are often quantified in terms of the responses of superstructures resting on the soil. Hence, there is a need to estimate the amplification of bedrock motions due to the influence of local site conditions. In the present study, field borehole log data of the Mangalwadi and Walkeswar sites in Mumbai city are considered. The data consist of the variation of SPT N-value with soil depth. A correlation between shear wave velocity (Vₛ) and SPT N-value for various soil profiles of Mumbai city has been developed from existing correlations and is then used for site response analysis. A MATLAB program is developed for the ground response analysis by performing two-dimensional linear and equivalent linear analyses for some typical Mumbai soil sites using the pure shear (multi-point constraint) boundary condition. The model is validated in the linear elastic and equivalent linear domains using the popular commercial program DEEPSOIL. Three actual earthquake motions are selected based on their frequency contents and durations and scaled to a PGA of 0.16g for the present ground response analyses. The results are presented in terms of peak acceleration time history with depth, peak shear strain time history with depth, Fourier amplitude versus frequency, response spectrum at the surface, etc. The peak ground acceleration amplification factors are found to be about 2.374, 3.239 and 2.4245 for the Mangalwadi site and 3.42, 3.39 and 3.83 for the Walkeswar site using the 1979 Imperial Valley, 1989 Loma Gilroy and 1987 Whittier Narrows earthquakes, respectively.
In the absence of any site-specific response spectrum for the chosen sites in Mumbai, the spectrum generated at the surface may be utilized for the design of any superstructure at these locations.
Keywords: DEEPSOIL, ground response analysis, multi point constraint, response spectrum
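The Vₛ-from-SPT step can be sketched with a power-law correlation Vₛ = a·N^b. The coefficients below (a = 97, b = 0.314) follow a commonly quoted all-soils form; a study like this one fits site-specific values instead. The borehole log is hypothetical:

```python
# Build a shear-wave velocity profile from SPT blow counts via Vs = a * N^b
# (m/s). Coefficients here are a commonly quoted all-soils correlation, used
# only for illustration.
def vs_from_spt(n_value: float, a: float = 97.0, b: float = 0.314) -> float:
    return a * n_value ** b

# Hypothetical borehole log: (depth in m, SPT N-value), 1.5 m intervals
borehole = [(1.5, 8), (3.0, 12), (4.5, 21), (6.0, 35), (7.5, 50)]
profile = [(d, round(vs_from_spt(n), 1)) for d, n in borehole]

# Time-averaged Vs over the profile (thickness-weighted harmonic mean),
# the quantity used for site classification.
thickness = 1.5
total_depth = thickness * len(borehole)
travel_time = sum(thickness / vs_from_spt(n) for _, n in borehole)
vs_avg = total_depth / travel_time
```

The resulting layered Vₛ profile is what feeds the linear and equivalent linear site response analysis described above.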
Procedia PDF Downloads 180
39195 'CardioCare': A Cutting-Edge Fusion of IoT and Machine Learning to Bridge the Gap in Cardiovascular Risk Management
Authors: Arpit Patil, Atharav Bhagwat, Rajas Bhope, Pramod Bide
Abstract:
This research integrates IoT and ML to predict heart failure risks, utilizing the Framingham dataset. IoT devices gather real-time physiological data, focusing on heart rate dynamics, while ML, specifically Random Forest, predicts heart failure. Rigorous feature selection enhances accuracy, achieving over 90% prediction rate. This amalgamation marks a transformative step in proactive healthcare, highlighting early detection's critical role in cardiovascular risk mitigation. Challenges persist, necessitating continual refinement for improved predictive capabilities.
Keywords: cardiovascular diseases, internet of things, machine learning, cardiac risk assessment, heart failure prediction, early detection, cardio data analysis
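On the IoT side, a device-level first pass over the heart-rate stream can be as simple as flagging samples that deviate sharply from a rolling baseline before anything reaches the ML model. A stdlib-only sketch; the window length and threshold are illustrative choices, not CardioCare's parameters:

```python
from collections import deque

def risk_flags(hr_stream, window=5, limit=3.0):
    """Flag heart-rate samples more than `limit` standard deviations from
    the rolling mean of the previous `window` samples."""
    buf = deque(maxlen=window)
    flags = []
    for hr in hr_stream:
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((x - mean) ** 2 for x in buf) / window
            std = var ** 0.5
            flags.append(std > 0 and abs(hr - mean) > limit * std)
        else:
            flags.append(False)  # not enough history yet
        buf.append(hr)
    return flags

# A steady resting stream with one anomalous spike (bpm)
stream = [72, 74, 71, 73, 72, 75, 73, 140, 74, 72]
flags = risk_flags(stream)
```

Only the flagged windows would need to be forwarded for the heavier Random Forest risk assessment, which keeps device bandwidth and server load low.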
Procedia PDF Downloads 12
39194 Role of Energy Storage in Renewable Electricity Systems in The Gird of Ethiopia
Authors: Dawit Abay Tesfamariam
Abstract:
Ethiopia’s Climate-Resilient Green Economy (ECRGE) strategy focuses mainly on the generation and proper utilization of renewable energy (RE). Nonetheless, the current electricity generation of the country is dominated by hydropower. Data collected in 2016 by Ethiopian Electric Power (EEP) indicate that the intermittent RE sources, solar and wind, contributed only 8%. On the other hand, the EEP generation plan for 2030 indicates that 36.1% of the energy generation share will be covered by solar and wind sources. Thus, a case study was initiated to model and compute the balance and consumption of electricity in three different scenarios: 2016, 2025, and 2030, using the EnergyPLAN model (EPM). Initially, the model was validated using the 2016 annual power generation data before conducting the EnergyPLAN (EP) analysis for the two predictive scenarios. The EP simulation analysis for 2016 showed that there was no significant excess power generated. The EPM was then applied to analyze the role of energy storage in RE in the Ethiopian grid system. The results of the EP simulation analysis showed there will be an excess production of 402 MW on average and 7,963 MW at maximum in 2025, concentrated in the three rainy months of the year (June, July, and August). The outcome of the model also showed that in the dry seasons of the year, there would be excess power production in the country. Consequently, based on the validated EP outcomes, there is good reason to consider alternatives for the utilization and storage of the excess RE. From the scenarios and model results obtained, it is realistic to infer that if the excess power is coupled with a storage system, it can stabilize the grid and be exported to support the economy.
Therefore, researchers must continue to upgrade current and upcoming storage systems to synchronize with the potential that can be generated from renewable energy.
Keywords: renewable energy, power, storage, wind, energy plan
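A back-of-envelope check on the storable surplus implied by the 2025 figures, an average excess of 402 MW sustained over roughly three rainy months; the 75% round-trip storage efficiency is our illustrative assumption, not a figure from the study:

```python
avg_excess_mw = 402                 # average excess power, 2025 scenario
rainy_hours = 3 * 30 * 24           # ~three rainy months, in hours
round_trip_efficiency = 0.75        # assumed storage round-trip efficiency

surplus_gwh = avg_excess_mw * rainy_hours / 1000          # energy available
recoverable_gwh = surplus_gwh * round_trip_efficiency     # returned to grid
```

At that scale (hundreds of GWh per season), the surplus is far beyond battery storage alone, which is why options such as pumped hydro or export interconnectors enter the discussion.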
Procedia PDF Downloads 77
39193 Transferable Knowledge: Expressing Lessons Learnt from Failure to Outsiders
Authors: Stijn Horck
Abstract:
Background: The value of lessons learned from failure increases when these insights can be put to use by those who did not experience the failure. While learning from others has mostly been researched between individuals or teams within the same environment, transferring knowledge from the person who experienced the failure to an outsider comes with extra challenges. As sense-making of failure is an individual process leading to different learning experiences, the potential of lessons learned from failure is highly variable depending on who is transferring the lessons learned. Using an integrated framework of linguistic aspects related to attributional egotism, this study aims to offer a complete explanation of the challenges in transferring lessons learned from failures that are experienced by others. Method: A case study of a failed foundation established to address the information needs for GPs in times of COVID-19 has been used. An overview of failure causes and lessons learned were made through a preliminary analysis of data collected in two phases with metaphoric examples of failure types. This was followed up by individual narrative interviews with the board members who have all experienced the same events to analyse the individual variance of lessons learned through discourse analysis. This research design uses the researcher-as-instrument approach since the recipient of these lessons learned is the author himself. Results: Thirteen causes were given why the foundation has failed, and nine lessons were formulated. Based on the individually emphasized events, the explanation of the failure events mentioned by all or three respondents consisted of more linguistic aspects related to attributional egotism than failure events mentioned by only one or two. Moreover, the learning events mentioned by all or three respondents involved lessons learned that are based on changed insight, while the lessons expressed by only one or two are more based on direct value. 
Retrospectively, the lessons expressed as a group in the first data collection phase appear to have captured some, but not all, of the direct-value lessons. Conclusion: Individual variance in expressing lessons learned to outsiders can be reduced by using metaphoric or analogical explanations from a third party. In line with attributional egotism theory, individuals separated from a group that has experienced the same failure are more likely to cite the failure causes least likely to be contradicted. Lastly, this study contributes to the academic literature by demonstrating that linguistic analysis is suitable for investigating knowledge transfer from lessons learned after failure.
Keywords: failure, discourse analysis, knowledge transfer, attributional egotism
Procedia PDF Downloads 115
39192 Presenting a Model of Empowering New Knowledge-Based Companies in the Iran Insurance Industry
Authors: Pedram Saadati, Zahra Nazari
Abstract:
In the last decade, the role and importance of knowledge-based technological businesses in the insurance industry have greatly increased, and given the weakness of previous studies in Iran, the current research addresses the design of an InsurTech empowerment model. A hybrid framework was used to obtain the conceptual model of the research. The statistical population in the qualitative part comprised experts, and in the quantitative part, InsurTech practitioners. The data collection tools were in-depth and semi-structured interviews and a structural self-interaction matrix in the qualitative part, and a researcher-made questionnaire in the quantitative part. In the qualitative part, 55 indicators, 20 components, and 8 concepts (dimensions) were obtained by the content analysis method; the relationships among the concepts and the levels of the components were then investigated. In the quantitative part, the information was analyzed using descriptive-analytical methods, namely path analysis and confirmatory factor analysis. The proposed model consists of eight dimensions: supporter capability, supervision of the insurance innovation ecosystem, and managerial, financial, technological, marketing, opportunity-identification, and innovative InsurTech capabilities. The results of the statistical tests identifying the relationships among the concepts are examined in detail, and suggestions are presented in the conclusion.
Keywords: InsurTech, knowledge-based, empowerment model, factor analysis, insurance
Procedia PDF Downloads 46
39191 The Use of the Random Set Method in Reliability Analysis of Deep Excavations
Authors: Arefeh Arabaninezhad, Ali Fakher
Abstract:
Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods are suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables that depend on ground behavior are required. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, with a relatively small number of simulations compared to fully probabilistic methods, smooth bounds on system responses are obtained; the random set approach has therefore been proposed for reliability analysis of geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects that were monitored during excavation. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables, and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The probability share of each finite element calculation is determined from the probabilities assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered the main system response. The result of the reliability analysis for each deep excavation is presented by constructing the belief and plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations.
To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared to the in situ measurements, and good agreement was observed. The comparison also showed that the random set finite element method is suitable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty
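The combination step described above can be sketched in a few lines. The following is a minimal illustrative example, not the paper's finite element model: each input variable gets two focal intervals with basic probability assignments, a hypothetical closed-form `displacement` function stands in for one deterministic FE run, and belief/plausibility bounds on the probability of exceeding a displacement threshold are accumulated over all interval combinations.

```python
from itertools import product

# Focal elements: (lower, upper) bounds with basic probability assignments.
# Variable names, intervals, and masses are illustrative assumptions.
friction_angle = [((28.0, 32.0), 0.6), ((30.0, 36.0), 0.4)]   # degrees
soil_modulus   = [((40.0, 60.0), 0.5), ((55.0, 80.0), 0.5)]   # MPa

def displacement(phi, E):
    """Hypothetical response: wall-top displacement (mm), decreasing
    in friction angle and soil modulus (stand-in for one FE run)."""
    return 2000.0 / (phi * E ** 0.5)

THRESHOLD = 9.0  # mm, assumed serviceability limit

belief = 0.0        # lower bound on P(d > THRESHOLD): interval entirely above
plausibility = 0.0  # upper bound on P(d > THRESHOLD): interval partly above

for (phi_iv, m_phi), (E_iv, m_E) in product(friction_angle, soil_modulus):
    mass = m_phi * m_E  # joint mass, assuming independent sources
    # Evaluate the model at all corner combinations of the joint interval;
    # for a monotonic response the extremes occur at the corners.
    corners = [displacement(phi, E) for phi in phi_iv for E in E_iv]
    lo, hi = min(corners), max(corners)
    if lo > THRESHOLD:
        belief += mass
    if hi > THRESHOLD:
        plausibility += mass

print(f"Belief (lower bound on P[d > {THRESHOLD} mm]): {belief:.2f}")
print(f"Plausibility (upper bound):                    {plausibility:.2f}")
```

With four input combinations the full system response is bracketed by only four corner sweeps, which is the efficiency argument made in the abstract: the belief/plausibility pair bounds the failure probability without a full Monte Carlo campaign.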
Procedia PDF Downloads 268
39190 Analyzing Tools and Techniques for Classification in Educational Data Mining: A Survey
Authors: D. I. George Amalarethinam, A. Emima
Abstract:
Educational Data Mining (EDM) is one of the newest fields to emerge in recent years; it is concerned with developing methods for analyzing the various types of data gathered in educational settings. EDM methods and techniques, together with machine learning algorithms, are used to extract meaningful and usable information from huge databases. For scientists and researchers, realistic applications of machine learning in EDM open new frontiers and present new problems. One of the most important research areas in EDM is predicting student success. Prediction algorithms and techniques must be developed to forecast students' performance, which helps tutors and institutions raise the level of student performance. This paper examines various classification techniques used in prediction methods and the data mining tools used in EDM.
Keywords: classification technique, data mining, EDM methods, prediction methods
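One of the classification techniques such surveys cover, k-nearest neighbours, can be sketched on hypothetical student data (the feature choice and records below are illustrative assumptions, not from the paper):

```python
from collections import Counter
from math import dist

# (attendance %, assignment average %) -> outcome, illustrative data.
train = [
    ((95, 88), "pass"), ((90, 75), "pass"), ((85, 80), "pass"),
    ((60, 55), "fail"), ((50, 40), "fail"), ((70, 45), "fail"),
]

def predict(x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    neighbours = sorted(train, key=lambda item: dist(item[0], x))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

print(predict((92, 70)))  # near the passing cluster -> "pass"
print(predict((55, 50)))  # near the failing cluster -> "fail"
```

In practice the surveyed tools (e.g. scikit-learn or Weka) supply tuned versions of this and the other classifiers, but the voting logic is the same.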
Procedia PDF Downloads 117
39189 Corporate Culture Management towards the Retention of Employees: A Case Study of a Company in Thailand
Authors: Duangsamorn Rungsawanpho
Abstract:
The objective of this paper is to explore corporate culture management as a determinant of employee retention in a company in Thailand. The study uses a mixed-methods methodology: data were collected through questionnaires and in-depth interviews, and the statistics used for data analysis were percentage, mean, standard deviation, and inferential statistics. The results show that no single corporate management culture is perfect for every organization; the appropriate culture depends on the business and the industry, because the situations and circumstances that corporate executives face differ. The findings indicate that the employees of the company assess achievement through the value orientation of the corporate culture, and that international relations are perceived as the greatest value for their organization. In addition, we found that employee participation is perceived positively: many employees feel that they are part of management because their opinions and ideas about their work are taken into account.
Keywords: corporate culture, employee retention, retention of employees, management approaches
Procedia PDF Downloads 303
39188 Automatic Threshold Search for Heat Map Based Feature Selection: A Cancer Dataset Analysis
Authors: Carlos Huertas, Reyes Juarez-Ramirez
Abstract:
Public health is one of the most critical issues today; therefore, there is great interest in improving technologies for disease detection. With machine learning and feature selection, it has been possible to aid the diagnosis of several diseases, such as cancer. In this work, we present an extension to the Heat Map Based Feature Selection algorithm; this modification allows automatic selection of the threshold parameter, which helps to improve generalization performance on high-dimensional data such as mass spectrometry. We performed a comparative analysis using multiple cancer datasets, measuring our extension against the well-known Recursive Feature Elimination algorithm and our original proposal. The results show improved classification performance that is highly competitive with current techniques.
Keywords: biomarker discovery, cancer, feature selection, mass spectrometry
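The abstract does not spell out its threshold-search procedure, but the general idea of replacing a hand-picked cut-off for a score-based filter with an automatic one can be sketched. The rule below (cut at the widest gap between consecutive sorted scores) and the feature names/scores are assumptions for illustration, not the paper's algorithm:

```python
# Hypothetical relevance scores per feature (e.g. m/z bins from a
# mass spectrometry dataset); names and values are illustrative.
scores = {
    "mz_1021": 0.91, "mz_4587": 0.88, "mz_0077": 0.85,
    "mz_3310": 0.41, "mz_9004": 0.38, "mz_5550": 0.35,
}

def auto_threshold(scores):
    """Place the threshold in the middle of the widest gap between
    consecutive sorted scores, removing the manual parameter."""
    vals = sorted(scores.values(), reverse=True)
    gaps = [(vals[i] - vals[i + 1], i) for i in range(len(vals) - 1)]
    widest, i = max(gaps)
    return (vals[i] + vals[i + 1]) / 2

t = auto_threshold(scores)
selected = sorted(f for f, s in scores.items() if s > t)
print(f"threshold = {t:.2f}, selected = {selected}")
```

A gap-based rule adapts to each dataset's score distribution, which is the property that matters for generalization on high-dimensional inputs where a fixed threshold would keep too many or too few features.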
Procedia PDF Downloads 338
39187 Understanding Natural Resources Governance in Canada: The Role of Institutions, Interests, and Ideas in Alberta's Oil Sands Policy
Authors: Justine Salam
Abstract:
As a federal state, Canada's constitutional arrangements regarding the management of natural resources are unique, because they give complete ownership and control of natural resources to the provinces (the subnational level). However, the province of Alberta, home to the third largest oil reserves in the world, lags behind comparable jurisdictions in levying royalties on oil corporations, especially oil sands royalties. While Albertans own the oil sands, scholars have argued that natural resource exploitation in Alberta benefits corporations and industry more than it does Albertans. This study provides a systematic understanding of the causal factors affecting royalties in Alberta, mapping the dynamics of power and how they manifest themselves during policy-making. Mounting domestic and global public pressure led Alberta to review its oil sands royalties twice in less than a decade through publicly commissioned Royalty Review Panels, first in 2007 and again in 2015. The Panels' task was to research best practices and provide policy recommendations to the government through public consultations with Albertans, industry, non-governmental organizations, and First Nations peoples. Both times, the Panels recommended a relative increase in oil sands royalties. However, irrespective of the Reviews' recommendations, neither the right-wing 2007 Progressive Conservative Party (PC) government nor the left-wing 2015 New Democratic Party (NDP) government, both of which were committed to increasing oil sands royalties, increased royalty intake. Why did two consecutive governing parties at opposite ends of the political spectrum fail to act on the recommendations put forward by the Panels? Through a qualitative case-study analysis, this study assesses the domestic and global causal factors behind Alberta's inability to raise oil sands royalties significantly after the two Reviews, using an institutions, interests, and ideas framework. Indeed, causal factors can be global (e.g.
market and price fluctuation) or domestic (e.g. oil companies' influence on the Alberta government). The institutions, interests, and ideas framework lies at the intersection of the public policy, comparative studies, and political economy literatures, and therefore draws multi-faceted insights into the analysis. To account for institutions, the study reviews international trade agreement documents such as the North American Free Trade Agreement (NAFTA), because they have embedded Alberta's oil sands into American energy security policy and tied Canadian and Albertan oil policy into legal international nodes. To account for interests, such as how the oil lobby or the environmental lobby can penetrate governmental decision-making spheres, the study draws on the Oil Sands Oral History project, a database of interviews with government officials and oil industry leaders at a pivotal time in Alberta's oil industry, 2011-2013. Finally, to account for ideas, such as how narratives of Canada as a global 'energy superpower' and the importance of 'energy security' have dominated and polarized public discourse, the study relies on content analysis of Alberta-based pro-industry newspapers to trace the prevalence of these narratives. By systematically mapping the nodes and dynamics of power at play in Alberta, the study sheds light on the factors that influence royalty policy-making in one of the largest industries in Canada.
Keywords: Alberta Canada, natural resources governance, oil sands, political economy
Procedia PDF Downloads 132
39186 Comparative Study of Equivalent Linear and Non-Linear Ground Response Analysis for Rapar District of Kutch, India
Authors: Kulin Dave, Kapil Mohan
Abstract:
Earthquakes are among the most destructive rapid-onset disasters to which human beings are exposed. The losses they cause are reason enough for careful consideration in the design of structures and facilities. Seismic hazard analysis is one such tool that can be used for earthquake-resistant design, and ground response analysis is one of its most crucial and decisive steps. The Rapar district of Kutch, Gujarat falls in Zone 5 of the earthquake zoning map of India and thus has high seismicity, which is why it was selected for analysis. In total, 8 bore-log datasets were studied at different locations in and around Rapar district. Soil engineering properties were analyzed, and relevant empirical correlations were used to calculate the maximum shear modulus (Gmax) and shear wave velocity (Vs) for the soil layers. The soil was modeled using the pressure-dependent Modified Kondner-Zelasko (MKZ) model, and the reference curves used for fitting were Seed and Idriss (1970) for sand and Darendeli (2001) for clay. Both equivalent linear (EL) and non-linear (NL) ground response analyses were carried out, with the Masing hysteretic re/unloading formulation, for comparison. The commercially available DEEPSOIL v. 7.0 software was used for this analysis. This study attempts to quantify ground response in terms of the generated acceleration time history at the top of the soil column, the response spectrum at 5% damping, and the Fourier amplitude spectrum. Moreover, the variation with depth of peak ground acceleration (PGA), maximum displacement, maximum strain (in %), maximum stress ratio, and mobilized shear stress is also calculated. From the study, PGA values estimated in rocky strata are nearly the same as the bedrock motion, and marginal amplification is observed in sandy silts and silty clays by both analyses. The NL analysis gives more conservative results for maximum displacement than the EL analysis.
The maximum strains predicted by the two analyses are very close to each other. Overall, the NL analysis is more efficient and realistic because it follows the actual hyperbolic stress-strain relationship, considers stiffness degradation, and mobilizes the stresses generated due to pore water pressure.
Keywords: DEEPSOIL v 7.0, ground response analysis, pressure-dependent Modified Kondner-Zelasko model, MKZ model, response spectra, shear wave velocity
Procedia PDF Downloads 136
39185 A Concept of Data Mining with XML Documents
Authors: Akshay Agrawal, Anand K. Srivastava
Abstract:
The increasing number of XML datasets available to casual users increases the need to investigate techniques for extracting knowledge from these data. Data mining is widely applied in the database research area to extract frequent correlations of values from both structured and semi-structured datasets. The increasing availability of heterogeneous XML sources has raised a number of issues concerning how to represent and manage such semi-structured data. In recent years, given the importance of managing these resources and extracting knowledge from them, many methods have been proposed to represent and cluster them in different ways.
Keywords: XML, similarity measure, clustering, cluster quality, semantic clustering
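One common way to pair a similarity measure with clustering for XML, as the keywords suggest, is to represent each document by its set of root-to-element tag paths and compare sets with Jaccard similarity. The documents and the 0.6 threshold below are illustrative assumptions, not drawn from the paper:

```python
import xml.etree.ElementTree as ET

docs = {  # illustrative documents: two structurally identical, one different
    "a.xml": "<order><id>1</id><item><sku>X</sku><qty>2</qty></item></order>",
    "b.xml": "<order><id>7</id><item><sku>Y</sku><qty>1</qty></item></order>",
    "c.xml": "<paper><title>t</title><authors><name>n</name></authors></paper>",
}

def tag_paths(xml_text):
    """Set of '/'-joined tag paths from the root to every element."""
    def walk(node, prefix):
        path = prefix + "/" + node.tag
        yield path
        for child in node:
            yield from walk(child, path)
    return set(walk(ET.fromstring(xml_text), ""))

def jaccard(s, t):
    return len(s & t) / len(s | t)

paths = {name: tag_paths(text) for name, text in docs.items()}

# Single-pass threshold clustering: join the first cluster whose
# representative is similar enough, otherwise start a new cluster.
clusters = []
for name in docs:
    for cluster in clusters:
        if jaccard(paths[name], paths[cluster[0]]) >= 0.6:
            cluster.append(name)
            break
    else:
        clusters.append([name])

print(clusters)  # the two <order> documents group together
```

Path-set similarity captures structure while ignoring text content; semantic clustering methods extend this by also comparing tag meanings.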
Procedia PDF Downloads 384
39184 Speed-Up Data Transmission by Using Bluetooth Module on Gas Sensor Node of Arduino Board
Authors: Hiesik Kim, YongBeum Kim
Abstract:
Internet of Things (IoT) applications are widely deployed and spreading worldwide, so techniques to speed up local wireless data transmission must be developed. Bluetooth is a wireless data communication technique created by the Bluetooth Special Interest Group (SIG); it uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built using an Arduino board as open-source hardware, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission speed is demonstrated. The experiment on controlling transmission speed also proceeded by developing an Android application that receives the measured data, and the experimental results show that this speed can be controlled. In the future, the communication algorithm will need improvement, because a few errors occur when data are transmitted or received.
Keywords: Arduino, Bluetooth, gas sensor, internet of things, transmission speed
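The paper's Arduino and Android code is not reproduced here, but one plausible shape for the transmission-speed control loop it describes can be simulated: shorten the send interval while transfers succeed, and back off when an error (such as a dropped Bluetooth packet) occurs. The interval bounds, step sizes, and error sequence below are all illustrative assumptions:

```python
MIN_INTERVAL_MS, MAX_INTERVAL_MS = 20, 500

def next_interval(current_ms, had_error):
    """Additive speed-up on success, multiplicative back-off on error,
    clamped to the node's supported interval range."""
    if had_error:
        return min(current_ms * 2, MAX_INTERVAL_MS)
    return max(current_ms - 10, MIN_INTERVAL_MS)

# Simulated outcome of ten consecutive transmissions (True = error).
outcomes = [False, False, False, True, False,
            False, False, False, True, False]

interval = 100  # starting send interval in milliseconds
trace = []
for err in outcomes:
    interval = next_interval(interval, err)
    trace.append(interval)

print(trace)
```

On a real node the same function would set the delay between sensor reads before each Bluetooth write, trading throughput against the error rate the abstract mentions.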
Procedia PDF Downloads 483
39183 A Study of Microglitches in Hartebeesthoek Radio Pulsars
Authors: Onuchukwu Chika Christian, Chukwude Augustine Ejike
Abstract:
We carried out a statistical analysis of microglitch events in a sample of radio pulsars. The distribution of microglitch events in frequency (ν) and first frequency derivative (ν˙) indicates that the size of a microglitch and the sign combinations of events in ν and ν˙ are purely random. Assuming that the probability of a microglitch event of a given size scales inversely with the absolute size of the event in both ν and ν˙, we constructed a cumulative distribution function (CDF) for the absolute sizes of microglitches. In most of the pulsars, the theoretical CDF matched the observed values. This suggests that microglitches in pulsars may be interpreted as an avalanche process in which angular momentum is transferred erratically from the flywheel-like superfluid interior to the slowly decelerating solid crust. Analysis of the waiting times indicates that they are purely Poisson distributed, with mean microglitch rate <γ> ∼ 0.98 year^−1 for all the pulsars in our sample and <γ> / <∆T> ∼ 1. Correlation analysis showed that the relative absolute size of a microglitch event correlates strongly with the rotation period of the pulsar, with correlation coefficients r ∼ 0.7 and r ∼ 0.5 for events in ν and ν˙, respectively. The mean glitch rate and the number of microglitches (Ng) showed some dependence on the spin-down rate (r ∼ −0.6) and on the characteristic age of the pulsar (τ) (r ∼ −0.4/−0.5).
Keywords: methods: data analysis, stars: neutron, pulsars: general
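The two statistical ingredients above follow directly from the stated assumptions and can be sketched on synthetic data: if the probability of a microglitch of absolute size x scales as 1/x on [x_min, x_max], the CDF is ln(x/x_min)/ln(x_max/x_min); and waiting times of a Poisson process with rate γ are exponential with mean 1/γ. The size range below is an arbitrary illustrative choice:

```python
import math
import random

random.seed(42)
X_MIN, X_MAX = 1e-3, 1.0  # assumed size range for |delta-nu|

def cdf(x):
    """CDF implied by p(x) proportional to 1/x on [X_MIN, X_MAX]."""
    return math.log(x / X_MIN) / math.log(X_MAX / X_MIN)

def sample_size(u):
    """Inverse-transform sampling: solve cdf(x) = u for x."""
    return X_MIN * (X_MAX / X_MIN) ** u

# Waiting times for a Poisson process at the quoted mean rate.
GAMMA = 0.98  # events per year
waits = [random.expovariate(GAMMA) for _ in range(20000)]
mean_wait = sum(waits) / len(waits)

print(f"CDF at x = 0.1: {cdf(0.1):.3f}")
print(f"mean waiting time: {mean_wait:.2f} yr (expected {1 / GAMMA:.2f})")
```

Comparing this logarithmic CDF against the empirical CDF of observed event sizes is the matching test the abstract reports, and the exponential waiting-time fit is the Poisson check.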
Procedia PDF Downloads 460
39182 Detection of Selected Heavy Metals in Raw Milk: Lahore, Pakistan
Authors: Huma Naeem, Saif-Ur-Rehman Kashif, Muhammad Nawaz Chaudhry
Abstract:
Milk plays a significant role in meeting the dietary requirements of human beings, as it is a single source that provides various essential nutrients. A study was conducted to evaluate the heavy metal concentrations in raw milk marketed in Data Gunj Baksh Town, Lahore. A total of 180 samples of raw milk were collected in the pre-monsoon, monsoon, and post-monsoon seasons from five colonies of Data Gunj Baksh Town. The milk samples were subjected to heavy metal analysis (Cr, Cu) by atomic absorption spectrophotometry. Results indicated high levels of Cr and Cu in the post-monsoon season. Heavy metals were detected in all the milk samples under study and exceeded the standards set by the FAO.
Keywords: atomic absorption spectrophotometer, chromium, copper, heavy metal
Procedia PDF Downloads 434
39181 Evaluating the Total Costs of a Ransomware-Resilient Architecture for Healthcare Systems
Authors: Sreejith Gopinath, Aspen Olmsted
Abstract:
This paper is based on our previous work, which proposed a risk-transference-based architecture for healthcare systems that stores sensitive data outside the system boundary, rendering the system unattractive to would-be bad actors. This architecture also allows a compromised system to be abandoned and a new system instance spun up in its place to ensure business continuity without paying a ransom or engaging with a bad actor. This paper delves into the details of various attacks we simulated against the prototype system. We discuss at length the time and computational costs associated with storing and retrieving data in the prototype system, abandoning a compromised system, and setting up a new instance with existing data. Lastly, we simulate some analytical workloads over the data stored in our specialized data storage system and discuss the time and computational costs of running analytics over data in a specialized storage system outside the system boundary. In summary, this paper discusses the total costs of data storage, access, and analytics incurred with the proposed architecture.
Keywords: cybersecurity, healthcare, ransomware, resilience, risk transference
Procedia PDF Downloads 133