Search results for: diagrams and statistical tables
3608 Impact of Zinc on Heavy Metals Content, Polyphenols and Antioxidant Capacity of Faba Bean in Milk Ripeness
Authors: M. Timoracká, A. Vollmannová, D. S. Ismael, J. Musilová
Abstract:
We investigated, under model conditions, the effect of soil deliberately contaminated with Zn. The soil used in the pot trial was initially uncontaminated. Faba beans (cvs. Saturn, Zobor) were harvested at milk ripeness. With increasing doses applied to the soil, a strong statistical relationship between soil Zn content and the Zn amount in the seeds of both faba bean cultivars was confirmed. Despite the high Zn doses applied to the soil under model conditions, the Zn amount determined in faba bean cv. Saturn remained, in all variants, just below the maximum content allowed in foodstuffs by legislation. In cv. Zobor, the determined Zn content exceeded the maximum allowed amount (by 2% and 12%, respectively). In all variants, faba bean cvs. Saturn and Zobor accumulated Pb and Cd in amounts exceeding the hygienic limits. The contents of all other heavy metals were below the hygienic limits. With increasing Zn doses applied to the soil, the total polyphenol content as well as the total antioxidant capacity determined in the seeds of both cultivars, Saturn and Zobor, increased. A strong statistical relationship between soil Zn content and both the total polyphenol content and the total antioxidant capacity in the seeds of the faba bean cultivars was confirmed.
Keywords: antioxidant capacity, faba bean, polyphenols, zinc
Procedia PDF Downloads 395
3607 Perception of Faculties Towards Online Teaching-Learning Activities during COVID-19 Pandemic: A Cross-Sectional Study at a Tertiary Care Center in Eastern Nepal
Authors: Deependra Prasad Sarraf, Gajendra Prasad Rauniar, Robin Maskey, Rajiv Maharjan, Ashish Shrestha, Ramayan Prasad Kushwaha
Abstract:
Objectives: To assess the perception of faculties towards online teaching-learning activities conducted during the COVID-19 pandemic and to identify barriers and facilitators to conducting online teaching-learning activities in our context. Methods: A cross-sectional study was conducted among faculties at B. P. Koirala Institute of Health Sciences using a 26-item semi-structured questionnaire. A Google Form was prepared, and its link was sent to the faculties via email. Descriptive statistics were calculated, and findings were presented as tables and graphs. Results: Out of 158 faculties, the majority were male (66.46%), medical faculties (85.44%), and assistant professors (46.84%). Only 16 (10.13%) faculties had received formal training in preparing and/or delivering online teaching-learning activities. Out of 158, 133 (84.18%) faculties faced technical and internet issues. The most commonly perceived advantage and disadvantage of online teaching-learning activities were 'not limited to time or place' (94.30%) and 'lack of interaction with the students' (82.28%), respectively. The majority (94.30%) had a positive perception of the online teaching-learning activities conducted during the COVID-19 pandemic. Slow internet connection (91.77%) and frequent electricity interruption (82.91%) were the most commonly perceived barriers to online teaching-learning. Conclusions: Most of the faculties had a positive perception of online teaching-learning activities. Academic leaders and stakeholders should provide uninterrupted internet and electricity connectivity, training on online teaching-learning platforms, and timely technical support.
Keywords: COVID-19 pandemic, faculties, medical education, perception
Procedia PDF Downloads 173
3606 Morphological Anatomical Study of the Axis Vertebra and Its Clinical Orientation
Authors: Mangala M. Pai, B. V. Murlimanju, Latha V. Prabhu, P. J. Jiji, Vandana Blossom
Abstract:
Background: To study the morphological parameters of the axis vertebra in anatomical specimens. Methods: The present study was designed to obtain morphometric data of the axis vertebra. It included 20 dried axis bones obtained from the anatomy laboratory. The superior and inferior articular facets of the axis were macroscopically observed for their shapes, and the different parameters were measured using a digital Vernier caliper. Results: The morphometric data obtained in the present study are represented in the tables. A side-wise comparison of the length and width of the articular facets of the axis vertebra was done, and no statistically significant difference was observed between the parameters of the right- and left-side articular facets (p>0.05). The superior and inferior articular facets were observed to have variable shapes; the frequencies of the different shapes are represented in figures. All the shapes of the inferior and superior articular facets were symmetrical over the right and left sides. Among the superior articular facets, constrictions were absent in 13 cases (65%), 2 (10%) exhibited a single constriction, 3 (15%) had 2 constrictions, and 2 (10%) had 3 constrictions. Constrictions were absent in 11 (55%) of the inferior articular facets, while 3 (15%) had 1 constriction, 3 (15%) had 2 constrictions, 2 (10%) exhibited 3 constrictions, and 1 (5%) had 4 constrictions. The constrictions of the inferior and superior articular facets were symmetrical over the right and left sides. Conclusion: We believe that the present study has provided additional information on the morphometric data of the axis vertebra. The data are important to neurosurgeons, orthopedic surgeons, and radiologists.
Preoperative assessment of the axis vertebra may prevent dangerous complications such as spinal cord and nerve root compression during surgical intervention.
Keywords: axis, articular facet, morphology, morphometry
Procedia PDF Downloads 328
3605 Understanding Mathematics Achievements among U. S. Middle School Students: A Bayesian Multilevel Modeling Analysis with Informative Priors
Authors: Jing Yuan, Hongwei Yang
Abstract:
This paper aims to understand U.S. middle school students' mathematics achievements by examining relevant student- and school-level predictors. Through a variance component analysis, the study first identifies evidence supporting the use of multilevel modeling. Then, a multilevel analysis is performed under Bayesian statistical inference, where prior information is incorporated into the modeling process. During the analysis, independent variables are entered sequentially in the order of theoretical importance to create a hierarchy of models. By evaluating each model using Bayesian fit indices, a best-fitting and most parsimonious model is selected, on which Bayesian statistical inference is performed for the purpose of result interpretation and discussion. The primary dataset for Bayesian modeling is derived from the Programme for International Student Assessment (PISA) in 2012, with a secondary PISA dataset from 2003 analyzed under the traditional ordinary least squares method to provide the information needed to specify informative priors for a subset of the model parameters. The dependent variable is a composite measure of mathematics literacy, calculated from an exploratory factor analysis of all five PISA 2012 mathematics achievement plausible values, for which multiple lines of evidence support unidimensionality. The independent variables include demographic variables and content-specific variables: mathematics efficacy, teacher-student ratio, proportion of girls in the school, etc. Finally, the entire analysis is performed using the MCMCpack and MCMCglmm packages in R.
Keywords: Bayesian multilevel modeling, mathematics education, PISA, multilevel
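The variance-component step that motivates multilevel modeling can be sketched in a few lines. This is an illustrative calculation with made-up school data, not the PISA analysis itself: the one-way ANOVA estimator of the intraclass correlation coefficient (ICC) gives the share of score variance lying between schools, and a sizeable ICC is the usual justification for moving to a multilevel model.

```python
from statistics import mean, variance

def icc_oneway(groups):
    """ANOVA-based intraclass correlation for balanced two-level data.

    groups: list of equal-sized lists (e.g. schools -> student scores).
    Returns the estimated share of variance lying between groups.
    """
    k = len(groups)
    n = len(groups[0])  # balanced design assumed
    grand = mean(x for g in groups for x in g)
    ms_between = n * sum((mean(g) - grand) ** 2 for g in groups) / (k - 1)
    ms_within = mean(variance(g) for g in groups)  # pooled (balanced case)
    return (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)

# Hypothetical scores for three schools with three students each:
schools = [[10, 12, 11], [20, 22, 21], [30, 32, 31]]
print(round(icc_oneway(schools), 3))  # most variance here is between schools
```

With data this strongly clustered the ICC is close to 1; in survey data like PISA the estimate is typically far smaller but still large enough to warrant school-level random effects.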
Procedia PDF Downloads 336
3604 Biophysical Consideration in the Interaction of Biological Cell Membranes with Virus Nanofilaments
Authors: Samaneh Farokhirad, Fatemeh Ahmadpoor
Abstract:
Biological membranes are constantly in contact with various filamentous soft nanostructures that either reside on their surface or are transported between the cell and its environment. In particular, viral infections are determined by the interaction of viruses (such as filoviruses) with cell membranes; membrane protein organization (such as cytoskeletal proteins and actin filament bundles) has been proposed to influence the mechanical properties of lipid membranes; and the adhesion of filamentous nanoparticles influences their delivery yield into target cells or tissues. The goal of this research is to integrate the rapidly increasing but still fragmented experimental observations on the adhesion and self-assembly of nanofilaments (including filoviruses, actin filaments, and natural and synthetic nanofilaments) on cell membranes into a general, rigorous, and unified knowledge framework. The global outbreak of coronavirus disease in 2020, which has persisted for over three years, highlights the crucial role that nanofilament-based delivery systems play in human health. This work will unravel the role of a unique property of all cell membranes, namely flexoelectricity, and the significance of nanofilament flexibility in the adhesion and self-assembly of nanofilaments on cell membranes. This will be achieved using a combination of continuum mechanics, statistical mechanics, and molecular dynamics and Monte Carlo simulations. The findings will help address the societal need to understand the biophysical principles that govern the attachment of filoviruses and flexible nanofilaments onto living cells and will provide guidance on the development of nanofilament-based vaccines for a range of diseases, including infectious diseases and cancer.
Keywords: virus nanofilaments, cell mechanics, computational biophysics, statistical mechanics
Procedia PDF Downloads 94
3603 The Impact of Behavioral Factors on the Decision Making of Real Estate Investor of Pakistan
Authors: Khalid Bashir, Hammad Zahid
Abstract:
Most investors consider economic and financial information the most important input when making investment decisions. However, over the past two decades, behavioral aspects and behavioral biases have gained an important place in the decision-making process of the investor, and this study is built on that fact. The purpose of this study is to examine the impact of behavioral factors on the decision-making of individual real estate investors in Pakistan. Some important behavioral factors, namely overconfidence, anchoring, gambler's fallacy, home bias, loss aversion, regret aversion, mental accounting, herding, and representativeness, are used in this study to find their impact on the psychology of individual investors. The target population is real estate investors in Pakistan, and a sample of 650 investors was selected using a convenience sampling technique. The data were collected through a questionnaire, with a response rate of 46.15%. Descriptive statistical techniques and SEM were used to analyze the data using statistical software. The results revealed that some behavioral factors have a significant impact on the decision-making of investors. Among all the behavioral biases, overconfidence, anchoring, gambler's fallacy, loss aversion, and representativeness have a significant positive impact on the decision-making of the individual investor, while the remaining biases, home bias, regret aversion, mental accounting, and herding, have less impact on the decision-making process of an individual.
Keywords: behavioral finance, anchoring, gambler's fallacy, loss aversion
Procedia PDF Downloads 69
3602 An Epidemiological Analysis of the Occurrence of Bovine Brucellosis and Adopted Control Measures in South Africa during the Period 2014 to 2019
Authors: Emily Simango, T. Chitura
Abstract:
Background: Bovine brucellosis is among the most neglected zoonotic diseases in developing countries, where it is endemic and a growing challenge to public health. The development of cost-effective control measures for the disease can only be affirmed by knowledge of the disease epidemiology and the ability to define its risk profiles. The aim of the study was to document the trend of bovine brucellosis and the control measures adopted following reported cases during the period 2014 to 2019 in South Africa. Methods: Data on confirmed cases of bovine brucellosis were retrieved from the website of the World Organisation for Animal Health (WOAH). Data were analysed using the Statistical Package for the Social Sciences (IBM SPSS, 2022) version 29.0. Descriptive analysis (frequencies and percentages) and analysis of variance (ANOVA) were utilized, with statistical significance set at p<0.05. Results: The data retrieved in our study revealed an overall average bovine brucellosis prevalence of 8.48. There were statistically significant differences in bovine brucellosis prevalence across the provinces for the years 2016 and 2019 (p<0.05), with the Eastern Cape Province having the highest prevalence in both instances. Documented control measures for the disease were limited to the killing and disposal of disease cases as well as the vaccination of susceptible animals. Conclusion: Bovine brucellosis is a real concern in South Africa, with risk profiles differing across the provinces. Information on brucellosis control measures in South Africa, as reported to the WOAH, is not comprehensive.
Keywords: zoonotic, endemic, Eastern Cape Province, vaccination
Procedia PDF Downloads 66
3601 Coordination of Traffic Signals on Arterial Streets in Duhok City
Authors: Dilshad Ali Mohammed, Ziyad Nayef Shamsulddin Aldoski, Millet Salim Mohammed
Abstract:
The increase in levels of traffic congestion along urban signalized arterials calls for efficient traffic management. The application of traffic signal coordination can improve traffic operation and safety for a series of signalized intersections along an arterial. The objective of this study is to evaluate the benefits achievable through actuated traffic signal coordination and to compare control delay against that of the same signalized intersections operated in isolation. To accomplish this purpose, a series of eight signalized intersections located on two major arterials in Duhok City was chosen for the study. Traffic data (traffic volumes, link and approach speeds, and passenger car equivalents) were collected at peak hours. Various methods were used for collecting data, such as the video recording technique, the moving vehicle method, and manual methods. Geometric and signalization data were also collected for the purpose of the study. The coupling index was calculated to check the attainability of coordination, and time-space diagrams were then constructed representing one-way coordination for the intersections on Barzani and Zakho Streets, and others representing two-way coordination for the intersections on Zakho Street, with acceptable progression bandwidth efficiency. The results of this study show a large progression bandwidth of 54 seconds for eastbound coordination and 17 seconds for westbound coordination on Barzani Street under a suggested controlled speed of 60 kph, consistent with the present data. For Zakho Street, the progression bandwidth is 19 seconds for eastbound coordination and 18 seconds for westbound coordination under a suggested controlled speed of 40 kph. The results show that traffic signal coordination led to a large reduction in intersection control delays on both arterials.
Keywords: bandwidth, congestion, coordination, traffic, signals, streets
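The signal offsets that underlie a one-way time-space diagram can be sketched as a simple travel-time calculation. The spacings below are hypothetical, not the Duhok measurements: under ideal one-way progression, each downstream signal's green is offset from the first by the platoon's travel time at the controlled speed.

```python
def progression_offsets(link_lengths_m, speed_kph):
    """Cumulative signal offsets (seconds) for ideal one-way progression.

    Each offset is the travel time from the first intersection, so a platoon
    released at the start of green arrives at each downstream signal as its
    green begins.
    """
    speed_ms = speed_kph / 3.6  # convert kph to m/s
    offsets, total = [0.0], 0.0
    for length in link_lengths_m:
        total += length / speed_ms
        offsets.append(round(total, 1))
    return offsets

# Hypothetical spacings (m) between four signals on an arterial at 60 kph:
print(progression_offsets([400, 250, 300], 60))  # → [0.0, 24.0, 39.0, 57.0]
```

In practice the offsets are then adjusted to the common cycle length, and the bandwidth is read off the time-space diagram as the width of the green band that clears every signal.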
Procedia PDF Downloads 307
3600 Weibull Cumulative Distribution Function Analysis with Life Expectancy Endurance Test Result of Power Window Switch
Authors: Miky Lee, K. Kim, D. Lim, D. Cho
Abstract:
This paper presents the planning, rationale for test specification derivation, sampling requirements, test facilities, and result analysis used to conduct lifetime expectancy endurance tests on power window switches (PWS), considering thermally induced mechanical stress under diurnal cyclic temperatures during normal operation (power cycling). The detailed process of analysis and the test results on the selected PWS set are discussed in this paper. A statistical approach to lifetime expectancy was applied to the measurement standards dealing with PWS lifetime determination through endurance tests, and the choice of approach, within the framework of the task, is explained. The present task was dedicated to voltage drop measurement to derive lifetime expectancy, while others mostly consider contact or surface resistance. The measurements to perform and the main instruments used are fully described accordingly. The failure data from the tests were analyzed to conclude lifetime expectancy through a statistical method using the Weibull cumulative distribution function. The first goal of this task is to develop a realistic worst-case lifetime endurance test specification, because the existing large number of switch test standards cannot induce the degradation mechanisms that make the switches less reliable. The second goal is to assess the quantitative reliability status of PWS currently manufactured, based on the test specification newly developed through this project. The last and most important goal is to satisfy customers' requirements regarding product reliability.
Keywords: power window switch, endurance test, Weibull function, reliability, degradation mechanism
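The Weibull step can be illustrated with a median-rank-regression sketch. The failure counts below are hypothetical placeholders, not the paper's data, and the authors' actual fitting procedure may differ: the two-parameter Weibull CDF F(t) = 1 - exp(-(t/eta)^beta) is linearised as ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta), and ordinary least squares on the ranked failure times yields the shape (beta) and characteristic life (eta).

```python
import math

def weibull_mrr(failure_times):
    """Median-rank-regression fit of a two-parameter Weibull distribution.

    Uses Bernard's approximation F_i = (i - 0.3) / (n + 0.4) for the median
    ranks, then least squares on the linearised CDF.
    Returns (beta, eta): shape parameter and scale (characteristic life).
    """
    t = sorted(failure_times)
    n = len(t)
    xs = [math.log(ti) for ti in t]
    ys = [math.log(-math.log(1 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)  # intercept = -beta*ln(eta)
    return beta, eta

# Hypothetical cycles-to-failure from an endurance test of six switches:
beta, eta = weibull_mrr([41000, 48000, 55000, 60000, 67000, 74000])
print(f"shape {beta:.2f}, scale {eta:.0f} cycles")
```

A shape parameter above 1 indicates wear-out behaviour, which is what a thermally cycled endurance test is designed to provoke; eta is the life at which roughly 63.2% of units have failed.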
Procedia PDF Downloads 235
3599 Effectiveness of Traditional Chinese Medicine in the Treatment of Eczema: A Systematic Review and Meta-Analysis Based on Eczema Area and Severity Index Score
Authors: Oliver Chunho Ma, Tszying Chang
Abstract:
Background: Traditional Chinese Medicine (TCM) has been widely used in the treatment of eczema. However, there is currently a lack of comprehensive research on the overall effectiveness of TCM in treating eczema, particularly using the Eczema Area and Severity Index (EASI) score as an evaluation tool. Meta-analysis can integrate the results of multiple studies to provide more convincing evidence. Objective: To conduct a systematic review and meta-analysis based on the EASI score to evaluate the overall effectiveness of TCM in the treatment of eczema. Specifically, the study will review and analyze published clinical studies that investigate TCM treatments for eczema and use the EASI score as an outcome measure, comparing the differences in improving the severity of eczema between TCM and other treatment modalities, such as conventional Western medicine treatments. Methods: Relevant studies, including randomized controlled trials (RCTs) and non-randomized controlled trials, that involve TCM treatment for eczema and use the EASI score as an outcome measure will be searched in medical literature databases such as PubMed and CNKI. Relevant data will be extracted from the selected studies, including study design, sample size, treatment methods, and improvement in EASI score. The methodological quality and risk of bias of the included studies will be assessed using appropriate evaluation tools (such as the Cochrane Handbook). The results of the selected studies will be statistically analyzed, including pooling of effect sizes (such as standardized mean differences and relative risks), subgroup analysis (e.g., different TCM syndromes, different treatment modalities), and sensitivity analysis (e.g., excluding low-quality studies). Based on the results of the statistical analysis and quality assessment, the overall effectiveness of TCM in improving the severity of eczema will be interpreted.
Expected outcomes: By integrating the results of multiple studies, we expect to provide more convincing evidence regarding the specific effects of TCM in improving the severity of eczema. Additionally, subgroup analysis and sensitivity analysis can further elucidate whether the effectiveness of TCM treatment is influenced by different factors. We will also compare the results of the meta-analysis with the clinical data from our clinic: for both the clinical data and the meta-analysis results, we will compute descriptive statistics such as means, standard deviations, and percentages, and compare the two using statistical tests such as the independent-samples t-test or non-parametric tests to assess the statistical differences between them.
Keywords: eczema, traditional Chinese medicine, EASI, systematic review, meta-analysis
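The effect-size pooling described in the methods can be sketched as a fixed-effect inverse-variance calculation. The SMD values and variances below are hypothetical placeholders, not results of the review (a real meta-analysis would also test heterogeneity and possibly switch to a random-effects model):

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes (e.g. SMDs).

    Weights each study by 1/variance; returns the pooled estimate and its
    95% confidence interval.
    """
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical EASI-based SMDs (TCM vs. control) and their variances:
smds = [-0.45, -0.30, -0.60]
variances = [0.04, 0.09, 0.06]
est, (lo, hi) = pooled_effect(smds, variances)
print(f"pooled SMD {est:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A pooled SMD whose confidence interval excludes zero would indicate a statistically significant difference in EASI improvement between the treatment arms.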
Procedia PDF Downloads 58
3598 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it aims at describing the ability of second-grade students at SMPN 18 Makassar to write review text without applying the Lesson Study model. Second, it seeks to describe their ability to write review text when the Lesson Study model is applied. Third, it aims at testing the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research used a true experimental, posttest-only group design involving two groups: one control class and one experimental class. The research population was all second-grade students at SMPN 18 Makassar, 250 students in 8 classes, and the sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, also consisting of 30 students. The research instruments were observation and tests. The collected data were analyzed using descriptive statistical techniques and inferential statistical techniques (t-test), processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) scored more than 7.5, categorized as inadequate; (2) in the experimental class, 26 students (87%) scored 7.5 or more, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison of the ability of the control class and the experimental class indicates that the value of t-count is greater than the value of t-table (2.411 > 1.667), which means that the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
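The t-count reported above comes from an independent-samples t-test; a minimal pooled-variance sketch with hypothetical score samples (not the SMPN 18 Makassar data) shows how the statistic compared against the t-table value is obtained:

```python
import math
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Pooled-variance independent-samples t statistic (equal variances assumed)."""
    na, nb = len(sample_a), len(sample_b)
    sp2 = (((na - 1) * variance(sample_a) + (nb - 1) * variance(sample_b))
           / (na + nb - 2))  # pooled variance
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical writing scores for an experimental and a control class:
experimental = [8.0, 7.5, 9.0, 8.5, 7.0]
control = [6.5, 7.0, 6.0, 7.5, 6.0]
t = independent_t(experimental, control)
print(round(t, 3))  # compare against the critical t for df = na + nb - 2
```

If the computed t exceeds the critical value for the chosen significance level and degrees of freedom, the null hypothesis of equal means is rejected, which is the logic behind the 2.411 > 1.667 comparison in the abstract.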
Procedia PDF Downloads 202
3597 Contribution of Foraminifers in Biostratigraphy and Paleoecology Interpretations of the Basal Eocene from the Phosphatic Sra Ouertaine Basin, in the Southern Tethys(Tunisia)
Authors: Oum Elkhir Mahmoudi, Nebiha Ben Haj Ali
Abstract:
Micropaleontological, sedimentological, and statistical studies were carried out on the late Paleocene-early Eocene succession of Sra Ouertaine and Dyr El Kef in the northern open phosphatic basin of Tunisia. Based on the abundance and stratigraphic distribution of planktic foraminiferal species, five planktic zones have been recognized from the base to the top of the phosphatic layers: the E1 Acarinina sibaiyaensis Zone, the E2 Pseudohastigerina wilcoxensis Zone, the E3 Morozovella marginodentata Zone, the E4 Morozovella formosa Zone, and the E5 Morozovella subbotinae Zone. The Paleocene-Eocene boundary (PETM) is placed just below the base of the phosphatic interval, and the ETM-2 event may be detectable in the analyzed biotic record of Sra Ouertaine. Based on benthic assemblages, abundances, cluster analysis, and multivariate statistical analyses, two biofacies were recognized for each section. The recognized ecozones are typical of a warm, shallow-water inner neritic setting (dominance of the epifaunal genera Anomalinoides, Dentalina, and Cibicidoides, associated with Frondicularia phosphatica, Trochamminoides globigeriniformis, and Eponides elevatus), and the paleoenvironment is eutrophic (presence of several bolivinitids and verneuilinids). For the Dyr El Kef section and zones P5 and E2 of the Sra Ouertaine section, our records indicate that the paleoenvironment was influenced by coastal upwelling without oxygen deficiency; the paleodepth is estimated at around 50 m, and the paleoecosystem was diversified and balanced, with a general tendency toward stressed conditions. The upper part of the Sra Ouertaine section, by contrast, is more eutrophic and influenced by coastal upwelling with oxygen deficiency; the paleodepth is estimated at less than 50 m, and the ecosystem was unsettled.
Keywords: Tunisia, Sra Ouertaine, Dyr El Kef, early Eocene, foraminifera, chronostratigraphy, paleoecology, paleoenvironment
Procedia PDF Downloads 47
3596 Studying the Effects of Job Training on Employees Efficiency: A Case Study of University Employees, Qom, Iran
Authors: Seyfollah Fazlollahi, Ahmad Bayan Memar
Abstract:
Background: A review of manpower planning includes a training analysis based on job descriptions and job specifications, which looks carefully at training from the points of view of the company, its various departments, and its personnel. This may show weaknesses in some departments, and as a result, training is needed for the staff. Purpose: The aim of this research is to investigate the effects of training on employees' efficiency in different aspects of work. Methodology: This is a descriptive survey study. The statistical population was 85 official employees of the University of Qom, Iran, of whom 70 were selected by random sampling using the Krejcie and Morgan table. The instrument used in this study was a questionnaire comprising 22 questions. Result: The findings indicate that the majority of respondents had a positive attitude towards training programs, whether on the job or off the job. They believed that training programs improved their behavior, which leads to higher efficiency in their jobs. In fact, the data support the main hypothesis that training has positive effects on job performance and efficiency. Conclusion: It is concluded from this study and other related research that training (on the job and off the job) has a positive and effective role in human development and in employees' efficiency. Employees become acquainted with the different tasks of a job, and group cooperation, creativity, and innovation are reinforced. Training builds job skills and increases knowledge and information about a job. It also increases the technical and conceptual human skills that are important in an organization. We can also mention workers' increased positive motivation toward their jobs, strengthened morale and coordination, good human relations, and good contact with clients.
Keywords: training, work efficiency, employee, human relation, job satisfaction
Procedia PDF Downloads 201
3595 Assessment of Groundwater Chemistry and Quality Characteristics in an Alluvial Aquifer and a Single Plane Fractured-Rock Aquifer in Bloemfontein, South Africa
Authors: Modreck Gomo
Abstract:
The evolution of groundwater chemistry and quality is largely controlled by hydrogeochemical processes, and understanding these processes is therefore important for groundwater quality assessment and the protection of water resources. A study was conducted in the town of Bloemfontein, South Africa, to assess and compare the groundwater chemistry and quality characteristics of an alluvial aquifer and a single-plane fractured-rock aquifer. Nine groundwater samples were collected from monitoring boreholes drilled into the two aquifer systems during a once-off sampling exercise. Samples were collected using the low-flow purging technique and analysed for major ions and trace elements. To describe the hydrochemical facies and identify the dominant hydrogeochemical processes, the groundwater chemistry data are interpreted using Stiff diagrams and principal component analysis (PCA) as complementary tools. The fitness of the groundwater quality for domestic and irrigation uses is also assessed. Results show that the alluvial aquifer is characterised by a Na-HCO3 hydrochemical facies, while the fractured-rock aquifer has a Ca-HCO3 facies. The groundwater in both aquifers originally evolved from the dissolution of calcite rocks, which is common in land surface environments. The groundwater in the alluvial aquifer, however, undergoes a further evolution driven by cation exchange, in which Na+ in the sediments exchanges with Ca2+ in the Ca-HCO3 water type, producing the Na-HCO3 type. Despite this difference in hydrogeochemical processes between the alluvial aquifer and the single-plane fractured-rock aquifer, the groundwater quality was not affected. The groundwater in the two aquifers is very hard, owing to elevated magnesium and calcium ions derived from the dissolution of carbonate minerals, which typically occurs in surface environments.
Based on total dissolved solids levels (600-900 mg/L), the groundwater quality of the two aquifer systems is classified as fair. The potential negative impacts of the groundwater quality on domestic uses are highlighted.
Keywords: alluvial aquifer, fractured-rock aquifer, groundwater quality, hydrogeochemical processes
Procedia PDF Downloads 204
3594 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but at the same time, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine-learning-based intrusion detection systems, and network security protocols. The AES and RSA encryption algorithms were used for data protection, and machine learning models such as random forests and neural networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, t-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%.
Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on these results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and the continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
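The accuracy, sensitivity, and specificity indicators mentioned above reduce to simple ratios over a confusion matrix. A sketch with hypothetical counts, not the study's results: treating an attack flagged by the detector as a true positive, sensitivity is the attack detection rate and specificity is the rate at which benign traffic is correctly passed.

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (attack detection rate) and specificity
    (benign pass rate) for an intrusion-detection model."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # detected attacks / all attacks
    specificity = tn / (tn + fp)   # passed benign / all benign
    return accuracy, sensitivity, specificity

# Hypothetical confusion counts for an intrusion-detection model:
acc, sens, spec = detection_metrics(tp=85, fp=5, tn=95, fn=15)
print(acc, sens, spec)  # → 0.9 0.85 0.95
```

Reporting all three together matters because a detector can score high accuracy on imbalanced traffic while missing most attacks, which only a low sensitivity would reveal.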
Procedia PDF Downloads 29
3593 Studying the Establishment of Knowledge Management Background Factors at Islamic Azad University, Behshahr Branch
Authors: Mohammad Reza Bagherzadeh, Mohammad Hossein Taheri
Abstract:
Knowledge management is one of the great breakthroughs of the information and knowledge era, and given its outstanding features, successful organizations tend to adopt it. The establishment of knowledge management in universities is therefore of special importance. In this regard, the present research aims to shed light on the background factors of knowledge management establishment at Islamic Azad University, Behshahr Branch (Northern Iran). Three factors (information technology systems, knowledge process systems, and organizational culture) were considered fundamentals of the knowledge management infrastructure, and each was evaluated individually. The research was conducted as a descriptive survey, and the participants included all staff and faculty members; a sample size proportional to the population was drawn according to the Krejcie and Morgan table. The measurement tool was a survey questionnaire whose reliability was calculated as 0.83 using Cronbach's alpha. For data analysis, descriptive statistics such as frequency and percentage tables, column charts, means, and standard deviations were used; for inferential statistics, the Kolmogorov-Smirnov test and the one-sample t-test were applied. The findings show that although organizational culture, one of the three background factors, is in good condition at Islamic Azad University Behshahr Branch, the other two, IT systems and knowledge process systems, are in an adverse state. As a result, the necessary conditions for the establishment of knowledge management at the university are not yet in place.
Keywords: knowledge management, information technology, knowledge processes, organizational culture, educational institutions
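The one-sample t-test used for the inferential step above compares a sample mean against a fixed test value. A small self-contained sketch (hypothetical Likert ratings; the test value 3 is the scale midpoint, a common choice in such surveys, not the study's data):

```python
import math

def one_sample_t(data, mu0):
    # t = (mean - mu0) / (s / sqrt(n)), with s the sample standard deviation.
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    return (mean - mu0) / math.sqrt(var / n)

# hypothetical 5-point Likert ratings of organizational culture
ratings = [4, 3, 5, 4, 4, 3, 4, 5, 3, 4]
t = one_sample_t(ratings, 3.0)
print(round(t, 2))  # 3.86 -> well above the critical value for df = 9
```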
Procedia PDF Downloads 520
3592 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation
Authors: H. Khanfari, M. Johari Fard
Abstract:
Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that produce a variety of properties throughout the reservoir. A good estimation of reservoir heterogeneity, defined as the variation of rock properties with location in a reservoir or formation, can help to better model the reservoir and thus offer a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variations in rock properties based on the similarities of the environments in which different beds were deposited. To illustrate the vertical heterogeneity of a reservoir, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), both of which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum engineering from that point of view. In this paper, we correlate the statistical-based Lorenz method with a petroleum concept, the Kozeny-Carman equation, and derive the straight-line Lorenz plot for a homogeneous system. Finally, we apply the two methods to a heterogeneous field in South Iran and discuss each separately, with numbers and figures. As expected, these methods show great departure from homogeneity. Therefore, for future investment, the reservoir needs to be treated carefully.
Keywords: carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variation (V), Lorenz coefficient (L)
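The Lorenz coefficient mentioned above can be computed by plotting cumulative flow capacity (k·h) against cumulative storage capacity (φ·h) with layers ordered by decreasing k/φ, then doubling the area between the curve and the homogeneity diagonal. A sketch with hypothetical layer data (not the South Iran field's values):

```python
def lorenz_coefficient(k, phi, h):
    # Sort layers by decreasing k/phi, accumulate normalized flow capacity
    # (k*h) against storage capacity (phi*h), then integrate the Lorenz
    # curve with trapezoids and compare to the 45-degree line.
    layers = sorted(zip(k, phi, h), key=lambda t: t[0] / t[1], reverse=True)
    total_flow = sum(ki * hi for ki, pi, hi in layers)
    total_store = sum(pi * hi for ki, pi, hi in layers)
    prev_x, prev_y, area = 0.0, 0.0, 0.0
    cf = cs = 0.0
    for ki, pi, hi in layers:
        cf += ki * hi / total_flow
        cs += pi * hi / total_store
        area += (prev_y + cf) / 2 * (cs - prev_x)   # trapezoid slice
        prev_x, prev_y = cs, cf
    return 2 * (area - 0.5)   # 0 = homogeneous, approaching 1 = heterogeneous

# hypothetical layered reservoir: permeability (mD), porosity, thickness (m)
k   = [500, 120, 60, 10]
phi = [0.25, 0.20, 0.18, 0.10]
h   = [2.0, 3.0, 4.0, 1.0]
print(round(lorenz_coefficient(k, phi, h), 3))
```

For a homogeneous system (constant k/φ), the curve collapses onto the diagonal and L = 0, which is the straight-line case derived in the paper.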
Procedia PDF Downloads 220
3591 The Design of Multiple Detection Parallel Combined Spread Spectrum Communication System
Authors: Lixin Tian, Wei Xue
Abstract:
Much vital work takes place underground, such as mining, tunnel construction, and subways. Once accidents occur in these places, the interruption of traditional wired communication hampers rescue work. In order to realize positioning, early warning, and command functions for underground personnel and improve rescue efficiency, it is necessary to develop and design an emergency ground communication system. Conventional underground communication is easily subjected to narrowband interference, and spread spectrum communication can be used to address this problem. However, general spread spectrum methods such as direct-sequence spread spectrum are inefficient, so parallel combined spread spectrum (PCSS) communication is proposed to improve efficiency. PCSS communication not only has the anti-interference ability and good concealment of a traditional spread spectrum system but also a relatively high frequency-band utilization rate and a strong information transmission capability, so this technology has been widely used in practice. This paper presents a PCSS communication model, the multiple detection parallel combined spread spectrum (MDPCSS) communication system. The principle of the MDPCSS system is described: the sequence at the transmitting end is processed in blocks and cyclically shifted to facilitate multiple detection at the receiving end. Block diagrams of the transmitter and receiver of the MDPCSS system are introduced. The calculation formula for the system bit error rate (BER) is also given, and simulation and analysis of the system BER are completed. By comparison with common parallel PCSS communication, we can conclude that it is indeed possible to reduce the BER and improve system performance.
Furthermore, the influence of the selected PN code length on the system BER is simulated and analyzed; the conclusion is that the longer the PN code, the lower the system bit error rate.
Keywords: cyclic shift, multiple detection, parallel combined spread spectrum, PN code
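The cyclic-shift idea behind the receiver can be illustrated with a small, self-contained sketch (a toy example of my own, not the paper's MDPCSS receiver): a PN sequence is generated with an LFSR, the transmitted cyclic shift carries the data, and the receiver recovers the shift by correlating against every rotation of its local PN replica.

```python
def lfsr_msequence(taps, nbits):
    # Generate a maximal-length +/-1 sequence from a simple Fibonacci LFSR.
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(1 if state[-1] else -1)
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return seq

def detect_shift(pn, received):
    # Correlate the received block against every cyclic shift of the local
    # PN replica; the shift with the peak correlation is the decision.
    n = len(pn)
    best, best_shift = None, 0
    for s in range(n):
        corr = sum(pn[(i + s) % n] * received[i] for i in range(n))
        if best is None or corr > best:
            best, best_shift = corr, s
    return best_shift

pn = lfsr_msequence(taps=[0, 3], nbits=4)     # length-15 m-sequence
tx_shift = 6                                   # the "symbol" carried by the shift
received = [pn[(i + tx_shift) % len(pn)] for i in range(len(pn))]
print(detect_shift(pn, received))  # 6
```

The sharp autocorrelation of the m-sequence (peak at zero lag, -1 elsewhere) is what makes the shift decision unambiguous even in noise.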
Procedia PDF Downloads 137
3590 Effectiveness of Acceptance and Commitment Therapy on Reducing Corona Disease Anxiety in the Staff Working in Shahid Beheshti Hospital of Shiraz
Authors: Gholam Reza Mirzaei
Abstract:
This research aimed to investigate the effectiveness of acceptance and commitment therapy (ACT) in reducing corona disease anxiety in the staff working at Shahid Beheshti Hospital of Shiraz. The research was a quasi-experimental study with a pre-test and post-test design and experimental and control groups. The statistical population included all staff of Shahid Beheshti Hospital of Shiraz in 2021, from which 30 participants (n = 15 in the experimental group and n = 15 in the control group) were selected by convenience sampling. The instruments comprised the Cognitive Emotion Regulation Questionnaire (CERQ) and the Corona Disease Anxiety Scale (CDAS). Following data collection, the participants' scores were analyzed using SPSS 20 at both the descriptive (mean and standard deviation) and inferential (analysis of covariance) levels. The results of the analysis of covariance (ANCOVA) showed that ACT is effective in reducing corona disease anxiety (mental and physical symptoms) in the staff working at Shahid Beheshti Hospital of Shiraz. The effectiveness of ACT was 25.5% for mental symptoms and 13.8% for physical symptoms. The mean scores of the experimental group on the corona disease anxiety subscales (mental and physical symptoms) in the post-test were lower than those of the control group.
Keywords: acceptance and commitment therapy, corona disease anxiety, hospital staff, Shiraz
Procedia PDF Downloads 40
3589 Trading off Accuracy for Speed in Powerdrill
Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica
Abstract:
In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration of log data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization, we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 when compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of using sampling on accuracy and propose a simple heuristic for annotating individual result-values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries, this effectively brings down the 95th latency percentile from 30 to 4 seconds.
Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries
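The idea of annotating individual result-values as accurate or not can be sketched as follows: estimate per-group counts from a uniform row sample and flag any estimate backed by too few sampled rows. The support threshold below is a stand-in heuristic of my own, not the paper's actual annotation rule:

```python
import random

def estimate_group_counts(rows, sample_rate, min_support=50):
    # Estimate per-group record counts from a uniform row sample. Each
    # estimate is annotated: True if it rests on at least `min_support`
    # sampled rows (likely accurate), False otherwise (low confidence).
    random.seed(7)  # fixed seed so the sketch is reproducible
    counts = {}
    for row in rows:
        if random.random() < sample_rate:
            counts[row] = counts.get(row, 0) + 1
    return {g: (c / sample_rate, c >= min_support) for g, c in counts.items()}

# hypothetical log rows: two heavy groups and one rare one
rows = ["checkout"] * 100_000 + ["login"] * 50_000 + ["rare_event"] * 200
for group, (est, accurate) in sorted(estimate_group_counts(rows, 0.01).items()):
    print(group, round(est), "accurate" if accurate else "low-confidence")
```

Heavy groups come back within a few percent of their true counts, while the rare group (only a couple of sampled rows) is flagged as low-confidence, which is exactly the kind of signal a user needs when reading intermediate results.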
Procedia PDF Downloads 259
3588 Body Composition Analyser Parameters and Their Comparison with Manual Measurements
Authors: I. Karagjozova, B. Dejanova, J. Pluncevic, S. Petrovska, V. Antevska, L. Todorovska
Abstract:
Introduction: Medical check-up assessment is important in sports medicine. To follow the health condition of subjects who perform sports, body composition parameters such as intracellular water, extracellular water, protein and mineral content, and muscle and fat mass may be useful. The aim of the study was to present the available parameters and to compare them with manual assessment. Material and methods: A total of 20 subjects (14 male and 6 female) aged 20±2 years were included in the study; 5 performed recreational sports, while the others were professionals. The mean height was 175±7 cm, the mean weight was 72±9 kg, and the body mass index (BMI) was 23±2 kg/m2. The measured compartments were as follows: intracellular water (IW), extracellular water (EW), protein component (PC), mineral component (MC), skeletal muscle mass (SMM), and body fat mass (BFM). Lean balance was examined for the right arm (RA), left arm (LA), trunk (T), right leg (RL), and left leg (LL). The comparison was made between calculations derived from manual measurements, using the Matejka formula, and parameters obtained by a body composition analyzer (BCA), the Inbody 720 (Biospace). The parameters used for the comparison were skeletal muscle mass (SMM) and body fat mass (BFM). Results: The BCA obtained the following values: IW 22.6±5 L, EW 13.5±2 L, PC 9.8±0.9 kg, MC 3.5±0.3 kg, SMM 27±3 kg, BFM 13.8±4 kg. Lean balance showed the following values: RA 2.45±0.2 kg, LA 2.37±0.4 kg, T 20.9±5 kg, RL 7.43±1 kg, and LL 7.49±1.5 kg. SMM showed a statistically significant difference between the manually obtained value, 51±1%, and the BCA parameter, 45.5±3% (p<0.001). The manually obtained value for BFM was lower (17±2%) than the BCA value of 19.5±5.9% (p<0.02). Discussion: The obtained results showed appropriate values for the examined age with regard to all examined parameters, which contribute to an overview of the body compartments important for sports performance.
Comparing the manual and BCA assessments, we may conclude that manual measurements can differ from instrument-based ones, as confirmed by the statistical significance.
Keywords: athletes, body composition, bioelectrical impedance, sports medicine
Procedia PDF Downloads 477
3587 Self-Efficacy Perceptions of Pre-Service Art and Music Teachers towards the Use of Information and Communication Technologies
Authors: Agah Tugrul Korucu
Abstract:
Information and communication technologies have become an important part of our daily lives, with significant investments in technology in the 21st century. Computer self-efficacy reflects how willing individuals are to design and implement computer-related activities, and individuals with higher self-efficacy tend to carry out such activities more successfully as information technology use grows. The self-efficacy level is a significant factor that determines how individuals act in events, situations, and difficult processes. It is observed that individuals with a higher perception of computer self-efficacy overcome problems related to computer use more easily. Therefore, this study aimed to examine the self-efficacy perceptions of pre-service art and music teachers towards the use of information and communication technologies in terms of different variables. The research group consists of 60 pre-service teachers studying in the Art and Music department of Necmettin Erbakan University Ahmet Keleşoğlu Faculty of Education. The data collection tools are a "personal information form" developed by the researcher to collect demographic data and "the perception scale related to self-efficacy of informational technology", a 5-point Likert-type scale consisting of 27 items. The Kaiser-Meyer-Olkin (KMO) sampling adequacy value was found to be 0.959, and the Cronbach's alpha reliability coefficient of the scale was found to be 0.97. A statistical software package (SPSS 21.0) was used to analyze the data collected; descriptive statistics, the t-test, and analysis of variance were used as statistical techniques.
Keywords: self-efficacy perceptions, teacher candidate, information and communication technologies, art teacher
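The Cronbach's alpha reliability reported above can be computed from per-item scores as alpha = k/(k-1) · (1 - ΣVᵢ/Vₜ). A short sketch with hypothetical Likert responses (the formula, not the study's data):

```python
def cronbach_alpha(items):
    # items: one inner list per questionnaire item, one entry per respondent.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# hypothetical 5-point Likert responses: 3 items, 5 respondents
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))  # 0.86
```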
Procedia PDF Downloads 326
3586 Simulation, Optimization, and Analysis Approach of Microgrid Systems
Authors: Saqib Ali
Abstract:
Energy sources are classified into two types depending on whether they can be replenished. Sources which cannot be revived to their original state once they are consumed are considered nonrenewable energy resources, e.g., coal and fuel oil; energy resources which return to their original condition even after being consumed are known as renewable energy resources (RES), e.g., wind, solar, and hydel power. Renewable energy is a cost-effective way to generate clean and green electrical energy, and nowadays the majority of countries are paying heed to energy generation from RES. Pakistan mostly relies on conventional energy resources, which are largely nonrenewable in nature; coal and fuel oil are among the major resources, and their prices are increasing with time. On the other hand, RES have great potential in the country, and with the deployment of RES, greater reliability and an effective power system can be obtained. In this thesis, a similar concept is used, and a hybrid power system is proposed which is composed of a mix of renewable and nonrenewable sources. The source side is composed of solar, wind, and fuel cells, which are used in an optimal manner to serve the load. The goal is to provide an economical, reliable, uninterruptible power supply, which is achieved by an optimal controller (PI, PD, PID, FOPID). Optimization techniques are applied to the controllers to achieve the desired results, and advanced algorithms (particle swarm optimization, the flower pollination algorithm) are used to extract the desired output from the controllers. A detailed comparison in the form of tables and results is provided, highlighting the efficiency of the proposed system.
Keywords: distributed generation, demand-side management, hybrid power system, micro grid, renewable energy resources, supply-side management
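The controller stage can be illustrated with a toy discrete PID loop driving a first-order plant (a sketch under assumed plant dynamics and hand-picked gains, not the thesis's tuned design):

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    # Discrete PID driving a simple first-order plant: tau * dy/dt = -y + u.
    # The plant time constant and gains below are illustrative assumptions.
    tau, y, integral, prev_err = 0.5, 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # PID control law
        y += dt * (-y + u) / tau                    # Euler step of the plant
        prev_err = err
    return y

final = simulate_pid(kp=2.0, ki=1.0, kd=0.05)
print(round(final, 3))  # settles near the setpoint 1.0
```

In the thesis's setting, the metaheuristics (PSO, flower pollination) would search over (kp, ki, kd) to minimize a tracking-error cost instead of using fixed gains.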
Procedia PDF Downloads 97
3585 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis
Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias
Abstract:
Groundwater in the Gaza Strip is increasingly exposed to anthropogenic and natural factors that seriously impact its quality. Physicochemical data on groundwater can offer important information on changes in groundwater quality that can be useful in improving water management tactics. Integrative hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) have been applied to ten physicochemical parameters of 84 samples collected in 2000/2001, using STATA, AquaChem, and Surfer software, to: 1) provide valuable insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes from man-made activities. The recorded large diversity of water facies, with dominance of the Na-Cl type, reveals a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2 but shows a high diversity of facies due to the impact of many sources of salinity, such as seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance. Factor 1 (59%) is a salinization factor demonstrating the mixing contribution of natural saline water with human inputs. Factor 2 measures hardness and pollution and explains 29% of the total variance. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall.
Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry are Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes are base ion exchange and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification
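The HCA step can be sketched in miniature: repeatedly merge the two closest clusters until the desired number remains. This is an illustrative average-linkage implementation on hypothetical standardized values, not the STATA procedure used in the study:

```python
def hca(points, n_clusters):
    # Agglomerative clustering with average linkage and Euclidean distance.
    clusters = [[p] for p in points]

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def linkage(c1, c2):
        # average linkage: mean pairwise distance between the two clusters
        return sum(dist(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: linkage(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# hypothetical samples described by two standardized parameters (e.g. EC, NO3)
samples = [(0.1, 0.2), (0.2, 0.1), (2.1, 2.0), (2.0, 2.2), (4.0, 0.1)]
groups = hca(samples, 3)
print(sorted(len(g) for g in groups))  # [1, 2, 2]
```

The two tight pairs merge first and the isolated point stays alone, mirroring how the study's dendrogram separates wells with distinct salinization signatures.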
Procedia PDF Downloads 365
3584 Effect of 3-Dimensional Knitted Spacer Fabrics Characteristics on Its Thermal and Compression Properties
Authors: Veerakumar Arumugam, Rajesh Mishra, Jiri Militky, Jana Salacova
Abstract:
The thermo-physiological comfort and compression properties of knitted spacer fabrics have been evaluated by varying the different spacer fabric parameters. Air permeability and water vapor transmission of the fabrics were measured using the Textest FX-3300 air permeability tester and PERMETEST. The thermal behavior of the fabrics was then obtained with a thermal conductivity analyzer, and overall moisture management capacity was evaluated with a moisture management tester. The compression properties of the spacer fabrics were also tested using the Kawabata Evaluation System (KES-FB3). In the KES testing, the compression resilience, work of compression, linearity of compression, and other parameters were calculated from the pressure-thickness curves. Analysis of variance (ANOVA) was performed using the statistical software QC Expert Trilobite and Darwin in order to compare the influence of different fabric parameters on the thermo-physiological and compression behavior of the samples. This study established that the raw material, type of spacer yarn, density, thickness, and tightness of the surface layer have a significant influence on both thermal conductivity and work of compression in spacer fabrics. The parameter which mainly influences the water vapor permeability of these fabrics is the raw material properties, i.e., the wetting and wicking properties of the fibers. The Pearson correlation between the moisture capacity of the fabrics and water vapour permeability was found using the same software. These findings are important requirements for the further design of clothing for extreme environmental conditions.
Keywords: 3D spacer fabrics, thermal conductivity, moisture management, work of compression (WC), resilience of compression (RC)
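The Pearson correlation used above is the product-moment coefficient r = cov(x, y) / (σₓσᵧ). A minimal sketch with hypothetical paired measurements (illustrative values, not the study's data):

```python
import math

def pearson(x, y):
    # Pearson product-moment correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical paired measurements for five spacer-fabric samples:
# overall moisture management capacity vs. water vapour permeability
moisture_capacity = [0.42, 0.55, 0.61, 0.70, 0.78]
wvp               = [11.0, 13.5, 14.2, 16.8, 18.1]
print(round(pearson(moisture_capacity, wvp), 3))
```

An r close to +1, as in this fabricated example, would indicate that fabrics with higher moisture management capacity also transmit water vapour more readily.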
Procedia PDF Downloads 542
3583 Prevalence of Human Papillomavirus in Squamous Intraepithelial Lesions and Cervical Cancer in Women of the North of Chihuahua, Mexico
Authors: Estefania Ponce-Amaya, Ana Lidia Arellano-Ortiz, Cecilia Diaz-Hernandez, Jose Alberto Lopez-Diaz, Antonio De La Mora-Covarrubias, Claudia Lucia Vargas-Requena, Mauricio Salcedo-Vargas, Florinda Jimenez-Vega
Abstract:
Cervical cancer (CC) is the second leading cause of death among women worldwide and has been associated with persistent infection by human papillomavirus (HPV). The goal of the current study was to identify the prevalence of HPV infection in women with abnormal Pap smears who were seen at the Dysplasia Clinic of Ciudad Juarez, Mexico. Methods: Cervical samples from 146 patients attending the Colposcopy Clinic at Sanitary Jurisdiction II of Cd Juarez were collected for histopathological and molecular study. DNA was isolated for HPV detection by polymerase chain reaction (PCR) using MY09/11 and GP5/6 primers. The associated risk factors were assessed by a questionnaire. Statistical analysis was performed by ANOVA using EpiInfo V7 software. Results: HPV infection was present in 142 patients (97.3%). The prevalence of HPV infection was 96% across all evaluated groups: low-grade squamous intraepithelial lesion (LSIL), high-grade squamous intraepithelial lesion (HSIL), and CC. We found statistical significance (p < 0.05) for gestation and number of births as risk factors, and the median values showed an ascending trend with lesion progression. However, CC showed a statistically significant difference with respect to the pre-carcinogenic stages. Conclusions: There is a high prevalence of HPV infection in these Mexican patients, and for that reason, we are studying the most prevalent HPV genotypes in this population.
Keywords: cervical cancer, HPV, prevalence hpv, squamous intraepithelial lesion
Procedia PDF Downloads 320
3582 Recycling of Spent Mo-Co Catalyst for the Recovery of Molybdenum Using Cyphos IL 104
Authors: Harshit Mahandra, Rashmi Singh, Bina Gupta
Abstract:
Molybdenum is widely used in thermocouples, in the anticathodes of X-ray tubes, and in the production of steel alloys. Molybdenum compounds are extensively used as catalysts in petroleum-refining industries for hydrodesulphurization. The activity of the catalysts decreases gradually with time, and they are dumped as hazardous waste because they become contaminated with toxic materials during the process. These spent catalysts can serve as a secondary source for metal recovery and help to sort out environmental and economic issues. In the present study, the extraction and separation of molybdenum from a Mo-Co spent catalyst leach liquor containing 0.870 g L⁻¹ Mo, 0.341 g L⁻¹ Co, 0.422 ×10⁻¹ g L⁻¹ Fe and 0.508 g L⁻¹ Al in 3 mol L⁻¹ HCl has been investigated using the solvent extraction technique. The extracted molybdenum was finally recovered as molybdenum trioxide. The leaching conditions used were 3 mol L⁻¹ HCl, a temperature of 90°C, a solid-to-liquid ratio (w/v) of 1.25%, and a reaction time of 60 minutes; 96.45% of the molybdenum was leached under these conditions. For the extraction of molybdenum from the leach liquor, Cyphos IL 104 [trihexyl(tetradecyl)phosphonium bis(2,4,4-trimethylpentyl)phosphinate] in toluene was used as the extractant. Around 91% of the molybdenum was extracted with 0.02 mol L⁻¹ Cyphos IL 104, and 75% of the molybdenum was stripped from the loaded organic phase with 2 mol L⁻¹ HNO₃ at A/O = 1/1. McCabe-Thiele diagrams were drawn to determine the number of stages required for the extraction and stripping of molybdenum. According to the McCabe-Thiele plots, two stages are required for both extraction and stripping of molybdenum at A/O = 1/1, which was also confirmed by countercurrent simulation studies. Around 98% of the molybdenum was extracted in two countercurrent extraction stages with no co-extraction of cobalt and aluminum. Iron was removed from the loaded organic phase by scrubbing with 0.01 mol L⁻¹ HCl. Quantitative recovery of molybdenum was achieved in three countercurrent stripping stages at A/O = 1/1.
Molybdenum trioxide was obtained from the strip solution and characterized by XRD, FE-SEM, and EDX techniques. Owing to its distinctive electrochromic, thermochromic, and photochromic properties, molybdenum trioxide is used as a smart material for sensors, lubricants, and Li-ion batteries. It finds application in various processes such as methanol oxidation, metathesis, propane oxidation, and hydrodesulphurization, and it can also be used as a precursor for the synthesis of MoS₂ and MoSe₂.
Keywords: Cyphos IL 104, molybdenum, spent Mo-Co catalyst, recovery
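The countercurrent staging result can be cross-checked with a Kremser-type estimate, a textbook approximation that assumes a constant distribution ratio. This is offered only as a sketch alongside the paper's McCabe-Thiele construction, with the distribution ratio D back-calculated from the reported ~91% single-contact extraction at A/O = 1/1:

```python
def countercurrent_recovery(D, ao_ratio, stages):
    # Kremser-type estimate of the fraction extracted after n countercurrent
    # stages. The extraction factor is E = D * (O/A); the unextracted
    # fraction is (E - 1) / (E**(n+1) - 1), or 1/(n+1) when E == 1.
    E = D / ao_ratio                       # ao_ratio is A/O, so O/A = 1/ao_ratio
    if abs(E - 1.0) < 1e-12:
        unextracted = 1.0 / (stages + 1)
    else:
        unextracted = (E - 1.0) / (E ** (stages + 1) - 1.0)
    return 1.0 - unextracted

# distribution ratio implied by ~91% single-contact extraction at A/O = 1
D = 0.91 / 0.09
for n in (1, 2, 3):
    print(n, "stages ->", round(100 * countercurrent_recovery(D, 1.0, n), 1), "%")
```

Under this simplified model, two stages already push recovery near 99%, which is consistent in spirit with the ~98% achieved in two countercurrent stages in the study.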
Procedia PDF Downloads 206
3581 Improving Security Features of Traditional Automated Teller Machines-Based Banking Services via Fingerprint Biometrics Scheme
Authors: Anthony I. Otuonye, Juliet N. Odii, Perpetual N. Ibe
Abstract:
The obvious challenges faced by most commercial bank customers while using the services of ATMs (automated teller machines) across developing countries have triggered the need for an improved system with better security features. Current ATM systems are password-based, and research has proved these systems vulnerable to attacks and manipulation. Our research has found that the security of current ATM-assisted banking services in most developing countries is easily broken and maneuvered by fraudsters, mainly because it is quite difficult for these systems to distinguish an impostor with privileged access from the authentic bank account owner. PIN (personal identification number) passwords are also easily guessed, to mention just a few of the obvious limitations of traditional ATM operations. In this research work, we have developed a system of fingerprint biometrics with PIN-code authentication that seeks to improve the security features of traditional ATM installations as well as other banking services. The aim is to ensure better security at all ATM installations and raise the confidence of bank customers. It is hoped that our system will overcome most of the challenges of current password-based ATM operation if properly applied. The researchers made use of the OOADM (object-oriented analysis and design methodology), a software development methodology that assures proper system design using modern design diagrams. Implementation and coding were carried out using Visual Studio 2010 together with other software tools. The results show a working system that provides two levels of security on the client side, using a fingerprint biometric scheme combined with the existing 4-digit PIN code to guarantee the confidence of bank customers across developing countries.
Keywords: fingerprint biometrics, banking operations, verification, ATMs, PIN code
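The two-level check can be sketched as follows. This is a hypothetical illustration of the scheme's logic, not the authors' Visual Studio implementation: the PIN is compared by salted hash, and the fingerprint matcher (assumed here to be a sensor SDK that returns a similarity score) must clear a threshold; both the salt and the 0.80 threshold are made-up values.

```python
import hashlib

def verify_customer(entered_pin, fingerprint_score, stored_pin_hash,
                    score_threshold=0.80):
    # Level 1: salted hash comparison of the PIN (never store PINs in clear).
    # Level 2: the fingerprint matcher's similarity score must clear a
    # threshold; the matcher itself (minutiae comparison) is represented
    # here only by its score.
    pin_hash = hashlib.sha256(b"demo-salt:" + entered_pin.encode()).hexdigest()
    pin_ok = pin_hash == stored_pin_hash
    finger_ok = fingerprint_score >= score_threshold
    return pin_ok and finger_ok

stored = hashlib.sha256(b"demo-salt:" + b"4321").hexdigest()
print(verify_customer("4321", 0.91, stored))  # True: both factors pass
print(verify_customer("4321", 0.40, stored))  # False: fingerprint too weak
print(verify_customer("1111", 0.91, stored))  # False: wrong PIN
```

Requiring both factors is what closes the gap described above: a guessed PIN alone no longer grants access.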
Procedia PDF Downloads 42
3580 Production and Distribution Network Planning Optimization: A Case Study of Large Cement Company
Authors: Lokendra Kumar Devangan, Ajay Mishra
Abstract:
This paper describes the implementation of a large-scale SAS/OR model with significant pre-processing, scenario analysis, and post-processing work done using SAS. A large cement manufacturer with ten geographically distributed manufacturing plants for two variants of cement, around 400 warehouses serving as transshipment points, and several thousand distributor locations generating demand needed to optimize this multi-echelon, multi-modal transport supply chain separately for planning and allocation purposes. For monthly planning as well as daily allocation, the demand is deterministic. Rail and road networks connect any two points in this supply chain, creating tens of thousands of such connections. Constraints include the plant’s production capacity, transportation capacity, and rail wagon batch size constraints. Each demand point has a minimum and maximum for shipments received. Price varies at demand locations due to local factors. A large mixed integer programming model built using proc OPTMODEL decides production at plants, demand fulfilled at each location, and the shipment route to demand locations to maximize the profit contribution. Using base SAS, we did significant pre-processing of data and created inputs for the optimization. Using outputs generated by OPTMODEL and other processing completed using base SAS, we generated several reports that went into their enterprise system and created tables for easy consumption of the optimization results by operations.
Keywords: production planning, mixed integer optimization, network model, network optimization
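The structure of the profit-maximization model (capacities, demand bounds, per-route margins) can be shown on a miniature two-plant, two-market example, solved here by brute-force enumeration. This is a toy stand-in for the MIP the paper builds in proc OPTMODEL; the capacities, demands, and costs are illustrative, not the company's:

```python
from itertools import product

capacity = {"P1": 60, "P2": 40}          # production capacity per plant
demand   = {"M1": 50, "M2": 45}          # maximum demand per market
price    = {"M1": 10.0, "M2": 12.0}      # selling price per unit
cost     = {("P1", "M1"): 2.0, ("P1", "M2"): 5.0,
            ("P2", "M1"): 4.0, ("P2", "M2"): 3.0}   # transport cost per unit

routes = list(cost)
best_profit, best_plan = float("-inf"), None
# enumerate shipment quantities in steps of 5 units on each route
for plan in product(range(0, 65, 5), repeat=len(routes)):
    ship = dict(zip(routes, plan))
    ok_cap = all(sum(q for (p, m), q in ship.items() if p == pl) <= capacity[pl]
                 for pl in capacity)
    ok_dem = all(sum(q for (p, m), q in ship.items() if m == mk) <= demand[mk]
                 for mk in demand)
    if ok_cap and ok_dem:
        profit = sum(q * (price[m] - cost[(p, m)]) for (p, m), q in ship.items())
        if profit > best_profit:
            best_profit, best_plan = profit, ship

print(best_profit, best_plan)
```

A real solver replaces the enumeration with branch-and-bound over the same objective and constraints, which is what makes the production-scale problem with tens of thousands of routes tractable.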
Procedia PDF Downloads 67
3579 Weakly Solving Kalah Game Using Artificial Intelligence and Game Theory
Authors: Hiba El Assibi
Abstract:
This study aims to weakly solve Kalah, a two-player board game, by developing a start-to-finish winning strategy using an optimized Minimax algorithm with Alpha-Beta Pruning. In weakly solving Kalah, our focus is on creating an optimal strategy from the game's beginning rather than analyzing every possible position. The project will explore additional enhancements like symmetry checking and code optimizations to speed up the decision-making process. This approach is expected to give insights into efficient strategy formulation in board games and potentially help create games with a fair distribution of outcomes. Furthermore, this research provides a unique perspective on human versus Artificial Intelligence decision-making in strategic games. By comparing the AI-generated optimal moves with human choices, we can explore how seemingly advantageous moves can, in the long run, be harmful, thereby offering a deeper understanding of strategic thinking and foresight in games. Moreover, this paper discusses the evaluation of our strategy against existing methods, providing insights on performance and computational efficiency. We also discuss the scalability of our approach to the game, considering different board sizes (number of pits and stones) and rules (different variations) and studying how that affects performance and complexity. The findings have potential implications for the development of AI applications in strategic game planning, enhancing our understanding of human cognitive processes in game settings, and offer insights into creating balanced and engaging game experiences.
Keywords: minimax, alpha beta pruning, transposition tables, weakly solving, game theory
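The core search described above is minimax with alpha-beta pruning. A generic skeleton follows, demonstrated on a toy game rather than Kalah's full rules, and without the transposition tables or symmetry checks the study adds:

```python
def alphabeta(state, depth, alpha, beta, maximizing, moves, value):
    # Generic minimax with alpha-beta pruning: `moves` yields successor
    # states, `value` scores leaf states. Branches the opponent would
    # never allow are cut off as soon as beta <= alpha.
    succ = moves(state)
    if depth == 0 or not succ:
        return value(state)
    if maximizing:
        best = float("-inf")
        for child in succ:
            best = max(best, alphabeta(child, depth - 1, alpha, beta, False, moves, value))
            alpha = max(alpha, best)
            if beta <= alpha:
                break               # prune: minimizer avoids this branch
        return best
    best = float("inf")
    for child in succ:
        best = min(best, alphabeta(child, depth - 1, alpha, beta, True, moves, value))
        beta = min(beta, best)
        if beta <= alpha:
            break                   # prune: maximizer avoids this branch
    return best

# toy game: a state is a number; each move appends a digit 1-3; positions
# of three or more digits are terminal and scored modulo 7
moves = lambda s: [s * 10 + d for d in (1, 2, 3)] if s < 100 else []
value = lambda s: s % 7
print(alphabeta(1, 4, float("-inf"), float("inf"), True, moves, value))  # 2
```

For Kalah, `moves` would implement sowing and capturing (including the extra-turn rule, which complicates the strict max/min alternation shown here), and `value` the seed-count evaluation.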
Procedia PDF Downloads 55