Search results for: Statistical Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17293

15793 Integrated Imaging Management System: An Approach in the Collaborative Coastal Resource Management of Bagac, Bataan

Authors: Aljon Pangan

Abstract:

The Philippines, being an archipelagic country, is bounded by coastlines (36,289 km), coastal waters (226,000 km²), oceanic waters (1.93 million km²), and territorial waters (2.2 million km²). Studies show that Philippine coastal ecosystems are the most productive and biologically diverse in the world; however, they are plagued by degradation problems due to over-exploitation and illegal activities. The existence of coastal degradation issues in the country led to the emergence of Coastal Resource Management (CRM) as an approach used by both national and local government to provide solutions for sustainable coastal resource utilization. CRM applies the idea of planning, implementing, and monitoring through the lens of collaborative governance. It utilizes collective action and decision-making to achieve sustainable use of coastal resources. The Municipality of Bagac in Bataan is one of the coastal municipalities in the country that has crafted its own CRM Program as a solution to coastal resource degradation and related problems. Information and Communications Technology (ICT), particularly the Integrated Imaging Management System (IIMS), is one approach that can be applied within the formula of collaborative governance, which brings together the government, the private sector, and civil society. IIMS can help policymakers, managers, and citizens manage coastal resources through analyzed spatial data describing the physical, biological, and socioeconomic characteristics of coastal areas. This study applies a qualitative approach to examine the possible impacts of applying IIMS in the CRM policy making and implementation of the Municipality of Bagac.

Keywords: coastal resource management, collaborative governance, integrated imaging management system, information and communication technology

Procedia PDF Downloads 400
15792 “Ethiopian Approach” to Combating Desertification: The Case of Semi-Arid Savanna Grasslands in Southern Ethiopia

Authors: Wang Yongdong, Yeneayehu Fenetahun, You Yuan, Ogbue Chukwuka, Yahaya Ibrahim, Xu Xinwen

Abstract:

This paper explores an innovative Ethiopian approach to combatting desertification, focusing on the semi-arid savanna grasslands in Southern Ethiopia. The study investigates the multifaceted strategies employed by Ethiopian communities, governmental bodies, and non-governmental organizations to address desertification challenges in the region. Through an analysis of legislative frameworks, community engagement, afforestation programs, and sustainable land management techniques, this research highlights the efficacy of Ethiopia's strategy in reducing the effects of desertification. The results emphasize how crucial it is to build effective measures for halting desertification in fragile ecosystems by utilizing local knowledge, community involvement, and adaptive governance. In addition, this study also addresses how the Ethiopian approach may be applied to other areas with comparable environmental problems. In summary, this research adds significant perspectives to the worldwide conversation about desertification and provides useful guidance for sustainable land use.

Keywords: adaptive governance, community engagement, desertification, policy frameworks

Procedia PDF Downloads 44
15791 Analytical Approach to Study the Uncertainties Related to the Behavior of Structures Submitted to Differential Settlement

Authors: Elio El Kahi, Michel Khouri, Olivier Deck, Pierre Rahme, Rasool Mehdizadeh

Abstract:

Recent developments in civil engineering create multiple interaction problems between the soil and the structure. One of the major problems is the impact of ground movements on buildings. Consequently, managing the risks associated with these movements requires a determination of the different influencing factors and specific knowledge of their variability/uncertainty. The main purpose of this research is to study the behavior of structures submitted to differential settlement in order to assess their vulnerability, taking into consideration the different sources of uncertainty. An analytical approach is applied to investigate, on the one hand, the influence of the uncertainties related to the soil and, on the other hand, the variation of structure stiffness with the presence of openings and the movement transmitted between soil and structure as related to the origin and shape of the free-field movement. Results reveal the effect of taking these uncertainties into consideration and specify the dominant and most significant parameters that control the ground movement associated with the Soil-Structure Interaction (SSI) phenomenon.

Keywords: analytical approach, building, damage, differential settlement, soil-structure interaction, uncertainties

Procedia PDF Downloads 238
15790 Numerical Modeling and Prediction of Nanoscale Transport Phenomena in Vertically Aligned Carbon Nanotube Catalyst Layers by the Lattice Boltzmann Simulation

Authors: Seungho Shin, Keunwoo Choi, Ali Akbar, Sukkee Um

Abstract:

In this study, the nanoscale transport properties and catalyst utilization of vertically aligned carbon nanotube (VACNT) catalyst layers are computationally predicted by a three-dimensional lattice Boltzmann simulation based on a quasi-random nanostructural model, with the aim of improving fuel cell catalyst performance. A series of catalyst layers are randomly generated with statistical significance at the 95% confidence level to reflect the heterogeneity of the catalyst layer nanostructures. The nanoscale gas transport phenomena inside the catalyst layers are simulated by the D3Q19 (i.e., three-dimensional, 19 velocities) lattice Boltzmann method, and the corresponding mass transport characteristics are mathematically modeled in terms of structural properties. Considering the nanoscale reactant transport phenomena, a transport-based effective catalyst utilization factor is defined and statistically analyzed to determine the influence of structure on catalyst utilization. The tortuosity of the reactant mass transport path in the VACNT catalyst layers is directly calculated from the streaklines. Subsequently, the corresponding effective mass diffusion coefficient is statistically predicted by applying the pre-estimated tortuosity factors to the Knudsen diffusion coefficient in the VACNT catalyst layers. The statistical estimation results clearly indicate that the morphological structure of VACNT catalyst layers reduces the tortuosity of the reactant mass transport path compared to conventional catalyst layers and consequently yields a significantly higher effective mass diffusion coefficient. Furthermore, catalyst utilization of the VACNT catalyst layer is substantially improved by enhanced mass diffusion and electric current paths, despite the relatively poor interconnection of the ion transport paths.
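
As an illustration of the post-processing step described above, the following minimal Python sketch converts a streakline-derived tortuosity into an effective diffusivity, assuming the common porous-media relation D_eff = (ε/τ)·D_Kn with the Knudsen coefficient D_Kn = (d_p/3)·sqrt(8RT/πM); the pore size, temperature, porosity, and tortuosity values are illustrative, not taken from the paper.

```python
import numpy as np

def knudsen_diffusivity(pore_diameter_m, temperature_K, molar_mass_kg_per_mol):
    """Knudsen diffusion coefficient D_Kn = (d_p / 3) * sqrt(8 R T / (pi M))."""
    R = 8.314  # J/(mol K)
    mean_speed = np.sqrt(8.0 * R * temperature_K / (np.pi * molar_mass_kg_per_mol))
    return pore_diameter_m / 3.0 * mean_speed

def effective_diffusivity(d_kn, porosity, tortuosity):
    """Common porous-media correction: D_eff = (porosity / tortuosity) * D_Kn."""
    return porosity / tortuosity * d_kn

# Illustrative numbers only (not from the paper): O2 in ~50 nm pores at 353 K.
d_kn = knudsen_diffusivity(50e-9, 353.0, 32e-3)
tortuosities = np.array([1.2, 1.5, 2.0])   # e.g. streakline-derived samples
d_eff = effective_diffusivity(d_kn, porosity=0.6, tortuosity=tortuosities)
print(d_kn, d_eff.mean(), d_eff.std())
```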

Keywords: Lattice Boltzmann method, nano transport phenomena, polymer electrolyte fuel cells, vertically aligned carbon nanotube

Procedia PDF Downloads 201
15789 UEMSD Risk Identification: Case Study

Authors: K. Sekulová, M. Šimon

Abstract:

The article demonstrates, through a case study, how MSD risk can be identified. It is based on a risk identification model, developed in a dissertation, of the formation of occupational diseases in relation to work activity; the model determines which risks can endanger workers who are exposed to specific risk factors and is evaluated using statistical calculations. These risk factors are the main cause of upper-extremity musculoskeletal disorders.

Keywords: case study, upper-extremity musculoskeletal disorders, ergonomics, risk identification

Procedia PDF Downloads 502
15788 Legal Allocation of Risks: A Computational Analysis of Force Majeure Clauses

Authors: Farshad Ghodoosi

Abstract:

This article analyzes the effect of supervening events on contracts. Contracts serve an important function: the allocation of risks. In spite of its importance, the case law and the doctrine are messy and inconsistent. This article provides a fresh look at excuse doctrines (i.e., force majeure, impracticability, impossibility, and frustration) with a focus on force majeure clauses. The article makes the following contributions: First, it furnishes a new conceptual and theoretical framework of excuse doctrines. By distilling the decisions, it shows that excuse doctrines rest on the triangle of control, foreseeability, and contract language. Second, it analyzes force majeure clauses used by S&P 500 companies to understand the stickiness and similarity of such clauses and the events they cover. Third, using computational and statistical tools, it analyzes US cases since 1810 in order to assess the weight given to the triangle of control, foreseeability, and contract language. It shows that the control factor plays an important role in force majeure analysis, while contractual interpretation is the least important factor. The article concludes that it is the standard for control (whether the supervening event is beyond the control of the party) that determines the outcome of cases in the force majeure context, and not necessarily the contractual language. This article has important implications for COVID-19-related contractual cases. Unlike the prevailing narrative that it is the language of the force majeure clause that is determinative, this article shows that the primary focus of the inquiry will be on whether the effects of COVID-19 have been beyond the control of the promisee. Normatively, the article suggests that the trifactor of control, foreseeability, and contractual language is not effective for the allocation of legal risks in times of crisis. It puts forward a novel approach to force majeure clauses whereby courts should instead focus on the degree to which parties have relied on (expected) performance, in particular during a time of crisis.
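
As a hedged illustration of how clause "stickiness" could be quantified (the paper's actual computational pipeline is not reproduced here), the sketch below measures pairwise similarity of hypothetical force majeure clauses with TF-IDF vectors and cosine similarity.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical clause texts; the study's actual S&P 500 corpus is not reproduced here.
clauses = [
    "Neither party shall be liable for delays caused by acts of God, war, or strikes.",
    "Neither party is liable for failure to perform due to acts of God, war, or labor disputes.",
    "Seller may suspend performance upon the occurrence of events beyond its reasonable control.",
]

vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(clauses)

# Pairwise cosine similarity: values near 1 indicate near-boilerplate ("sticky") clauses.
similarity = cosine_similarity(tfidf)
print(similarity.round(2))
```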

Keywords: contractual risks, force majeure clauses, foreseeability, control, contractual language, computational analysis

Procedia PDF Downloads 152
15787 Development of DNDC Modelling Method for Evaluation of Carbon Dioxide Emission from Arable Soils in European Russia

Authors: Olga Sukhoveeva

Abstract:

Carbon dioxide (CO2) is the main component of the carbon biogeochemical cycle and one of the most important greenhouse gases (GHG). Agriculture, and arable soils in particular, is one of the largest sources of GHG emissions, including CO2, to the atmosphere. Models may be used for the estimation of GHG emissions from agriculture if they can be adapted to different countries' conditions. The only model officially used at the national level for this purpose, in the United Kingdom and China, is DNDC (DeNitrification-DeComposition). In our research, the DNDC model is proposed for the estimation of GHG emissions from arable soils in Russia. The aim of our research was to create a method of using DNDC for the evaluation of CO2 emissions in Russia based on official statistical information. The target territory was the European part of Russia, where many field experiments are located. In the first step of the research, a database of climate, soil, and cropping characteristics for the target region was created from governmental, statistical, and literature sources. The All-Russia Research Institute of Hydrometeorological Information – World Data Centre provides open daily data on average meteorological and climatic conditions, from which spatial average values of maximum and minimum air temperature and precipitation over the region must be calculated. Spatial average values of soil characteristics (soil texture, bulk density, pH, soil organic carbon content) can be determined on the basis of the Union State Register of Soil Resources of Russia. Cropping technologies are published by agricultural research institutes and departments. We propose defining the cropping system parameters (annual information about crop yields and the amounts and types of fertilizers and manure) on the basis of Federal State Statistics Service data. The carbon content of plant biomass may be calculated via formulas developed and published by the Ministry of Natural Resources and Environment of the Russian Federation. In the second step, CO2 emissions from soils in this region were calculated by DNDC. The modelled data were compared with empirical and literature data, and good results were obtained: modelled values were equivalent to the measured ones. It was revealed that the DNDC model may be used to evaluate and forecast CO2 emissions from arable soils in Russia based on official statistical information. It can also be used for creating programs to decrease GHG emissions from arable soils to the atmosphere. Financial support: fundamental scientific research theme 0148-2014-0005 No 01201352499 'Solution of fundamental problems of analysis and forecast of Earth climatic system condition' for 2014-2020; fundamental research program of the Presidium of RAS No 51 'Climate change: causes, risks, consequences, problems of adaptation and regulation' for 2018-2020.
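
The first preprocessing step described above, region-wide spatial averages of the daily station data, could look like the following sketch; the station records are hypothetical placeholders for the World Data Centre files.

```python
import pandas as pd

# Hypothetical station records; the real data come from the All-Russia Research
# Institute of Hydrometeorological Information – World Data Centre.
records = pd.DataFrame({
    "date": ["2015-06-01", "2015-06-01", "2015-06-02", "2015-06-02"],
    "station": ["A", "B", "A", "B"],
    "t_max": [24.1, 22.8, 25.3, 23.9],   # deg C
    "t_min": [11.2, 10.5, 12.0, 11.4],   # deg C
    "precip": [0.0, 1.2, 3.4, 2.8],      # mm
})
records["date"] = pd.to_datetime(records["date"])

# Region-wide daily means, in the shape a DNDC climate input file would need.
regional_daily = records.groupby("date")[["t_max", "t_min", "precip"]].mean()
print(regional_daily)
```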

Keywords: arable soils, carbon dioxide emission, DNDC model, European Russia

Procedia PDF Downloads 192
15786 Electrophysiological Correlates of Statistical Learning in Children with and without Developmental Language Disorder

Authors: Ana Paula Soares, Alexandrina Lages, Helena Oliveira, Francisco-Javier Gutiérrez-Domínguez, Marisa Lousada

Abstract:

From an early age, exposure to a spoken language allows us to implicitly capture the structure underlying the succession of speech sounds in that language and to segment it into meaningful units (words). Statistical learning (SL), i.e., the ability to pick up patterns in the sensory environment even without the intention or consciousness of doing so, is thus assumed to play a central role in the acquisition of the rule-governed aspects of language and possibly to lie behind the language difficulties exhibited by children with developmental language disorder (DLD). The research conducted so far has, however, led to inconsistent results, which might stem from the behavioral tasks used to test SL. In a classic SL experiment, participants are first exposed to a continuous stream (e.g., of syllables) in which, unbeknownst to the participants, stimuli are grouped into triplets that always appear together in the stream (e.g., 'tokibu', 'tipolu'), with no pauses between them (e.g., 'tokibutipolugopilatokibu') and without any information regarding the task or the stimuli. Following exposure, SL is assessed by asking participants to discriminate triplets previously presented ('tokibu') from new sequences never presented together during exposure ('kipopi'), i.e., to perform a two-alternative forced-choice (2-AFC) task. Despite the widespread use of the 2-AFC task to test SL, it has come under increasing criticism, as it is an offline post-learning task that only assesses the result of the learning that occurred during the previous exposure phase and that might be affected by factors beyond the computation of the regularities embedded in the input (regularities typically quantified by the likelihood of two syllables occurring together, a statistic known as transitional probability, TP). One solution to overcome these limitations is to assess SL as exposure to the stream unfolds, using online techniques such as event-related potentials (ERP), which are highly sensitive to the time course of learning in the brain. Here we collected ERPs to examine the neurofunctional correlates of SL in preschool children with DLD and in chronological-age-matched controls with typical language development (TLD), who were exposed to an auditory stream containing eight three-syllable nonsense words, four presenting high TPs and the other four low TPs, to further analyze whether the ability of DLD and TLD children to extract word-like units from the stream was modulated by the words' predictability. Moreover, to ascertain whether previous knowledge of the to-be-learned regularities affected the neural responses to high- and low-TP words, children performed the auditory SL task first under implicit and subsequently under explicit conditions. Although behavioral evidence of SL was not obtained in either group, the neural responses elicited during the exposure phases of the SL tasks differentiated children with DLD from children with TLD. Specifically, the results indicated that only children from the TLD group showed neural evidence of SL, particularly in the SL task performed under explicit conditions, first for the low-TP and subsequently for the high-TP 'words'. Taken together, these findings support the view that children with DLD show deficits in the extraction of the regularities embedded in the auditory input, which might underlie their language difficulties.
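
To make the TP statistic concrete, here is a minimal sketch computing forward transitional probabilities over a toy syllable stream; the syllables echo those in the abstract, but the stream itself is invented for illustration.

```python
from collections import Counter

# Hypothetical stream of syllables; in the task it is built by concatenating
# triplets such as 'to-ki-bu' and 'ti-po-lu' with no pauses.
stream = ["to", "ki", "bu", "ti", "po", "lu", "go", "pi", "la",
          "to", "ki", "bu", "go", "pi", "la", "ti", "po", "lu"]

syllable_counts = Counter(stream[:-1])               # occurrences of X as a first element
pair_counts = Counter(zip(stream[:-1], stream[1:]))  # occurrences of the pair X -> Y

def transitional_probability(x, y):
    """Forward TP(Y|X) = count(XY) / count(X)."""
    return pair_counts[(x, y)] / syllable_counts[x]

print(transitional_probability("to", "ki"))  # within-word pair -> high TP (1.0 here)
print(transitional_probability("bu", "ti"))  # across-word pair -> lower TP
```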

Keywords: developmental language disorder, statistical learning, transitional probabilities, word segmentation

Procedia PDF Downloads 188
15785 Systems Engineering Management Using Transdisciplinary Quality System Development Lifecycle Model

Authors: Mohamed Asaad Abdelrazek, Amir Taher El-Sheikh, M. Zayan, A.M. Elhady

Abstract:

The successful realization of complex systems depends not only on the technology issues and the process for implementing them but on the management issues as well. Managing the systems development lifecycle requires technical management; systems engineering management provides that technical management and is accomplished by incorporating many activities. The three major activities are development phasing, the systems engineering process, and lifecycle integration. Systems engineering management activities are performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking the development activities, new ways to carry out systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) Model has been modified and integrated with Quality Function Deployment (QFD). Hereinafter, this systematic approach is named the Transdisciplinary Quality System Development Lifecycle (TQSDL) Model. QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on Axiomatic Design, developed by Suh, which is applicable to all designs: products, processes, systems, and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that customer requirements are fulfilled and that all systems engineering management roles and activities are satisfied.
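
For the QFD step mentioned above (translating the VOC into measurable technical characteristics), a minimal numerical sketch of the standard house-of-quality priority calculation is shown below; the requirements, characteristics, and scores are hypothetical, not drawn from the paper.

```python
import numpy as np

# Hypothetical QFD (house of quality) fragment: how strongly each measurable
# technical characteristic (columns) relates to each customer requirement (rows).
# Standard QFD scale: 9 = strong, 3 = moderate, 1 = weak, 0 = none.
relationship = np.array([
    [9, 3, 0],   # "easy to operate"
    [3, 9, 1],   # "reliable in service"
    [0, 3, 9],   # "low lifecycle cost"
])
customer_importance = np.array([5, 4, 3])  # voice-of-customer weights (1-5)

# Absolute and relative priorities of the technical characteristics.
priority = customer_importance @ relationship
relative = priority / priority.sum()
print(priority, relative.round(2))
```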

Keywords: axiomatic design, quality function deployment, systems engineering management, system development lifecycle

Procedia PDF Downloads 364
15784 Raising Multilingual Awareness towards Plurilingual Competence Development: Through Which Approach and Which Pedagogical Material-A Case Study in the Greek Primary Education

Authors: Eftychia Damaskou

Abstract:

This article intends to address the question of the adequate approach for teaching multilingualism within public education. Linguistic education, as defined by the Common European Framework of Reference for Languages, is no longer proficiency in one or two languages; it is about the development of a linguistic repertoire in which all linguistic skills find their place. In fact, the linguistic theories that frame the development of plurilingual competence point out the affective and intercultural aspects of such a process, insisting on an awareness of linguistic diversification rather than the acquisition of communicative competence in many languages. In this spirit, our article attempts to go beyond mere plurilingual awareness and presents research based on a classroom experiment with 115 pupils aiming at the development of plurilingual competence in five unfamiliar foreign languages. The experiment was conducted through a teaching unit personally conceived and applied by the author, consisting of a series of six activities based on a cross-linguistic content approach. The data analysis proves to be very interesting, as it reveals the development of plurilingual competences as well as positive attitudes towards less common languages among the majority of our sample.

Keywords: multilingual awareness, multilingual teaching material, plurilingual competence

Procedia PDF Downloads 454
15783 Global Best Practice Paradox; the Failure of One Size Fits All Approach to Development a Case Study of Pakistan

Authors: Muhammad Naveed Iftikhar, Farah Khalid

Abstract:

Global best practices, as ordained by international organizations, comprise a broad top-down approach to development problems that does not take country-specific factors into account. The political economy of each country is different, and the failure of several attempts by international organizations to implement global best practice models in developing countries, each with its unique set of variables, goes to show that this is not the most efficient solution to development problems. This paper is a humble attempt at shedding light on some specific examples of failures of the global best practices. Pakistan has its unique set of problems, and unless those are added to the broader equation of development, country-specific reform and growth will continue to pose a challenge to reform programs initiated by international organizations. The three case studies presented in this paper are just a few prominent examples of the failure of the global best practice, top-down, universalistic approach to development as ordained by international organizations. Development and reform can only be achieved if local dynamics are given their due importance. The modus operandi of international organizations needs to be tailored according to each country's unique politico-economic environment.

Keywords: best practice, development, context

Procedia PDF Downloads 474
15782 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research combined knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device that uses an LM35 sensor to measure weather parameters, together with artificial intelligence (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were exposed to the same atmospheric conditions for 180 days to collect data (temperature, relative humidity, and pressure). The acquired data were used to train ANN and ARIMA models in the MATLAB R2012b environment to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R²), and Mean Percentage Error (MPE) were used as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results from operating the developed device show that it has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
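
Although the study's models were built in MATLAB, the workflow of fitting an ARIMA model to a rainfall series and scoring it with RMSE, MAE, and MPE can be sketched in Python as follows; the series is synthetic and the ARIMA order is chosen for illustration only.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic daily rainfall series standing in for the 180-day acquisition period.
rain = np.clip(rng.gamma(shape=2.0, scale=1.5, size=180) - 1.0, 0.0, None)

train, test = rain[:150], rain[150:]
model = ARIMA(train, order=(2, 0, 1)).fit()   # order chosen for illustration only
forecast = model.forecast(steps=len(test))

rmse = np.sqrt(np.mean((test - forecast) ** 2))
mae = np.mean(np.abs(test - forecast))
# MPE computed over non-zero-rain days only, to avoid division by zero.
mpe = np.nanmean((test - forecast) / np.where(test == 0, np.nan, test)) * 100
print(rmse, mae, mpe)
```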

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 93
15781 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. Because of this obstacle, the diffusion of eco-design and LCA methods in the manufacturing sectors could prove impossible. This article addresses the research question: how can the LCA method be adapted so that it can be generalized massively and its performance improved? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analyzed the literature to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated model construction is presented (reconciliation and implementation of the data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are thus generated automatically. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and hence the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the case study also shows the potential of the proposed method for a wide range of applications.
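
To give a flavor of the automated data reconciliation described above, the sketch below maps a hypothetical bill of materials to hypothetical emission factors and totals the impact; it is a simplified stand-in, not the authors' tool or datasets.

```python
import pandas as pd

# Hypothetical bill of materials extracted from product data (not the paper's dataset).
bom = pd.DataFrame({
    "component": ["housing", "cable", "connector"],
    "material": ["ABS", "copper", "brass"],
    "mass_kg": [0.120, 0.045, 0.010],
})

# Hypothetical emission factors, standing in for a background LCI database.
factors = pd.DataFrame({
    "material": ["ABS", "copper", "brass"],
    "kg_co2e_per_kg": [3.1, 2.8, 2.5],
})

# Automated reconciliation: map each BOM line to its dataset and compute the impact.
inventory = bom.merge(factors, on="material", how="left")
inventory["kg_co2e"] = inventory["mass_kg"] * inventory["kg_co2e_per_kg"]
print(inventory)
print("Total climate change impact (kg CO2e):", inventory["kg_co2e"].sum())
```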

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 90
15780 Design for Safety: Safety Consideration in Planning and Design of Airport Airsides

Authors: Maithem Al-Saadi, Min An

Abstract:

During the airport planning and design stages, the major issues of capacity and safety in the construction and operation of an airport need to be taken into consideration. The airside of an airport is a major and critical piece of infrastructure that usually consists of runway(s), a taxiway system, apron(s), etc., which have to be designed according to international standards and recommendations, and within local limitations, to accommodate the forecasted demand. However, in many cases, airport airsides suffer from unexpected risks that occur during airport operations. Therefore, safety risk assessment should be applied in the planning and design of airsides to cope with the probability of risks and their consequences, and to make decisions that reduce the risks to as low as reasonably practicable (ALARP). This paper presents a combined approach of Failure Modes, Effects, and Criticality Analysis (FMECA), the Fuzzy Reasoning Approach (FRA), and the Fuzzy Analytic Hierarchy Process (FAHP) to develop a risk analysis model for safety risk assessment. An illustrative example is used to demonstrate the risk assessment process and to show how the design of an airport airside can be analysed using the proposed safety design risk assessment model.
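
As a minimal illustration of the FMECA building block in the combined model (before any fuzzy weighting by FRA/FAHP), the sketch below ranks hypothetical airside failure modes by the classic risk priority number; the failure modes and scores are invented for illustration.

```python
import pandas as pd

# Hypothetical airside failure modes scored on the usual FMECA 1-10 scales
# (severity, occurrence, detectability); values are illustrative only.
modes = pd.DataFrame({
    "failure_mode": ["runway incursion", "taxiway pavement failure", "apron congestion"],
    "severity": [9, 6, 4],
    "occurrence": [3, 4, 7],
    "detection": [4, 3, 2],
})

# Classic criticality ranking via the risk priority number (RPN = S * O * D).
modes["rpn"] = modes["severity"] * modes["occurrence"] * modes["detection"]
print(modes.sort_values("rpn", ascending=False))
```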

Keywords: airport airside planning and design, design for safety, fuzzy reasoning approach, fuzzy AHP, risk assessment

Procedia PDF Downloads 368
15779 Adversarial Disentanglement Using Latent Classifier for Pose-Independent Representation

Authors: Hamed Alqahtani, Manolya Kavakli-Thorne

Abstract:

The large pose discrepancy is one of the critical challenges in face recognition during video surveillance. Due to the entanglement of pose attributes with identity information, conventional approaches to pose-independent representation fall short of providing quality results when recognizing faces with large pose variations. In this paper, we propose a practical approach that disentangles the pose attribute from the identity information, followed by synthesis of a face using a classifier network in latent space. The proposed approach employs a modified generative adversarial network framework consisting of an encoder-decoder structure embedded with a classifier in manifold space for carrying out factorization of the latent encoding. It can be further generalized to other face and non-face attributes for real-life video frames containing faces with significant attribute variations. Experimental results and comparison with the state of the art in the field show that the learned representation of the proposed approach synthesizes more compelling perceptual images through a combination of adversarial and classification losses.

Keywords: disentanglement, face detection, generative adversarial networks, video surveillance

Procedia PDF Downloads 130
15778 Advanced Statistical Approaches for Identifying Predictors of Poor Blood Pressure Control: A Comprehensive Analysis Using Multivariable Logistic Regression and Generalized Estimating Equations (GEE)

Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei

Abstract:

Effective management of hypertension remains a critical public health challenge, particularly among racially and ethnically diverse populations. This study employs sophisticated statistical models to rigorously investigate the predictors of poor blood pressure (BP) control, with a specific focus on demographic, socioeconomic, and clinical risk factors. Leveraging a large sample of 19,253 adults drawn from the National Health and Nutrition Examination Survey (NHANES) across three distinct time periods (2013-2014, 2015-2016, and 2017-2020), we applied multivariable logistic regression and generalized estimating equations (GEE) to account for the clustered structure of the data and potential within-subject correlations. Our multivariable models identified significant associations between poor BP control and several key predictors, including race/ethnicity, age, gender, body mass index (BMI), prevalent diabetes, and chronic kidney disease (CKD). Non-Hispanic Black individuals consistently exhibited higher odds of poor BP control across all periods (OR = 1.99; 95% CI: 1.69, 2.36 for the overall sample; OR = 2.33; 95% CI: 1.79, 3.02 for 2017-2020). Younger age groups demonstrated substantially lower odds of poor BP control compared to individuals aged 75 and older (OR = 0.15; 95% CI: 0.11, 0.20 for ages 18-44). Men also had a higher likelihood of poor BP control relative to women (OR = 1.55; 95% CI: 1.31, 1.82), while BMI ≥35 kg/m² (OR = 1.76; 95% CI: 1.40, 2.20) and the presence of diabetes (OR = 2.20; 95% CI: 1.80, 2.68) were associated with increased odds of poor BP management. Further analysis using GEE models, accounting for temporal correlations and repeated measures, confirmed the robustness of these findings. Notably, individuals with chronic kidney disease displayed markedly elevated odds of poor BP control (OR = 3.72; 95% CI: 3.09, 4.48), with significant differences across the survey periods. Additionally, higher education levels and better self-reported diet quality were associated with improved BP control. College graduates exhibited a reduced likelihood of poor BP control (OR = 0.64; 95% CI: 0.46, 0.89), particularly in the 2015-2016 period (OR = 0.48; 95% CI: 0.28, 0.84). Similarly, excellent dietary habits were associated with significantly lower odds of poor BP control (OR = 0.64; 95% CI: 0.44, 0.94), underscoring the importance of lifestyle factors in hypertension management. In conclusion, our findings provide compelling evidence of the complex interplay between demographic, clinical, and socioeconomic factors in predicting poor BP control. The application of advanced statistical techniques such as GEE enhances the reliability of these results by addressing the correlated nature of repeated observations. This study highlights the need for targeted interventions that consider racial/ethnic disparities, clinical comorbidities, and lifestyle modifications in improving BP control outcomes.
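
For readers unfamiliar with the modelling machinery, the following sketch fits a GEE logistic model with an exchangeable working correlation in Python's statsmodels, mirroring the kind of clustered binary-outcome analysis described above; the data are synthetic and the variable set is simplified, so the coefficients bear no relation to the reported odds ratios.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600
# Synthetic stand-in for NHANES-style records; variables and effect sizes are invented.
df = pd.DataFrame({
    "subject_id": np.arange(n) // 3,              # repeated observations per subject
    "age": rng.integers(18, 85, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(29, 6, n),
    "diabetes": rng.integers(0, 2, n),
    "ckd": rng.integers(0, 2, n),
})
logit = (-3 + 0.03 * df["age"] + 0.4 * df["male"] + 0.05 * df["bmi"]
         + 0.8 * df["diabetes"] + 1.2 * df["ckd"])
df["poor_bp"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE logistic model with an exchangeable working correlation for within-subject clustering.
model = smf.gee("poor_bp ~ age + male + bmi + diabetes + ckd",
                groups="subject_id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(np.exp(result.params))   # odds ratios
```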

Keywords: hypertension, blood pressure, NHANES, generalized estimating equations

Procedia PDF Downloads 16
15777 The Role of the Injured Party's Fault in the Apportionment of Damages in Tort Law: A Comparative-Historical Study between Common Law and Islamic Law

Authors: Alireza Tavakoli Nia

Abstract:

In order to understand the role of the injured party's fault in dividing liability, we studied its historical background. In common law, the traditional contributory negligence rule was a complete defense; the legislature and judicial practice later modified that rule into one of apportionment. In Islamic law, too, the Action rule was at first used only when the injured party was the sole cause, but jurists expanded the scope of this rule so that it was also applied in cases where both the injured party's fault and that of the other party are involved. There are several popular approaches to the apportionment of damages. Some common law countries, like Britain, have chosen 'the causal potency approach' and 'fixed apportionment'. Islamic countries, like Iran, have chosen both 'the relative blameworthiness' and 'equal apportionment' approaches. The article concludes that both common law and Islamic law believe in the division of responsibility between a wrongdoer claimant and the defendant. In the apportionment of responsibility, however, Islamic law mostly favors equal apportionment, which is far simpler and saves time and money, whereas common law legal systems have chosen the causal potency approach, which is more complicated than the rival approach but fairer.

Keywords: contributory negligence, tort law, damage apportionment, common law, Islamic law

Procedia PDF Downloads 148
15776 Compilation and Statistical Analysis of an Arabic-English Legal Corpus in Sketch Engine

Authors: C. Brierley, H. El-Farahaty, A. Farhan

Abstract:

The Leeds Parallel Corpus of Arabic-English Constitutions is a parallel corpus for the Arabic legal domain. Analysis of legal language via corpus linguistics techniques is an important development. In legal proceedings, a corpus-based approach to disambiguating meaning is set to replace the dictionary as an interpretative tool, and legal scholarship in the United States is now attuned to the potential of text analytics over vast quantities of text-based legal material, following the business and medical industries. This trend is reflected in Europe: the interdisciplinary research group in Computer Assisted Legal Linguistics mines big data collections of legal and non-legal texts to analyse legal interpretations, legal discourse, the comprehensibility of legal texts, conflict resolution, and linguistic human rights. This paper focuses on 'dignity' as an important aspect of the overarching concept of human rights in current constitutions across the Arab world. We have compiled a parallel Arabic-English raw text corpus (169,861 Arabic words and 205,893 English words) from reputable websites such as the World Intellectual Property Organisation and CONSTITUTE, and uploaded and queried our corpus in Sketch Engine. Our most challenging task was sentence-level alignment of the Arabic-English data. This entailed manual intervention to ensure correspondence on a one-to-many basis, since Arabic sentences differ from English ones in length and punctuation. We have searched for morphological variants of 'dignity' (كرامة, karāma) in the Arabic data and inspected their English translation equivalents. The term occurs most frequently in the Sudanese constitution (10 instances) and not at all in the constitution of Palestine. Its most frequent collocate, determined via the logDice statistic in Sketch Engine, is 'human', as in 'human dignity'.
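
The logDice association score used above has a simple closed form, logDice = 14 + log2(2·f(x,y) / (f(x) + f(y))), which the following sketch computes for hypothetical frequency counts (not the Leeds corpus figures).

```python
import math

def log_dice(f_xy, f_x, f_y):
    """logDice = 14 + log2(2 * f_xy / (f_x + f_y)); maximum 14, higher = stronger collocation."""
    return 14 + math.log2(2 * f_xy / (f_x + f_y))

# Hypothetical frequencies: 'dignity' co-occurring with 'human' versus a weaker collocate.
print(log_dice(f_xy=8, f_x=10, f_y=40))   # strong collocation
print(log_dice(f_xy=1, f_x=10, f_y=500))  # weak collocation
```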

Keywords: Arabic constitution, corpus-based legal linguistics, human rights, parallel Arabic-English legal corpora

Procedia PDF Downloads 183
15775 The Influence of Market Attractiveness and Core Competence on Value Creation Strategy and Competitive Advantage and Its Implication on Business Performance

Authors: Firsan Nova

Abstract:

The average Indonesian watches 5.5 hours of TV a day. With a population of 242 million people and a Free-to-Air (FTA) TV penetration rate of 56%, that equates to 745 million hours of television watched each day. With such potential, it is no wonder that many companies are now attempting to enter the pay TV market. Research firm Media Partner Asia has forecast in its study that the number of Indonesian pay-television subscribers will climb from 2.4 million in 2012 to 8.7 million by 2020, with penetration scaling up from 7 percent to 21 percent. Key drivers of market growth, the study says, include macro trends built around higher disposable income and a rising middle class, with leading players continuing to invest significantly in sales, distribution, and content. New entrants, in the meantime, will boost overall prospects. This study aims to examine and analyze the effect of market attractiveness and core competence on value creation and competitive advantage and their impact on business performance in the pay TV industry in Indonesia. The study uses a strategic management science approach with the census method, in which all members of the population serve as the sample. A verification method is used to examine the relationships between variables. The units of analysis in this research are all Indonesian pay TV business units, totaling 19. The units of observation are the directors and managers of each business unit. Hypothesis testing is performed using Partial Least Squares (PLS). The conclusion of the study shows that market attractiveness affects business performance through value creation and competitive advantage. Appropriate value creation comes from the company's ability to optimize its core competence and exploit market attractiveness. Value creation affects competitive advantage, which can be determined based on the company's ability to create value for customers, and competitive advantage in turn has an impact on business performance.

Keywords: market attractiveness, core competence, value creation, competitive advantage, business performance

Procedia PDF Downloads 350
15774 Discursive Legitimation Strategies in ISIS’ Online Magazine, Dabiq: A Discourse Historical Approach

Authors: Sahar Rasoulikolamaki

Abstract:

ISIS (also known as DAASH) is an Islamic fundamentalist group that has come to be seen as a global threat for its radicalizing approach and its use of online platforms to portray its activities, disseminate its ideology, and carry out recruiting activities. This study is an attempt to carry out a critical discourse analysis of the argumentative devices by which ISIS legitimizes or delegitimizes positive or negative constructions of social practices in Dabiq. It tries to shed light on how texts in Dabiq, as linguistic elements at the micro level of analysis, relate to ISIS' ideology at the higher macro level; in other words, how local structures contribute to the construction and transference of a global structure or ideology, and vice versa. Therefore, following the relevant analytical frameworks, the study focuses on both the micro-level analysis of arguments (topoi) and the macro-structure of legitimation and delegitimation in Dabiq. This purpose is pursued using the analytical categories and tools provided by Wodak's Discourse-Historical Approach (DHA), such as argumentation strategies (topoi), through which the coded language of legitimation/delegitimation and persuasion used in Dabiq is explored. The findings demonstrate that Dabiq relies heavily on positive representation of the in-group's course of action and justification of its violence and, at the same time, on negative representation of out-group behavior, implementing various topoi to achieve its desired outcomes: ideological manipulation, a powerful self-depiction, and supporter recruitment.

Keywords: argumentation, discourse-historical approach, ideology, legitimation and delegitimation, topoi

Procedia PDF Downloads 136
15773 Arabic as a Foreign Language in the Curriculum of Higher Education in Nigeria: Problems, Solutions, and Prospects

Authors: Kazeem Oluwatoyin Ajape

Abstract:

The study is concerned with the problem of how to improve the teaching of Arabic as a foreign language in the Nigerian higher education system. The paper traces the historical background of Arabic education in Nigeria and outlines the problems facing the language in Nigerian institutions. It lays down some of the essential foundation work necessary for bringing about systematic and constructive improvements in the Teaching of Arabic as a Foreign Language (TAFL) by answering the following research questions: What is the appropriate medium of instruction in teaching a foreign or second language? What is the position of the English language in the teaching and learning of Arabic/Islamic education? What is the relevance of the present curriculum of Arabic/Islamic education in Nigerian institutions to contemporary society? A survey of the literature indicates that a revolution is currently taking place in FL teaching and that a new approach, known as the Communicative Approach (CA), has begun to emerge and influence the teaching of FLs in general over the last decade or so. Since the CA is currently being adapted to the teaching of most major FLs, and since this revolution has not yet had much impact on TAFL, the study explores the possibility of applying the CA to the teaching of Arabic as a living language and also makes recommendations towards the development of the language in Nigerian institutions of higher learning.

Keywords: Arabic Language, foreign language, Nigerian institutions, curriculum, communicative approach

Procedia PDF Downloads 613
15772 Self-Image of Police Officers

Authors: Leo Carlo B. Rondina

Abstract:

Self-image is an important factor in improving the self-esteem of personnel. The purpose of the study is to determine the self-image of the police. The respondents were 503 policemen assigned to different police stations in Davao City, chosen through random sampling. With the use of Exploratory Factor Analysis (EFA), the latent constructs of police image were identified as follows: professionalism, obedience, morality, and justice and fairness. Further, ordinal regression indicates a statistically significant effect for ages 21-40, which means that the age of the respondent statistically improves self-image.
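
An EFA of the kind used above can be sketched as follows; the survey responses are simulated, and the items and loadings are invented, so the four recovered factors stand in for, but are not, the constructs reported in the study.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
# Synthetic survey responses (503 respondents x 12 items) with a planted 4-factor structure.
latent = rng.normal(size=(503, 4))
loadings = rng.uniform(0.4, 0.9, size=(4, 12)) * (rng.random((4, 12)) > 0.6)
items = latent @ loadings + rng.normal(scale=0.5, size=(503, 12))

# Exploratory factor analysis with a varimax rotation, retaining four factors
# (professionalism, obedience, morality, justice and fairness in the study).
fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(items)
print(np.round(fa.components_, 2))   # rotated factor loadings (factors x items)
```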

Keywords: police image, exploratory factor analysis, ordinal regression, Galatea effect

Procedia PDF Downloads 289
15771 Social Identification among Employees: A System Dynamic Approach

Authors: Muhammad Abdullah, Salman Iqbal, Mamoona Rasheed

Abstract:

Social identity among people is an important source of pride and self-esteem; consequently, people struggle to preserve a positive perception of their groups and collectives. The purpose of this paper is to explain the process of social identification and to highlight the underlying causal factors of social identity among employees. There is little research on how the social identity of employees is shaped in Pakistan's organizational culture. This study is based on social identity theory and uses a systems approach as its research methodology. The feedback loop approach is applied to explain the underlying key elements of employee behavior that collectively form social identity among social groups in the corporate arena. The findings of this study reveal that the affective, evaluative, and cognitive components of an individual's personality are associated with social identification. The system dynamics feedback loop approach has revealed the underlying structure associated with social identity and social group formation, and the affective component proved to be the most strongly associated factor. This may also help to explain how social groups become stable and why individuals act according to group requirements. The value of this paper lies in the understanding gained about the underlying key factors that play a crucial role in social group formation in organizations. It may help to understand the rationale behind how employees socially categorize themselves within organizations. It may also help to design effective and more cohesive teams for better operations and long-term results, and to promote knowledge sharing among employees. The underlying structure behind social identification is highlighted with the help of system modeling.

Keywords: affective commitment, cognitive commitment, evaluated commitment, system thinking

Procedia PDF Downloads 138
15770 Cognitive Function and Coping Behavior in the Elderly: A Population-Based Cross-Sectional Study

Authors: Ryo Shikimoto, Hidehito Niimura, Hisashi Kida, Kota Suzuki, Yukiko Miyasaka, Masaru Mimura

Abstract:

Introduction: In Japan, the most aged country in the world, it is important to explore predictive factors of cognitive function among the elderly. Coping behavior relieves chronic stress and improves lifestyle, and consequently may reduce the risk of cognitive impairment. One of the most widely investigated frameworks in previous studies distinguishes approach-oriented and avoidance-oriented coping strategies. The purpose of this study is to investigate the relationship between cognitive function and coping strategies among elderly residents in urban areas of Japan. Method: This is part of the cross-sectional Arakawa geriatric cohort study of 1,099 residents (aged 65 to 86 years; mean [SD] = 72.9 [5.2]). Participants were assessed for cognitive function using the Mini-Mental State Examination (MMSE) and diagnosed by psychiatrists in face-to-face interviews. They were then assessed for their coping behaviors and coping strategies (approach- and avoidance-oriented coping) using a stress and coping inventory. Multiple regression analysis was used to investigate the relationship between MMSE score and each coping strategy. Results: Among the 1,099 participants, the mean MMSE score was 27.2 (SD = 2.7), and the numbers diagnosed as cognitively normal, with mild cognitive impairment (MCI), and with dementia were 815 (74.2%), 248 (22.6%), and 14 (1.3%), respectively. The approach-oriented coping score was significantly associated with MMSE score (B [partial regression coefficient] = 0.12, 95% confidence interval = 0.05 to 0.19) after adjusting for confounding factors including age, sex, and education. Avoidance-oriented coping did not show a significant association with MMSE score (B [partial regression coefficient] = -0.02, 95% confidence interval = -0.09 to 0.06). Conclusion: Approach-oriented coping was clearly associated with neurocognitive function in this Japanese population. A future longitudinal trial is warranted to investigate the protective effects of coping behavior on cognitive function.
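
A confounder-adjusted multiple regression of the kind reported above can be sketched in Python as follows; the cohort data are synthetic stand-ins, so the estimated coefficient and confidence interval are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1099
# Synthetic stand-in for the cohort; variable names and effects are illustrative only.
df = pd.DataFrame({
    "age": rng.integers(65, 87, n),
    "female": rng.integers(0, 2, n),
    "education_years": rng.integers(6, 19, n),
    "approach_coping": rng.normal(10, 3, n),
    "avoidance_coping": rng.normal(10, 3, n),
})
df["mmse"] = (33 - 0.08 * df["age"] + 0.1 * df["education_years"]
              + 0.12 * df["approach_coping"] + rng.normal(0, 2, n)).clip(0, 30)

# MMSE regressed on each coping strategy, adjusted for age, sex, and education,
# mirroring the adjustment set described in the abstract.
model = smf.ols("mmse ~ approach_coping + avoidance_coping + age + female + education_years",
                data=df).fit()
print(model.params["approach_coping"], model.conf_int().loc["approach_coping"].values)
```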

Keywords: approach-oriented coping, cognitive impairment, coping behavior, dementia

Procedia PDF Downloads 131
15769 Recent Climate Variability and Crop Production in the Central Highlands of Ethiopia

Authors: Arragaw Alemayehu, Woldeamlak Bewket

Abstract:

The aim of this study was to understand the influence of current climate variability on crop production in the central highlands of Ethiopia. We used monthly rainfall and temperature data from 132 points, each representing a 10×10 km pixel. The data are reconstructions based on station records and meteorological satellite observations. Production data for the five major crops in the area were collected from the Central Statistical Agency for the period 2004-2013 and for the main cropping season, locally known as Meher. The production data are at the Enumeration Area (EA) level and hence constitute the best available dataset on crop production. The results show statistically significant decreasing trends in March–May (Belg) rainfall in the area. However, June–September (Kiremt) rainfall showed increasing trends in Efratana Gidim and Menz Gera Meder, the latter of which is statistically significant. Annual rainfall also showed positive trends in the area, except in Basona Werana, where significant negative trends were observed. On the other hand, maximum and minimum temperatures showed warming trends in the study area. Correlation results show that crop production and area of cultivation are positively correlated with rainfall and negatively correlated with temperature. When the trends in crop production were investigated, most crops showed negative trends, and below-average production was observed. Regression results show that rainfall was the most important determinant of crop production in the area. It is concluded that current climate variability has a significant influence on crop production in the area and that any unfavorable change in the local climate in the future will have serious implications for household-level food security. Efforts to adapt to ongoing climate change should begin by tackling current climate variability and should take a climate risk management approach.
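
The basic trend and correlation calculations behind findings like those above can be sketched as follows; the rainfall and production series are simulated placeholders, not the reconstructed climate data or the CSA production records.

```python
import numpy as np
from scipy.stats import linregress, pearsonr

rng = np.random.default_rng(4)
years = np.arange(2004, 2014)
# Illustrative series standing in for area-averaged Kiremt rainfall (mm) and
# the production of one crop (tonnes).
rainfall = 900 + 5 * (years - 2004) + rng.normal(0, 60, len(years))
production = 2.0 + 0.002 * (rainfall - 900) + rng.normal(0, 0.1, len(years))

trend = linregress(years, rainfall)        # slope per year and its p-value
r, p = pearsonr(rainfall, production)      # rainfall-production correlation
print(f"rainfall trend: {trend.slope:.1f} mm/yr (p={trend.pvalue:.2f})")
print(f"rainfall-production correlation: r={r:.2f} (p={p:.2f})")
```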

Keywords: central highlands, climate variability, crop production, Ethiopia, regression, trend

Procedia PDF Downloads 438
15768 Experimental Study on Performance of a Planar Membrane Humidifier for a Proton Exchange Membrane Fuel Cell Stack

Authors: Chen-Yu Chen, Wei-Mon Yan, Chi-Nan Lai, Jian-Hao Su

Abstract:

The proton exchange membrane fuel cell (PEMFC) has recently become more important as an alternative energy source. Maintaining proper water content in the membrane is one of the key requirements for optimizing PEMFC performance. The planar membrane humidifier has the advantages of simple structure, low cost, low pressure drop, light weight, reliable performance, and good gas separability. Thus, it is a common external humidifier for PEMFCs. In this work, a planar membrane humidifier for kW-scale PEMFCs is successfully developed. The heat and mass transfer of the humidifier is discussed, and its performance is analyzed in terms of dew point approach temperature (DPAT), water vapor transfer rate (WVTR), and water recovery ratio (WRR). The DPAT of the humidifier with the counter-flow configuration reaches about 6°C under inlet dry air of 50°C and 60% RH and inlet humid air of 70°C and 100% RH. The rate of pressure loss of the humidifier is 5.0×10² Pa/min at a torque of 7 N-m, which meets the standard for commercial planar membrane humidifiers. The tests show that increasing the air flow rate increases the WVTR. However, the DPAT and the WRR are not improved by increasing the WVTR once the air flow rate exceeds the optimal value. In addition, increasing the inlet temperature or the humidity of the dry air decreases the WVTR and the WRR. Nevertheless, the DPAT is improved at elevated inlet temperatures or humidities of the dry air. Furthermore, the performance of the humidifier with the counter-flow configuration is better than that with the parallel-flow configuration. The DPAT difference between the two flow configurations reaches up to 8°C.
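
As a rough illustration of how the DPAT metric can be evaluated from measured inlet and outlet states, the sketch below uses the Magnus dew point approximation and one common definition of DPAT (dew point of the wet-side inlet minus dew point of the humidified dry-side outlet); the exact definition used in the paper and the numerical states here are assumptions, and WVTR/WRR would additionally require the measured mass flow rates.

```python
import math

def sat_vapor_pressure_pa(t_c):
    """Magnus approximation for saturation vapor pressure over water (Pa)."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def dew_point_c(vapor_pressure_pa):
    """Inverse Magnus relation: dew point (deg C) from vapor pressure (Pa)."""
    g = math.log(vapor_pressure_pa / 610.94)
    return 243.04 * g / (17.625 - g)

# Illustrative inlet/outlet states (deg C, relative humidity); not measured values.
wet_in_t, wet_in_rh = 70.0, 1.00     # humid air entering the wet side
dry_out_t, dry_out_rh = 55.0, 0.80   # dry air leaving, after humidification

dew_wet_in = dew_point_c(wet_in_rh * sat_vapor_pressure_pa(wet_in_t))
dew_dry_out = dew_point_c(dry_out_rh * sat_vapor_pressure_pa(dry_out_t))

# Assumed definition: DPAT = wet-inlet dew point minus dry-outlet dew point (smaller is better).
dpat = dew_wet_in - dew_dry_out
print(round(dpat, 1), "deg C")
```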

Keywords: heat and mass transfer, humidifier performance, PEM fuel cell, planar membrane humidifier

Procedia PDF Downloads 307
15767 Evaluation of QSRR Models by Sum of Ranking Differences Approach: A Case Study of Prediction of Chromatographic Behavior of Pesticides

Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević

Abstract:

The present study deals with the selection of the most suitable quantitative structure-retention relationship (QSRR) models to be used in predicting the retention behavior of basic, neutral, acidic, and phenolic pesticides belonging to different classes: fungicides, herbicides, metabolites, insecticides, and plant growth regulators. The sum of ranking differences (SRD) approach can give a different point of view on the selection of the most consistent QSRR model. The SRD approach can be applied not only to rank QSRR models but also to detect similarity or dissimilarity among them. Applying SRD analysis, the most similar models can be found easily. In this study, selection of the best model was carried out on the basis of a reference ranking ('golden standard'), defined as the row-average values of the logarithm of retention time (logtr) determined by high-performance liquid chromatography (HPLC). Also, SRD analysis using experimental logtr values as the reference ranking revealed a grouping of the established QSRR models similar to that already obtained by hierarchical cluster analysis (HCA).
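
The SRD calculation itself is straightforward: rank the objects by each model's predictions and by the reference (here, the row average, as in the study), then sum the absolute rank differences. The sketch below uses hypothetical logtr values, not the study's data.

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical logtr values: rows = pesticides, columns = three candidate QSRR models.
predictions = np.array([
    [1.10, 1.05, 1.30],
    [0.80, 0.95, 0.70],
    [1.60, 1.55, 1.80],
    [1.20, 1.25, 1.00],
    [0.95, 0.90, 1.10],
])
reference = predictions.mean(axis=1)   # row-average reference ranking, as in the study

ref_ranks = rankdata(reference)
# SRD per model: sum of absolute differences between its ranking and the reference ranking.
srd = [np.abs(rankdata(predictions[:, j]) - ref_ranks).sum()
       for j in range(predictions.shape[1])]
print(srd)   # lower SRD = model more consistent with the reference ranking
```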

Keywords: chemometrics, chromatography, pesticides, sum of ranking differences

Procedia PDF Downloads 375
15766 Artificial Intelligence Aided Improvement in Canada's Supply Chain Management

Authors: Mohammad Talebi

Abstract:

Supply chain management is a concern for all countries in the world, yet there is no single approach towards sustainability. Over roughly the past decade, artificial intelligence applications in smart supply chains have come to play a key part. In this paper, applications of artificial intelligence in supply chain management are clarified, and a few suggestions are made regarding Canadian plans for smart supply chain management (SCM). A hierarchical framework for smart SCM can provide a useful roadmap for decision-makers to find the most appropriate approach toward smart SCM. Within this decision-making framework, all the levels involved in the achievement of smart SCM are included. In any case, more attention needs to be paid to available and required infrastructure.

Keywords: smart SCM, AI, SSCM, procurement

Procedia PDF Downloads 89
15765 A Neural Network Approach to Evaluate Supplier Efficiency in a Supply Chain

Authors: Kishore K. Pochampally

Abstract:

The success of a supply chain heavily relies on the efficiency of the suppliers involved. In this paper, we propose a neural network approach to evaluate the efficiency of a supplier, which is being considered for inclusion in a supply chain, using the available linguistic (fuzzy) data of suppliers that already exist in the supply chain. The approach is carried out in three phases, as follows: In phase one, we identify criteria for evaluation of the supplier of interest. Then, in phase two, we use performance measures of already existing suppliers to construct a neural network that gives weights (importance values) of criteria identified in phase one. Finally, in phase three, we calculate the overall rating of the supplier of interest. The following are the major findings of the research conducted for this paper: (i) linguistic (fuzzy) ratings of suppliers such as 'good', 'bad', etc., can be converted (defuzzified) to numerical ratings (1 – 10 scale) using fuzzy logic so that those ratings can be used for further quantitative analysis; (ii) it is possible to construct and train a multi-level neural network in order to determine the weights of the criteria that are used to evaluate a supplier; and (iii) Borda’s rule can be used to group the weighted ratings and calculate the overall efficiency of the supplier.
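
A compact sketch of the phase-three aggregation described above might look like the following; the linguistic-to-numeric mapping, the criterion weights (which the paper obtains from a trained neural network), and the supplier ratings are all hypothetical.

```python
import numpy as np

# Illustrative linguistic ratings of three candidate suppliers on three criteria;
# the defuzzified values below are assumptions, not the paper's membership functions.
linguistic_to_score = {"very poor": 1, "poor": 3, "fair": 5, "good": 7, "very good": 9}
ratings = {
    "supplier_A": ["good", "fair", "very good"],
    "supplier_B": ["very good", "good", "poor"],
    "supplier_C": ["fair", "good", "good"],
}
# Criterion weights; in the paper these come from a neural network trained on the
# performance measures of suppliers already in the chain.
weights = np.array([0.5, 0.3, 0.2])

names = list(ratings)
scores = np.array([[linguistic_to_score[r] for r in ratings[s]] for s in names])

# Borda-style aggregation: rank suppliers on each criterion, weight the Borda points,
# and sum to obtain an overall efficiency score.
borda_points = scores.argsort(axis=0).argsort(axis=0)   # 0 = worst, n-1 = best per criterion
overall = (borda_points * weights).sum(axis=1)
print(dict(zip(names, overall.round(2))))
```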

Keywords: fuzzy data, neural network, supplier, supply chain

Procedia PDF Downloads 115
15764 Preserving Heritage in the Face of Natural Disasters: Lessons from the Bam Experience in Iran

Authors: Mohammad Javad Seddighi, Avar Almukhtar

Abstract:

The occurrence of natural disasters, such as floods and earthquakes, can cause significant damage to heritage sites and surrounding areas. In Iran, the city of Bam was devastated by an earthquake in 2003, which had a major impact on the rivers and watercourses around the city. This study aims to investigate the environmental design techniques and sustainable hazard mitigation strategies that can be employed to preserve heritage sites in the face of natural disasters, using the Bam experience as a case study. The research employs a mixed-methods approach, combining both qualitative and quantitative data collection and analysis methods. The study begins with a comprehensive literature review of recent publications on environmental design techniques and sustainable hazard mitigation strategies in heritage conservation. This is followed by a field study of the rivers and watercourses around Bam, including the Adoori River (Talangoo) and other watercourses, to assess the current conditions and identify potential hazards. The data collected from the field study is analysed using statistical methods and GIS mapping techniques. The findings of this study reveal the importance of sustainable hazard mitigation strategies and environmental design techniques in preserving heritage sites during natural disasters. The study suggests that these techniques can be used to prevent the outbreak of another natural disaster in Bam and the surrounding areas. Specifically, the study recommends the establishment of a comprehensive early warning system, the creation of flood-resistant landscapes, and the use of eco-friendly building materials in the reconstruction of heritage sites. These findings contribute to the current knowledge of sustainable hazard mitigation and environmental design in heritage conservation.

Keywords: natural disasters, heritage conservation, sustainable hazard mitigation, environmental design, landscape architecture, flood management, disaster resilience

Procedia PDF Downloads 88