Search results for: uncertainty principle
1722 Migration in Times of Uncertainty
Authors: Harman Jaggi, David Steinsaltz, Shripad Tuljapurkar
Abstract:
Understanding the effect of fluctuations on populations is crucial in the context of increasing habitat fragmentation, climate change, and biological invasions, among others. Migration in response to environmental disturbances enables populations to escape unfavorable conditions, benefit from new environments and thereby ride out fluctuations in variable environments. Would populations disperse if there is no uncertainty? Karlin showed in 1982 that when sub-populations experience distinct but fixed growth rates at different sites, greater mixing of populations will lower the overall growth rate relative to the most favorable site. Here we ask if and when environmental variability favors migration over no-migration. Specifically, in random environments, would a small amount of migration increase the overall long-run growth rate relative to the zero migration case? We use analysis and simulations to show how long-run growth rate changes with migration rate. Our results show that when fitness (dis)advantages fluctuate over time across sites, migration may allow populations to benefit from variability. When there is one best site with highest growth rate, the effect of migration on long-run growth rate depends on the difference in expected growth between sites, scaled by the variance of the difference. When variance is large, there is a substantial probability of an inferior site experiencing higher growth rate than its average. Thus, a high variance can compensate for a difference in average growth rates between sites. Positive correlations in growth rates across sites favor less migration. With multiple sites and large fluctuations, the length of shortest cycle (excursion) from the best site (on average) matters, and we explore the interplay between excursion length, average differences between sites and the size of fluctuations. Our findings have implications for conservation biology: even when there are superior sites in a sea of poor habitats, variability and habitat quality across space may be key to determining the importance of migration.Keywords: migration, variable-environments, random, dispersal, fluctuations, habitat-quality
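The two-site setting described in this abstract can be illustrated with a minimal simulation sketch: log growth rates are drawn independently for each site, a small symmetric migration fraction mixes the sub-populations, and the long-run growth rate of the total population is compared with the zero-migration case. All parameter values (means, variances, migration rate) are invented for illustration and are not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
T, m = 100_000, 0.02                 # time steps, small migration fraction (assumed)

# Site-specific stochastic growth: site A is better on average; values illustrative only.
mu = np.array([0.02, -0.03])         # mean log growth per step
sigma = np.array([0.30, 0.30])       # std of log growth per step

def long_run_growth(migration):
    """Long-run growth rate of the total population for a given migration rate."""
    n = np.array([0.5, 0.5])         # population shares at the two sites
    log_growth = 0.0
    for _ in range(T):
        r = np.exp(rng.normal(mu, sigma))               # random per-site growth factors
        n = n * r                                        # local growth
        n = (1 - migration) * n + migration * n[::-1]    # symmetric two-site mixing
        total = n.sum()
        log_growth += np.log(total)
        n /= total                                       # renormalize to avoid overflow
    return log_growth / T

print("long-run growth, no migration:", round(long_run_growth(0.0), 4))
print("long-run growth, 2% migration:", round(long_run_growth(m), 4))
```

Running such a sketch with large fluctuations shows how a small migration rate can lift the long-run growth rate above that of the best site alone, which is the effect the abstract analyses.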
Procedia PDF Downloads 139
1721 Estimating Water Balance at Beterou Watershed, Benin Using Soil and Water Assessment Tool (SWAT) Model
Authors: Ella Sèdé Maforikan
Abstract:
Sustained water management requires quantitative information and knowledge of the spatiotemporal dynamics of the hydrological system within the basin, which can be achieved through research. Several studies have investigated both surface water and groundwater in the Beterou catchment; however, there are few published papers on the application of SWAT modeling there. The objective of this study was to evaluate the performance of SWAT in simulating the water balance within the watershed. The input data consist of a digital elevation model, land use maps, a soil map, climatic data and discharge records. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI2) approach. Calibration covered 1989 to 2006 with a four-year warm-up period (1985-1988), and validation covered 2007 to 2020. The goodness of fit of the model was assessed using five indices: Nash–Sutcliffe efficiency (NSE), the ratio of the root mean square error to the standard deviation of measured data (RSR), percent bias (PBIAS), the coefficient of determination (R²), and Kling-Gupta efficiency (KGE). Results showed that the SWAT model successfully simulated river flow in the Beterou catchment, with NSE = 0.79, R² = 0.80 and KGE = 0.83 for calibration, against NSE = 0.78, R² = 0.78 and KGE = 0.85 for validation, using site-based streamflow data. The relative error (PBIAS) ranges from -12.2% to 3.1%. The runoff curve number (CN2), moist bulk density (SOL_BD), baseflow alpha factor (ALPHA_BF), and the available water capacity of the soil layer (SOL_AWC) were the most sensitive parameters. The study provides a basis for further research with uncertainty analysis, recommendations for model improvement, and an efficient means to improve rainfall and discharge measurement data.
Keywords: watershed, water balance, SWAT modeling, Beterou
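The five goodness-of-fit indices cited above have standard definitions, and a short helper computing them from paired observed and simulated streamflow makes the reported values concrete. The sketch below uses the common 2009 form of KGE and synthetic discharge data; it is illustrative only and not the study's evaluation code.

```python
import numpy as np

def gof_metrics(obs, sim):
    """Standard goodness-of-fit indices for simulated vs. observed streamflow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    nse = 1.0 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)      # Nash-Sutcliffe efficiency
    rsr = np.sqrt(np.mean(resid**2)) / obs.std(ddof=0)                 # RMSE / std of observations
    pbias = 100.0 * resid.sum() / obs.sum()                            # percent bias
    r = np.corrcoef(obs, sim)[0, 1]
    r2 = r**2                                                          # coefficient of determination
    alpha = sim.std(ddof=0) / obs.std(ddof=0)                          # variability ratio
    beta = sim.mean() / obs.mean()                                     # bias ratio
    kge = 1.0 - np.sqrt((r - 1)**2 + (alpha - 1)**2 + (beta - 1)**2)   # Kling-Gupta efficiency (2009)
    return {"NSE": nse, "RSR": rsr, "PBIAS": pbias, "R2": r2, "KGE": kge}

# Example with synthetic daily discharge (m^3/s); real use would pass gauge data.
rng = np.random.default_rng(0)
observed = rng.gamma(2.0, 15.0, size=365)
simulated = observed * 0.95 + rng.normal(0, 3.0, size=365)
print(gof_metrics(observed, simulated))
```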
Procedia PDF Downloads 56
1720 A Framework for Assessing and Implementing Ecological-Based Adaptation Solutions in Urban Areas of Shanghai
Authors: Xin Li
Abstract:
The uncertainty and complexity of the urban environment, combined with the threat of climate change, contribute to multi-dimensional vulnerability in Chinese megacities, especially Shanghai. Urban areas occupied by high-value technological infrastructure and dense buildings are under threat from climate change and may provide insufficient ecological services to maintain the trade-offs required for urban sustainable development. Urban ecological-based adaptation (UEbA) combines practice and theoretical work and integrates ecological services into multiple layers of urban environmental planning in order to reduce the impact of this complexity and uncertainty. To understand and respond to these challenges at the urban level, this paper takes Shanghai as its research object, on the premise that its urban adaptation strategies should reflect and contain the concepts and knowledge of EbA. First, we use software to visualize the patterns and trends of UEbA research over the last 10 years; specifically, Citespace software was used to identify the significant hubs and landmark points of the peer-reviewed literature on ecological service research during that period. Second, 135 evidence-based EbA publications were reviewed to categorize the methodologies and frameworks of evidence-based EbA following a systematic map protocol. Finally, a conceptual framework combining cultural, economic and social components was developed in order to assess the current adaptation strategies in Shanghai. This research finds that the key to reducing urban vulnerability lies not only in co-benefit arguments but also in paying more attention to the concept of trade-offs. It concludes that the proposed framework can provide key knowledge and indicate the essential gaps, serving as a valuable tool against climate variability in the process of urban adaptation in Shanghai.
Keywords: urban ecological-based adaptation, climate change, sustainable development, climate variability
Procedia PDF Downloads 155
1719 Limitations of Recent National Enactments on International Crimes: The Case of Kenya, Uganda and Sudan
Authors: Emma Charlene Lubaale
Abstract:
The International Criminal Court (ICC) operates based on the principle of complementarity. On the basis of this principle, states enjoy the primary right to prosecute international crimes, with the ICC intervening only when a state with jurisdiction over an international crime is unable or unwilling to prosecute. To ably exercise their primary right to prosecute international crimes domestically, a number of states are taking steps to criminalise international crimes in their national laws. Significant to note, many of the laws enacted are not being applied in the prosecution of the international crimes allegedly committed. Kenya, Uganda and Sudan are some notable states where commission of international crimes is documented. All these states have recently enacted laws on international crimes. Kenya enacted the International Crimes Act in 2008, Uganda enacted the International Criminal Court Act in 2010 and in 2007, Sudan made provision for international crimes under its Armed Forces Act. However, in all these three states, the enacted national laws on international crimes have thus far not featured in any of the proceedings before these states’ courts. Instead, these states have either relied on ordinary crimes to prosecute international crimes or not prosecuted international crimes altogether. This paper underscores the limitations of the enacted laws, explaining why, even with efforts taken by these states to enact national laws on international crimes, these laws cannot be relied on to advance accountability for the international crimes. Notably, the laws in Kenya and Uganda do not have retroactive application. In Sudan, despite the 2007 reforms, the structure of military justice in Sudan has the effect of placing certain categories of individuals beyond the reach of international criminal justice. For Kenya and Uganda, it is concluded that the only benefit that flows from these enactments is reliance on them to prosecute future international crimes. For Sudan, the 2007 reforms will only have the desired impact if reforms are equally made to the structure of military justice.Keywords: complementarity, national laws, Kenya, Sudan, Uganda, international crimes, limitations
Procedia PDF Downloads 284
1718 Mercury Contamination of Wetland Caused by Wastewater from Chlor-Alkali Industry
Authors: Mitsuo Yoshida
Abstract:
Significant mercury contamination of soil/sediment was unveiled by an environmental monitoring program in a wetland along the La Plata River, west of Montevideo City, Uruguay. The contamination was caused by industrial wastewater discharged from a chlor-alkali plant using a mercury-cell process, and the contamination level is above 60 mg/kg in soil/sediment. Most of the mercury (Hg) in the environment is inorganic, but some fractions are converted by bacteria to methylmercury (MeHg), a toxic organic compound. MeHg accumulates biologically through the food chain and becomes a serious public health risk. In order to delineate the contaminated area for countermeasure operations, the authority defined an intervention value for mercury contamination of sediment/soil of 15 mg/kg (total Hg). According to this intervention value, the mercury-contaminated area at the La Plata site is approximately 48,280 m², and the estimated total volume of contaminated sediments/soils is around 18,750 m³. Countermeasures for the contaminated zone were proposed in two stages: (i) mitigation of risks to public health and (ii) site remediation. The first stage is the installation of fences and nets around the contamination zone to mitigate the risks of exposure, inhalation, and intake; the food chain of the wetland-river ecosystem is also interrupted by this installation. The state of mercury contamination at the La Plata site and the countermeasure plan were disclosed to local people and the public, and consensus on setting an off-limits area was successfully achieved. Mass media also contributed to sharing information on the contaminated site. The cost of the countermeasures was borne by the industry under the polluter-pays principle.
Keywords: chlor-alkali plant, mercury contamination, polluter pay principle, Uruguay, wetland
Procedia PDF Downloads 140
1717 Fabrication of Uniform Nanofibers Using Gas Dynamic Virtual Nozzle Based Microfluidic Liquid Jet System
Authors: R. Vasireddi, J. Kruse, M. Vakili, M. Trebbin
Abstract:
Here we present a gas dynamic virtual nozzle (GDVN) based microfluidic jetting devices for spinning of nano/microfibers. The device is fabricated by soft lithography techniques and is based on the principle of a GDVN for precise three-dimensional gas focusing of the spinning solution. The nozzle device is used to produce micro/nanofibers of a perfluorinated terpolymer (THV), which were collected on an aluminum substrate for scanning electron microscopy (SEM) analysis. The influences of air pressure, polymer concentration, flow rate and nozzle geometry on the fiber properties were investigated. It was revealed that surface properties are controlled by air pressure and polymer concentration while the diameter and shape of the fibers are influenced mostly by the concentration of the polymer solution and pressure. Alterations of the nozzle geometry had a negligible effect on the fiber properties, however, the jetting stability was affected. Round and flat fibers with differing surface properties from craters, grooves to smooth surfaces could be fabricated by controlling the above-mentioned parameters. Furthermore, the formation of surface roughness was attributed to the fast evaporation rate and velocity (mis)match between the polymer solution jet and the surrounding air stream. The diameter of the fibers could be tuned from ~250 nm to ~15 µm. Because of the simplicity of the setup, the precise control of the fiber properties, access to biocompatible nanofiber fabrication and the easy scale-up of parallel channels for high throughput, this method offers significant benefits compared to existing solution-based fiber production methods.Keywords: gas dynamic virtual nozzle (GDVN) principle, microfluidic device, spinning, uniform nanofibers
Procedia PDF Downloads 154
1716 Statistical Correlation between Ply Mechanical Properties of Composite and Its Effect on Structure Reliability
Authors: S. Zhang, L. Zhang, X. Chen
Abstract:
Due to the large uncertainty in the mechanical properties of FRP (fibre reinforced plastic), the reliability evaluation of FRP structures is currently receiving much attention in industry. However, possible statistical correlation between ply mechanical properties has so far been overlooked, and the properties are mostly assumed to be independent random variables. In this study, the statistical correlation between the ply mechanical properties of uni-directional and plain weave composite is first analyzed by a combination of Monte-Carlo simulation and finite element modeling of the FRP unit cell. Large linear correlation coefficients between the in-plane mechanical properties are observed, and the correlation coefficients are heavily dependent on the uncertainty of the fibre volume ratio. It is also observed that the correlation coefficients related to Poisson's ratio are negative while the others are positive. To experimentally obtain the statistical correlation coefficients between the in-plane mechanical properties of FRP, all concerned in-plane mechanical properties of the same specimen need to be known. The in-plane shear modulus of FRP is experimentally derived by the approach suggested in the ASTM standard D5379M. Tensile tests are conducted using the same specimens used for the shear test, and, due to non-uniform tensile deformation, a modification factor is derived by finite element modeling. Digital image correlation is adopted to characterize the non-uniform deformation of the specimen. The preliminary experimental results show good agreement with the numerical analysis of the statistical correlation. Then, the failure probability of laminate plates is calculated for cases considering and not considering the statistical correlation, using the Monte-Carlo and Markov Chain Monte-Carlo methods, respectively. The results highlight the importance of accounting for the statistical correlation between ply mechanical properties to achieve an accurate failure probability of laminate plates. Furthermore, it is found that for a multi-layer laminate plate, the statistical correlation between the ply elastic properties significantly affects the laminate reliability, while the effect of statistical correlation between the ply strengths is minimal.
Keywords: failure probability, FRP, reliability, statistical correlation
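The effect the abstract reports can be illustrated with a toy Monte-Carlo sketch: ply properties are sampled either as independent or as correlated random variables, and a deliberately simplified limit state is evaluated. The property names, statistics, correlation coefficient and limit state below are all invented for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical ply statistics (means, CoVs) and an assumed correlation between
# longitudinal stiffness E1 and tensile strength Xt -- illustrative only.
mean = np.array([135e3, 1500.0])      # E1 [MPa], Xt [MPa]
cov_ = np.array([0.08, 0.10])         # coefficients of variation
rho = 0.7                             # assumed E1-Xt correlation

def failure_probability(rho):
    std = mean * cov_
    cov_matrix = np.array([[std[0]**2, rho * std[0] * std[1]],
                           [rho * std[0] * std[1], std[1]**2]])
    E1, Xt = rng.multivariate_normal(mean, cov_matrix, size=n).T
    # Toy limit state: an applied strain of 1% -> ply stress E1*0.01 must stay below Xt.
    stress = E1 * 0.01
    return np.mean(stress > Xt)

print("Pf, independent properties:", failure_probability(0.0))
print("Pf, correlated properties :", failure_probability(rho))
```

Even this crude example shows how ignoring a positive stiffness-strength correlation can noticeably overestimate the failure probability, which is the kind of discrepancy the study quantifies with full laminate models.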
Procedia PDF Downloads 162
1715 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning
Authors: Jean Berger, Mohamed Barkaoui
Abstract:
Discrete search path planning in a time-constrained uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to providing few-move solutions. A new information-theoretic, open-loop decision model explicitly incorporating false alarm sensor readings, to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback, is presented. The decision model consists in minimizing expected entropy considering anticipated possible observation outcomes over a given time horizon. The model captures the uncertainty associated with observation events for all possible scenarios. Entropy represents a measure of uncertainty about the searched target location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment to progressively integrate real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison to alternate heuristics.
Keywords: search path planning, false alarm, search-and-delivery, entropy, genetic algorithm
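The kind of bookkeeping the abstract describes, updating target-occupancy beliefs under imperfect sensing and scoring the result by entropy, can be sketched with a generic Bayesian update. The detection and false-alarm probabilities, grid size and observation sequence below are assumptions for illustration; this is not the paper's compact belief update formulation.

```python
import numpy as np

def update_belief(belief, cell, detected, p_d=0.8, p_fa=0.1):
    """Bayesian update of target-occupancy beliefs after observing one cell.

    belief  : prior probability that the target is in each cell (sums to 1)
    cell    : index of the observed cell
    detected: sensor output (True/False), subject to misses and false alarms
    p_d     : probability of detection given the target is present
    p_fa    : probability of a false alarm given the target is absent
    """
    likelihood = np.where(np.arange(belief.size) == cell,
                          p_d if detected else 1.0 - p_d,
                          p_fa if detected else 1.0 - p_fa)
    posterior = likelihood * belief
    return posterior / posterior.sum()

def entropy(belief):
    p = belief[belief > 0]
    return -np.sum(p * np.log2(p))   # uncertainty about the target location (bits)

belief = np.full(25, 1 / 25)                     # uniform prior over a 5x5 grid
for cell, det in [(7, False), (12, True), (12, True)]:
    belief = update_belief(belief, cell, det)
    print(f"observed cell {cell:2d}, detected={det}: entropy = {entropy(belief):.3f} bits")
```

A genetic algorithm of the type proposed in the paper would then search over visit sequences, using the anticipated entropy after such updates as the fitness to minimize.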
Procedia PDF Downloads 360
1714 Indeterminacy: An Urban Design Tool to Measure Resilience to Climate Change, a Caribbean Case Study
Authors: Tapan Kumar Dhar
Abstract:
How well are our city forms designed to adapt to climate change and its resulting uncertainty? What urban design tools can be used to measure and improve resilience to climate change, and how would they do so? In addressing these questions, this paper considers indeterminacy, a concept originated in the resilience literature, to measure the resilience of built environments. In the realm of urban design, ‘indeterminacy’ can be referred to as built-in design capabilities of an urban system to serve different purposes which are not necessarily predetermined. An urban system, particularly that with a higher degree of indeterminacy, can enable the system to be reorganized and changed to accommodate new or unknown functions while coping with uncertainty over time. Underlying principles of this concept have long been discussed in the urban design and planning literature, including open architecture, landscape urbanism, and flexible housing. This paper argues that the concept indeterminacy holds the potential to reduce the impacts of climate change incrementally and proactively. With regard to sustainable development, both planning and climate change literature highly recommend proactive adaptation as it involves less cost, efforts, and energy than last-minute emergency or reactive actions. Nevertheless, the concept still remains isolated from resilience and climate change adaptation discourses even though the discourses advocate the incremental transformation of a system to cope with climatic uncertainty. This paper considers indeterminacy, as an urban design tool, to measure and increase resilience (and adaptive capacity) of Long Bay’s coastal settlements in Negril, Jamaica. Negril is one of the popular tourism destinations in the Caribbean highly vulnerable to sea-level rise and its associated impacts. This paper employs empirical information obtained from direct observation and informal interviews with local people. While testing the tool, this paper deploys an urban morphology study, which includes land use patterns and the physical characteristics of urban form, including street networks, block patterns, and building footprints. The results reveal that most resorts in Long Bay are designed for pre-determined purposes and offer a little potential to use differently if needed. Additionally, Negril’s street networks are found to be rigid and have limited accessibility to different points of interest. This rigidity can expose the entire infrastructure further to extreme climatic events and also impedes recovery actions after a disaster. However, Long Bay still has room for future resilient developments in other relatively less vulnerable areas. In adapting to climate change, indeterminacy can be reached through design that achieves a balance between the degree of vulnerability and the degree of indeterminacy: the more vulnerable a place is, the more indeterminacy is useful. This paper concludes with a set of urban design typologies to increase the resilience of coastal settlements.Keywords: climate change adaptation, resilience, sea-level rise, urban form
Procedia PDF Downloads 367
1713 Reverse Logistics End of Life Products Acquisition and Sorting
Authors: Badli Shah Mohd Yusoff, Khairur Rijal Jamaludin, Rozetta Dollah
Abstract:
The emergence of reverse logistics and product recovery management is an important concept in reconciling economic and environmental objectives through recapturing the value of end-of-life product returns. End-of-life products contain valuable modules, parts, residues and materials that can create value if recovered efficiently. The main objective of this study is to explore and develop a model that recovers as much of the economic value as reasonably possible by finding the optimal return acquisition and sorting decisions to meet demand and maximize profits over time. In this study, the benefit that can be obtained by the remanufacturer is the ability to forecast future demand for used products under uncertainty in the quantity and quality of returns. Formulated on the basis of a generic disassembly tree, the proposed model focuses on three reverse logistics activities, namely refurbishing, remanufacturing and disposal, incorporating all plausible quality levels of the returns. A stricter sorting policy decreases the quantity of products to be refurbished or remanufactured and increases the share of discarded products. Numerical experiments were carried out to investigate the characteristics and behaviour of the proposed model, using a mathematical programming model implemented in Lingo 16.0 for medium-term planning of return acquisition, disassembly (refurbish or remanufacture) and disposal activities. Moreover, the model supports the analysis of a number of trade-off decisions intended to maximize revenue from the collection of used products through the refurbish and remanufacture recovery options. The results showed that full utilization of the sorting process leads the system to acquire a smaller quantity of returns at minimal overall cost. Further, a sensitivity analysis provides a range of possible scenarios to consider in optimizing the overall cost of refurbished and remanufactured products.
Keywords: core acquisition, end of life, reverse logistics, quality uncertainty
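The acquisition-and-sorting trade-off described above is, at its core, a mathematical programming problem. The sketch below is a deliberately small linear program in PuLP with invented quality classes, costs, profits, supplies and demand caps; it only illustrates the structure of such a model and is not the authors' Lingo 16.0 formulation.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpStatus

# Hypothetical data: three return quality classes and three recovery options.
qualities = ["high", "medium", "low"]
options = ["refurbish", "remanufacture", "dispose"]
acquisition_cost = {"high": 30, "medium": 18, "low": 8}      # cost per acquired core
profit = {  # net revenue per unit routed to each option (before acquisition cost)
    ("high", "refurbish"): 60, ("high", "remanufacture"): 45, ("high", "dispose"): -5,
    ("medium", "refurbish"): 35, ("medium", "remanufacture"): 40, ("medium", "dispose"): -5,
    ("low", "refurbish"): 5, ("low", "remanufacture"): 20, ("low", "dispose"): -5,
}
supply = {"high": 100, "medium": 250, "low": 400}             # available returns per class
demand_refurb, demand_reman = 150, 300                        # market demand caps

m = LpProblem("acquisition_and_sorting", LpMaximize)
x = LpVariable.dicts("route", [(q, o) for q in qualities for o in options], lowBound=0)

# Objective: routing profit minus acquisition cost of everything taken in.
m += lpSum(profit[q, o] * x[q, o] for q in qualities for o in options) \
     - lpSum(acquisition_cost[q] * lpSum(x[q, o] for o in options) for q in qualities)

for q in qualities:                                           # cannot route more than is available
    m += lpSum(x[q, o] for o in options) <= supply[q]
m += lpSum(x[q, "refurbish"] for q in qualities) <= demand_refurb
m += lpSum(x[q, "remanufacture"] for q in qualities) <= demand_reman

m.solve()
print(LpStatus[m.status])
for (q, o), var in x.items():
    if var.varValue and var.varValue > 0:
        print(f"{q:6s} -> {o:13s}: {var.varValue:.0f} units")
```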
Procedia PDF Downloads 305
1712 Externalised Migration Controls and the Deportation of Minors and Potential Refugees from Mexico
Authors: Vickie Knox
Abstract:
Since the ‘urgent humanitarian crisis’ of the arrival of tens of thousands of Central American minors at the Mexico-US border in early 2014, the USA has increasingly externalised migration controls to Mexico. Although the resulting policy ‘Plan Frontera Sur’ claimed to protect migrants’ human rights, it has manifested as harshly delivered in-country controls and an alarming increase in deportations, particularly of minors. This is of particular concern given the ongoing situation of forced migration caused by criminal violence in Central America because these deportations do not all comply with Mexico’s international obligations and with its own legal framework for international protection that allows inter alia verbal asylum claims and grants minors additional protection against deportation. Notably, the volume of deportations, the speed with which they are carried out and the lack of adequate screening indicate non-compliance with the principle of non-refoulement and the right to claim asylum or other forms of protection. Based on qualitative data gathered in fieldwork in 2015 and quantitative data covering the period 2014-2016, this research details three types of adverse outcome resulting from these externalised controls: human rights violations perpetrated in order to deliver the policy–namely, deportations that may not comply with the principle of non-refoulement or the protection of minors; human rights violations perpetrated in the execution of policy–such as violations by state actors during apprehension and detention; and adverse consequences of the policy – such as increased risk during transit. This research has particular resonance as the Trump era brings tighter enforcement in the region, and has broader relevance for the study of externalisation tools on a global level.Keywords: deportation, externalisation, forced migration, non-refoulement
Procedia PDF Downloads 151
1711 Enhancing Project Management Performance in Prefabricated Building Construction under Uncertainty: A Comprehensive Approach
Authors: Niyongabo Elyse
Abstract:
Prefabricated building construction is a pioneering approach that combines design, production, and assembly to attain energy efficiency, environmental sustainability, and economic feasibility. Despite continuous development in the industry in China, the low technical maturity of standardized design, factory production, and construction assembly introduces uncertainties affecting prefabricated component production and on-site assembly processes. This research focuses on enhancing project management performance under uncertainty to help enterprises navigate these challenges and optimize project resources. The study introduces a perspective on how uncertain factors influence the implementation of prefabricated building construction projects. It proposes a theoretical model considering project process management ability, adaptability to uncertain environments, and collaboration ability of project participants. The impact of uncertain factors is demonstrated through case studies and quantitative analysis, revealing constraints on implementation time, cost, quality, and safety. To address uncertainties in prefabricated component production scheduling, a fuzzy model is presented, expressing processing times in interval values. The model utilizes a cooperative co-evolution evolution algorithm (CCEA) to optimize scheduling, demonstrated through a real case study showcasing reduced project duration and minimized effects of processing time disturbances. Additionally, the research addresses on-site assembly construction scheduling, considering the relationship between task processing times and assigned resources. A multi-objective model with fuzzy activity durations is proposed, employing a hybrid cooperative co-evolution evolution algorithm (HCCEA) to optimize project scheduling. Results from real case studies indicate improved project performance in terms of duration, cost, and resilience to processing time delays and resource changes. The study also introduces a multistage dynamic process control model, utilizing IoT technology for real-time monitoring during component production and construction assembly. This approach dynamically adjusts schedules when constraints arise, leading to enhanced project management performance, as demonstrated in a real prefabricated housing project. Key contributions include a fuzzy prefabricated components production scheduling model, a multi-objective multi-mode resource-constrained construction project scheduling model with fuzzy activity durations, a multi-stage dynamic process control model, and a cooperative co-evolution evolution algorithm. The integrated mathematical model addresses the complexity of prefabricated building construction project management, providing a theoretical foundation for practical decision-making in the field.Keywords: prefabricated construction, project management performance, uncertainty, fuzzy scheduling
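The fuzzy scheduling idea mentioned above, expressing processing times as intervals or fuzzy numbers rather than crisp values, can be sketched with triangular fuzzy durations and simple fuzzy arithmetic. The task names and values below are invented, and a plain serial precedence chain is assumed; the paper's CCEA/HCCEA models search over orderings and resource assignments on top of this kind of representation.

```python
import numpy as np

# A triangular fuzzy processing time is (optimistic, most-likely, pessimistic) in days.
# Values are invented for illustration; the paper's intervals come from project data.
tasks = {
    "produce_walls":  (3.0, 4.0, 6.0),
    "produce_slabs":  (2.0, 3.0, 5.0),
    "transport":      (1.0, 1.0, 2.0),
    "assemble_frame": (4.0, 5.0, 8.0),
    "finish_joints":  (2.0, 2.5, 4.0),
}

def fuzzy_add(a, b):
    """Sum of two triangular fuzzy numbers (component-wise for triangular shapes)."""
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):
    """Graded-mean crisp value of a triangular fuzzy number."""
    lo, mode, hi = tfn
    return (lo + 4.0 * mode + hi) / 6.0

# Serial precedence: the fuzzy makespan is the fuzzy sum of the task durations.
makespan = (0.0, 0.0, 0.0)
for name, tfn in tasks.items():
    makespan = fuzzy_add(makespan, tfn)

print("fuzzy makespan (lo, mode, hi):", makespan)
print("defuzzified makespan [days]  :", round(defuzzify(makespan), 2))
```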
Procedia PDF Downloads 51
1710 Investigation of Shear Strength, and Dilative Behavior of Coarse-grained Samples Using Laboratory Test and Machine Learning Technique
Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari
Abstract:
Coarse-grained soils are well known and commonly used in a wide range of geotechnical projects, including high earth dams and embankments, for their high shear strength. The most important engineering property of these soils is the friction angle, which represents the interlocking between soil particles and is widely applied in designing and constructing such earth structures. The friction angle and dilative behavior of coarse-grained soils can be estimated from empirical correlations with in-situ testing and the physical properties of the soil, or measured directly in the laboratory by performing direct shear or triaxial tests. Unfortunately, large-scale testing is difficult, challenging, and expensive and is not possible in most soil mechanics laboratories. It is therefore common to remove the large particles and perform the tests, which cannot be considered an exact estimation of the parameters and behavior of the original soil. This paper describes a new methodology for scaling the particle grading distribution of a well-graded gravel sample down to a smaller-scale sample that can be tested in an ordinary direct shear apparatus, in order to estimate the stress-strain behavior, friction angle, and dilative behavior of the original coarse-grained soil, considering its confining pressure and relative density, using a machine learning method. A total of 72 direct shear tests were performed with 6 different sizes, 3 different confining pressures, and 4 different relative densities. The Multivariate Adaptive Regression Spline (MARS) technique was used to develop an equation to predict shear strength and dilative behavior based on the size distribution of coarse-grained soil particles. An uncertainty analysis was also performed in order to examine the reliability of the proposed equation.
Keywords: MARS, coarse-grained soil, shear strength, uncertainty analysis
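MARS builds piecewise-linear regression models from hinge basis functions max(0, x − t). The sketch below fits a MARS-style model with hand-picked knots to synthetic data standing in for the 72 tests; the predictor names, knot locations and coefficients are assumptions for illustration, not the fitted equation from the paper (a real MARS run selects knots and interaction terms adaptively through forward and backward passes).

```python
import numpy as np

def hinge(x, knot):
    """MARS basis function: max(0, x - knot)."""
    return np.maximum(0.0, x - knot)

# Synthetic training data standing in for the direct shear tests:
# predictors are confining pressure [kPa], relative density [%], max particle size [mm].
rng = np.random.default_rng(7)
sigma_c = rng.uniform(50, 400, 72)
d_r = rng.uniform(30, 90, 72)
d_max = rng.uniform(2, 25, 72)
phi = 32 + 0.08 * d_r - 0.01 * sigma_c + 0.2 * d_max + rng.normal(0, 1.0, 72)  # friction angle [deg]

# MARS-style design matrix with a few fixed hinge knots.
X = np.column_stack([
    np.ones_like(phi),
    hinge(sigma_c, 150), hinge(150, sigma_c),
    hinge(d_r, 60), hinge(60, d_r),
    hinge(d_max, 10),
])
coef, *_ = np.linalg.lstsq(X, phi, rcond=None)
pred = X @ coef
print("fitted coefficients:", np.round(coef, 3))
print("RMSE [deg]:", round(float(np.sqrt(np.mean((phi - pred) ** 2))), 3))
```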
Procedia PDF Downloads 162
1709 Uncertainty Quantification of Crack Widths and Crack Spacing in Reinforced Concrete
Authors: Marcel Meinhardt, Manfred Keuser, Thomas Braml
Abstract:
Cracking of reinforced concrete is a complex phenomenon induced by direct loads or restraints affecting reinforced concrete structures as soon as the tensile strength of the concrete is exceeded. Hence it is important to predict where cracks will be located and how they will propagate. The bond theory and the crack formulas in the current design codes, for example DIN EN 1992-1-1, are all based on the assumption that the reinforcement bars are embedded in homogeneous concrete, without taking into account the influence of transverse reinforcement and the real stress situation. However, it can often be observed that real structures such as walls, slabs or beams show a crack spacing that is oriented to the transverse reinforcement bars or to the stirrups. In most finite element analysis studies, the smeared crack approach is used for crack prediction. The disadvantage of this model is that the typical strain localization of a crack at element level cannot be seen. Crack propagation in concrete is a discontinuous process characterized by different factors, such as the initial random distribution of defects or the scatter of material properties. Such behavior presupposes the elaboration of adequate models and methods of simulation, because traditional mechanical approaches deal mainly with average material parameters. This paper is concerned with modelling the initiation and propagation of cracks in reinforced concrete structures, considering the influence of transverse reinforcement and the real stress distribution in reinforced concrete (R/C) beams/plates in bending. Therefore, a parameter study was carried out to investigate (i) the influence of the transverse reinforcement on the stress distribution in concrete in bending and (ii) crack initiation in dependence on the diameter and spacing of the transverse reinforcement bars. The numerical investigations of crack initiation and propagation were carried out with a 2D reinforced concrete structure subjected to quasi-static loading and given boundary conditions. To model the uncertainty in the tensile strength of concrete in the finite element analysis, correlated normally and lognormally distributed random fields with different correlation lengths were generated. The paper also presents and discusses different methods to generate random fields, e.g. the Covariance Matrix Decomposition Method. For all computations, a plastic constitutive law with softening was used to model crack initiation and the damage of the concrete in tension. It was found that the distributions of crack spacing and crack widths are highly dependent on the random field used. These distributions are validated against experimental studies on R/C panels which were carried out at the Laboratory for Structural Engineering at the University of the German Armed Forces in Munich. A recommendation for the parameters of the random field for realistic modelling of the uncertainty of the tensile strength is also given. The aim of this research was to show a method in which the localization of strains and cracks, as well as the influence of transverse reinforcement on crack initiation and propagation, can be seen in finite element analysis.
Keywords: crack initiation, crack modelling, crack propagation, cracks, numerical simulation, random fields, reinforced concrete, stochastic
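The Covariance Matrix Decomposition Method mentioned above generates a correlated random field by assembling the covariance matrix of the field, factorizing it, and coloring a vector of white noise. The one-dimensional sketch below does this for a lognormal tensile-strength field with an exponential correlation function; the mean, coefficient of variation and correlation length are assumed values for illustration, not the recommendation given in the paper.

```python
import numpy as np

# 1D random field of concrete tensile strength along a member (Covariance Matrix
# Decomposition): build the covariance matrix, factorize it, and color white noise.
n, length = 200, 4.0                       # grid points, member length [m]
x = np.linspace(0.0, length, n)
corr_len = 0.5                             # correlation length [m] (assumed)
mean_fct, cov_fct = 2.9, 0.15              # mean tensile strength [MPa], CoV (assumed)

# Exponential correlation function rho(dx) = exp(-|dx| / corr_len)
dx = np.abs(x[:, None] - x[None, :])
rho = np.exp(-dx / corr_len)

# Parameters of the underlying Gaussian field for a lognormal marginal distribution
sigma_ln = np.sqrt(np.log(1.0 + cov_fct**2))
mu_ln = np.log(mean_fct) - 0.5 * sigma_ln**2
cov_matrix = sigma_ln**2 * rho

L = np.linalg.cholesky(cov_matrix + 1e-10 * np.eye(n))   # decomposition (with jitter)
rng = np.random.default_rng(42)
gaussian_field = mu_ln + L @ rng.standard_normal(n)
fct_field = np.exp(gaussian_field)                        # lognormal tensile strength [MPa]

print("min/mean/max f_ct [MPa]:",
      round(fct_field.min(), 2), round(fct_field.mean(), 2), round(fct_field.max(), 2))
```

Mapping such a field onto the finite elements makes the tensile strength spatially variable, which is what lets cracks localize at the weaker positions in the simulations described above.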
Procedia PDF Downloads 158
1708 Study of Structural Behavior and Proton Conductivity of Inorganic Gel Paste Electrolyte at Various Phosphorous to Silicon Ratio by Multiscale Modelling
Authors: P. Haldar, P. Ghosh, S. Ghoshdastidar, K. Kargupta
Abstract:
In polymer electrolyte membrane fuel cells (PEMFC), the membrane electrode assembly (MEA) consists of two platinum-coated carbon electrodes sandwiching a proton-conducting, phosphoric acid doped polymeric membrane. Due to low mechanical stability, flooding and fuel crossover, the application of phosphoric acid in a polymeric membrane is very critical. Phosphorus and silica based 3D inorganic gels have gained attention in the field of supercapacitors, fuel cells and metal hydride batteries due to their thermally stable, highly proton conductive behavior. In addition, because a large amount of water molecules and phosphoric acid can easily be trapped in the cavities of the Si-O-Si network, leaching out is prevented. In this study, we have performed molecular dynamics (MD) simulations and first principle calculations to understand the structural, electronic, electrochemical and morphological behavior of this inorganic gel at various P to Si ratios. We have used dipole-dipole interactions, H bonding, and van der Waals forces to study the main interactions between the molecules. A 'structure-property-performance' mapping is initiated to determine the optimum P to Si ratio for the best proton conductivity. We have performed the MD simulations at various temperatures to understand the temperature dependence of proton conductivity. The observed results lead to a model which fits well with experimental data and other literature values. We have also studied the mechanism behind the proton conductivity and finally propose a structure for the gel paste with the optimum P to Si ratio.
Keywords: first principle calculation, molecular dynamics simulation, phosphorous and silica based 3D inorganic gel, polymer electrolyte membrane fuel cells, proton conductivity
Procedia PDF Downloads 129
1707 Design and Optimization of an Electromagnetic Vibration Energy Converter
Authors: Slim Naifar, Sonia Bradai, Christian Viehweger, Olfa Kanoun
Abstract:
Vibration provides an interesting source of energy since it is available in many indoor and outdoor applications. Nevertheless, in order to have an efficient design of the harvesting system, vibration converters have to satisfy some criteria in terms of robustness, compactness and energy outcome. In this work, an electromagnetic converter based on the mechanical spring principle is proposed. The designed harvester is formed by a coil oscillating around ten ring magnets using a mechanical spring. The proposed design overcomes one of the main limitations of the moving coil by avoiding contact between the coil wires and the mechanical spring, which leads to better robustness of the converter. In addition, the whole system can be implemented in the cavity of a screw. Different parameters of the harvester were investigated by the finite element method, including the magnet size, the coil winding number and diameter, and the excitation frequency and amplitude. A prototype was realized and tested. Experiments were performed for 0.5 g to 1 g acceleration. The experimental setup consists of an electrodynamic shaker as an external artificial vibration source controlled by a laser sensor that measures the applied displacement and excitation frequency. Together with the laser sensor, a controller unit, and an amplifier, the shaker is operated in a closed loop, which allows controlling the vibration amplitude. The resonance frequency of the proposed designs is in the range of 24 Hz. Results indicate that the harvester can generate 612 mV and 1150 mV maximum open circuit peak-to-peak voltage at resonance for 0.5 g and 1 g acceleration, respectively, which correspond to 4.75 mW and 1.34 mW output power. Tuning the frequency to other values is also possible by adding mass to the moving part of the converter or by changing the mechanical spring stiffness.
Keywords: energy harvesting, electromagnetic principle, vibration converter, moving coil
Procedia PDF Downloads 298
1706 Business Strategy, Crisis and Digitalization
Authors: Flora Xu, Marta Fernandez Olmos
Abstract:
This article is mainly about a critical assessment and comprehensive understanding of business strategy in the post-COVID-19 scenario. This study aims to elucidate how companies are responding to the unique challenges posed by the pandemic and how these measures are shaping the future of the business environment. The pandemic has exposed the fragility and flexibility of the global supply chain, and procurement and production strategies should be reconsidered. Companies should increase the diversity of suppliers and the flexibility of the supply chain, and some companies are considering shifting their operations to the local market. This can increase local employment and reduce international transportation disruptions and customs issues. By shortening the distance between production and market, companies can respond more quickly to changes in demand and unforeseen events. The demand for remote work and online solutions will increase the adoption of digital technology and accelerate the digital transformation of many organizations. Marketing and communication strategies need to adapt to a constantly changing environment. The business resilience strategy was emphasized as a key component of the response to COVID-19. Companies are seeking to strengthen their risk management capabilities and develop business continuity plans to cope with future unexpected disruptions. The pandemic has reconfigured human resource practices and changed the way companies manage their employees. Remote work has become the norm, and companies focus on managing workers' health and well-being, as well as flexible work policies, to ensure operations and support for employees during crises. This change in human resources practice has a lasting impact on how companies apply talent and labor management in the post-COVID-19 world. The pandemic has prompted a significant review of business strategies as companies adapt to constantly changing environments and seek to ensure their sustainability and profitability in times of crisis. This strategic reassessment has led to product diversification, the exploration of international markets and adaptation to the changing market. Companies have responded to the unprecedented challenges brought by COVID-19, which has promoted innovation efforts in key areas and focused attention on corporate social responsibility and sustainability in today's business strategy. Formulating and implementing business strategies in uncertain times poses important challenges, including making quick and agile decisions in turbulent environments, risk management, and adaptability to constantly changing market conditions. COVID-19 highlights the importance of strategic planning and informed decision-making in a business environment characterized by uncertainty and complexity. In short, the pandemic has reconfigured the way companies handle business strategies and emphasized the necessity of preparing for future challenges in a business world marked by uncertainty and complexity.
Keywords: business strategy, crisis, digitalization, uncertainty
Procedia PDF Downloads 20
1705 Political Economy and Human Rights Engaging in Conversation
Authors: Manuel Branco
Abstract:
This paper argues that mainstream economics is one of the reasons that can explain the difficulty in fully realizing human rights because its logic is intrinsically contradictory to human rights, most especially economic, social and cultural rights. First, its utilitarianism, both in its cardinal and ordinal understanding, contradicts human rights principles. Maximizing aggregate utility along the lines of cardinal utility is a theoretical exercise that consists in ensuring as much as possible that gains outweigh losses in society. In this process an individual may get worse off, though. If mainstream logic is comfortable with this, human rights' logic does not. Indeed, universality is a key principle in human rights and for this reason the maximization exercise should aim at satisfying all citizens’ requests when goods and services necessary to secure human rights are at stake. The ordinal version of utilitarianism, in turn, contradicts the human rights principle of indivisibility. Contrary to ordinal utility theory that ranks baskets of goods, human rights do not accept ranking when these goods and services are necessary to secure human rights. Second, by relying preferably on market logic to allocate goods and services, mainstream economics contradicts human rights because the intermediation of money prices and the purpose of profit may cause exclusion, thus compromising the principle of universality. Finally, mainstream economics sees human rights mainly as constraints to the development of its logic. According to this view securing human rights would, then, be considered a cost weighing on economic efficiency and, therefore, something to be minimized. Fully realizing human rights needs, therefore, a different approach. This paper discusses a human rights-based political economy. This political economy, among other characteristics should give up mainstream economics narrow utilitarian approach, give up its belief that market logic should guide all exchanges of goods and services between human beings, and finally give up its view of human rights as constraints on rational choice and consequently on good economic performance. Giving up mainstream’s narrow utilitarian approach means, first embracing procedural utility and human rights-aimed consequentialism. Second, a more radical break can be imagined; non-utilitarian, or even anti-utilitarian, approaches may emerge, then, as alternatives, these two standpoints being not necessarily mutually exclusive, though. Giving up market exclusivity means embracing decommodification. More specifically, this means an approach that takes into consideration the value produced outside the market and an allocation process no longer necessarily centered on money prices. Giving up the view of human rights as constraints means, finally, to consider human rights as an expression of wellbeing and a manifestation of choice. This means, in turn, an approach that uses indicators of economic performance other than growth at the macro level and profit at the micro level, because what we measure affects what we do.Keywords: economic and social rights, political economy, economic theory, markets
Procedia PDF Downloads 153
1704 Hydrological Analysis for Urban Water Management
Authors: Ranjit Kumar Sahu, Ramakar Jha
Abstract:
Urban water management is the practice of managing freshwater, wastewater, and storm water as components of a basin-wide management plan. It builds on existing water supply and sanitation considerations within an urban settlement by incorporating urban water management within the scope of the entire river basin. The pervasive problems generated by urban development have prompted the present work to study the spatial extent of urbanization in the Golden Triangle of Odisha connecting the cities of Bhubaneswar (20.2700° N, 85.8400° E), Puri (19.8106° N, 85.8314° E) and Konark (19.9000° N, 86.1200° E), and the patterns of periodic changes in urban development (systematic/random), in order to develop future plans for (i) urbanization promotion areas, and (ii) urbanization control areas. Using remote sensing with USGS (U.S. Geological Survey) Landsat 8 data, supervised classification of the urban sprawl has been carried out for 1980-2014, and specifically after 2000. This work presents the following: (i) time series analysis of hydrological data (groundwater and rainfall), (ii) application of SWMM (Storm Water Management Model) and other soft computing techniques for urban water management, and (iii) uncertainty analysis of model parameters (urban sprawl and correlation analysis). The outcome of the study shows drastic growth in urbanization and depletion of groundwater levels in the area, which is discussed briefly. Other related outcomes, such as the declining trend of rainfall and the rise of sand mining in the local vicinity, are also discussed. Research of this kind will (i) improve water supply and consumption efficiency, (ii) upgrade drinking water quality and wastewater treatment, (iii) increase the economic efficiency of services to sustain operations and investments for water, wastewater, and storm water management, and (iv) engage communities to reflect their needs and knowledge for water management.
Keywords: Storm Water Management Model (SWMM), uncertainty analysis, urban sprawl, land use change
Procedia PDF Downloads 427
1703 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines
Authors: Alexander Guzman Urbina, Atsushi Aoyama
Abstract:
The sustainability of the traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. In making decisions related to the safety of industrial infrastructure, the values of accidental risk are becoming relevant points for discussion. However, the challenge is the reliability of the models employed to obtain the risk data; such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using Artificial Intelligence (AI), and more specifically hybrid systems such as Neuro-Fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional technologies related to energy and chemical infrastructure constitutes one of the major challenges that today's societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, we argue that the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition to these social consequences, and considering the industrial sector as critical infrastructure due to its large impact on the economy in case of a failure, the relevance of industrial safety has become a critical issue for today's society. Regarding the safety concern, pipeline operators and regulators have been performing risk assessments in attempts to accurately evaluate the probabilities of failure of the infrastructure and the consequences associated with those failures. However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, which could be capable of dealing efficiently with the complexity and uncertainty. The advantage of deep learning using near-miss accident data is that it could be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of using a near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating the human expertise in scoring risks and setting tolerance levels. In summary, the method of deep learning for neuro-fuzzy risk assessment involves a regression analysis called the group method of data handling (GMDH), which consists in determining the optimal configuration of the risk assessment model and its parameters employing polynomial theory.
Keywords: deep learning, risk assessment, neuro fuzzy, pipelines
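GMDH builds its models from partial quadratic polynomials of input pairs, keeping the combinations that perform best on a separate validation set (an "external criterion"). The sketch below fits the standard two-input GMDH base polynomial by least squares on synthetic near-miss features; the feature names, data and single-layer selection are assumptions for illustration, not the paper's trained model.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)

# Hypothetical near-miss features (e.g., corrosion rate, pressure deviation, inspection gap)
# and a risk score; real inputs would come from incident records.
X = rng.normal(size=(300, 4))
y = 0.6 * X[:, 0] ** 2 + 0.8 * X[:, 1] * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.2, 300)

train, valid = slice(0, 200), slice(200, 300)

def base_design(xi, xj):
    """GMDH base polynomial: a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2."""
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi ** 2, xj ** 2])

best = None
for i, j in combinations(range(X.shape[1]), 2):      # try every pair of inputs
    A_train = base_design(X[train, i], X[train, j])
    coef, *_ = np.linalg.lstsq(A_train, y[train], rcond=None)
    A_valid = base_design(X[valid, i], X[valid, j])
    err = np.mean((A_valid @ coef - y[valid]) ** 2)   # external (validation) criterion
    if best is None or err < best[0]:
        best = (err, (i, j), coef)

err, pair, coef = best
print(f"selected input pair {pair}, validation MSE = {err:.3f}")
print("polynomial coefficients:", np.round(coef, 3))
```

A full GMDH network would stack the selected partial polynomials layer by layer until the external criterion stops improving, which is the "optimal configuration" search the abstract refers to.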
Procedia PDF Downloads 292
1702 Influence of Ammonia Emissions on Aerosol Formation in Northern and Central Europe
Authors: A. Aulinger, A. M. Backes, J. Bieser, V. Matthias, M. Quante
Abstract:
High concentrations of particles pose a threat to human health. Thus, the legal maximum concentrations of PM10 and PM2.5 in ambient air have been steadily decreased over the years. In central Europe, the inorganic species ammonium sulphate and ammonium nitrate make up a large fraction of fine particles. Many studies investigate the influence of emission reductions of sulphur and nitrogen oxides on aerosol concentrations. Here, we focus on the influence of ammonia (NH3) emissions. While emissions of sulphur and nitrogen oxides are quite well known, ammonia emissions are subject to high uncertainty. This is due to the uncertainty in the location, amount, and timing of fertilizer application in agriculture, and in the storage and treatment of manure from animal husbandry. For this study, we implemented a crop growth model into the SMOKE emission model. Depending on temperature, local legislation, and crop type, individual temporal profiles for fertilizer and manure application are calculated for each model grid cell. Additionally, the diffusion from soils and plants and the direct release from open and closed barns are determined. The emission data were used as input for the Community Multiscale Air Quality (CMAQ) model. Comparisons with observations from the EMEP measurement network indicate that the new ammonia emission module leads to a better agreement of model and observation (for both ammonia and ammonium). Finally, the ammonia emission model was used to create emission scenarios. These include emissions based on future European legislation, as well as a dynamic evaluation of the influence of different agricultural sectors on particle formation. It was found that a reduction of ammonia emissions by 50% led to a 24% reduction of total PM2.5 concentrations during winter time in the model domain. The observed reduction was mainly driven by reduced formation of ammonium nitrate. Moreover, emission reductions during winter had a larger impact than during the rest of the year.
Keywords: ammonia, ammonia abatement strategies, ctm, seasonal impact, secondary aerosol formation
Procedia PDF Downloads 351
1701 Use of Corporate Social Responsibility in Environmental Protection: Modern Mechanisms of Environmental Self-Regulation
Authors: Jakub Stelina, Janina Ciechanowicz-McLean
Abstract:
Fifty years of existence and development of international environmental law have brought a deep disappointment with the efficiency and effectiveness of traditional command-and-control mechanisms of environmental regulation. Agenda 21, agreed during the first Earth Summit in Rio de Janeiro in 1992, was one of the first international documents to explicitly underline the importance of public participation in environmental protection. This participation also includes the initiatives undertaken by business corporations in the form of private environmental standard setting. Twenty years later, during the Rio+20 Earth Summit, the obligations undertaken by the private sector during the negotiations proved to be at least as important as those undertaken by governments. The private sector has taken the leading role in environmental standard setting. Among the research methods used in the article, two are crucial to the analysis: comparative legal analysis is the instrument used to analyse the practice of states and private business companies in the field of sustainable development, while economic analysis of law is used to estimate the costs and benefits of Corporate Social Responsibility projects in the field of environmental protection. The study is based on four premises. The first is the role of social dialogue, which is crucial for both Corporate Social Responsibility and modern environmental protection regulation. The Aarhus Convention creates a procedural environmental human right to participate in administrative procedures of law setting and environmental decision making, and public participation in environmental impact assessment is nowadays a universal standard. The second premise concerns the role of precaution as a principle of modern environmental regulation. This principle can be observed both in governmental regulatory undertakings and in private initiatives within Corporate Social Responsibility environmental projects. Even in jurisdictions which are relatively reluctant to use the principle of preventive action in environmental regulation, companies often apply this standard in their own private business standard-setting initiatives, often because soft law standards are used as the basis for private Corporate Social Responsibility regulatory initiatives. The third premise concerns the role of ecological education in environmental protection. Many soft law instruments underline the importance of environmental education. Governments use environmental education only to a limited extent, due to the costs of such projects and problems with assessing their effects, whereas Corporate Social Responsibility uses various means of ecological education as the basis of its actions in the field of environmental protection. Last but not least, sustainable development is a goal of both the legal protection of the environment and the economic instruments of company development. Modern environmental protection law makes increasing use of Corporate Social Responsibility. This may be a consequence of the limits of hard law regulation. Corporate Social Responsibility is nowadays not only adapting to the soft law regulation of environmental protection but also creating such standards itself, showing a new direction for the development of international environmental law. Corporate Social Responsibility in environmental protection can be a good investment in the future development of the company.
Keywords: corporate social responsibility, environmental CSR, environmental justice, stakeholders dialogue
Procedia PDF Downloads 301
1700 Analysis of Unconditional Conservatism and Earnings Quality before and after the IFRS Adoption
Authors: Monica Santi, Evita Puspitasari
Abstract:
The International Financial Reporting Standards (IFRS) were developed as principles-based accounting standards. On this basis, the IASB eliminated the conservatism concept from the accounting framework. The conservatism concept represents a prudent reaction to uncertainty, intended to ensure that the uncertainties and risks inherent in business situations are adequately considered. The concept has two ingredients: conditional conservatism, or ex-post (news-dependent) prudence, and unconditional conservatism, or ex-ante (news-independent) prudence. IFRS in substance disregards unconditional conservatism because it can lead to understated assets or overstated liabilities, so that the financial statements lose relevance, since the information no longer represents the underlying facts. The IASB therefore eliminated the conservatism concept; however, this has not reduced the practice of unconditional conservatism in financial reporting. We therefore expected earnings quality to be affected by this situation, even though IFRS implementation was expected to increase earnings quality. The objective of this study was to provide empirical findings on unconditional conservatism and earnings quality before and after IFRS adoption. The earnings-per-accrual measure was used as the proxy for unconditional conservatism: if earnings per accrual were negative (positive), the company was classified as conservative (not conservative). Earnings quality was defined as the ability of current earnings to reflect future earnings, considering earnings persistence and stability. We used the earnings response coefficient (ERC) as the proxy for earnings quality; the ERC measures the extent of a security's abnormal market return in response to the unexpected component of the reported earnings of the firm issuing that security, and a higher ERC indicates higher earnings quality. The manufacturing companies listed on the Indonesia Stock Exchange (IDX) served as the sample, with 2009-2010 representing the period before IFRS adoption and 2011-2013 the period after. Data were analyzed using the Mann-Whitney test and regression analysis, with firm size as a control variable on the consideration that firm size affects a company's earnings quality. The study found that unconditional conservatism did not change between the periods before and after IFRS adoption. The findings for earnings quality were different: earnings quality decreased after IFRS adoption, implying that earnings quality was higher before adoption. The study also found that unconditional conservatism had a positive but statistically insignificant influence on earnings quality. These results imply that IFRS implementation has neither reduced the practice of unconditional conservatism nor improved the earnings quality of manufacturing companies, and that unconditional conservatism does not significantly affect earnings quality. Thus, we concluded that the implementation of IFRS did not increase earnings quality.
Keywords: earnings quality, earnings response coefficient, IFRS adoption, unconditional conservatism
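For readers unfamiliar with the proxy, the ERC is conventionally estimated as the slope coefficient of a returns-on-unexpected-earnings regression. The specification below is only the textbook form of such a regression, stated as a hedged illustration; the variable definitions and scaling are assumptions and not necessarily the exact model estimated in this study.

```latex
% Textbook ERC regression (illustrative):
% CAR_{it}: cumulative abnormal return of firm i around the earnings announcement window
% UE_{it}:  unexpected (surprise) component of reported earnings, typically price-scaled
\[
  \mathit{CAR}_{it} \;=\; \alpha \;+\; \beta \,\mathit{UE}_{it} \;+\; \varepsilon_{it},
  \qquad \text{ERC} \;\equiv\; \beta .
\]
```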
Procedia PDF Downloads 261
1699 Dislocation and Writing: A Process of Remaking Identity
Authors: Hasti Abbasi
Abstract:
Creative writers have long followed the tradition of romantic exile, looking inward in an attempt to construct new viewpoints through the power of imagination. The writer, who attempts to resist uncertainty and locate her place in the new country through writing, resists creativity itself. For a writer, a certain satisfaction can be achieved by producing creative work away from the anxiety that accompanies the sense of dislocation. Dislocation, whether enforced or self-inflicted, could in many ways be a disaster, but it could also cultivate a greater creative capacity and become a source of creative expression. This paper investigates the idea of the creative writer as an exiled self through reflections on the relationship between dislocation and writing.
Keywords: dislocation, creative writing, remaking identity, exile literature
Procedia PDF Downloads 291
1698 Agent-Based Modeling Investigating Self-Organization in Open, Non-equilibrium Thermodynamic Systems
Authors: Georgi Y. Georgiev, Matthew Brouillet
Abstract:
This research applies the power of agent-based modeling to a pivotal question at the intersection of biology, computer science, physics, and complex systems theory: self-organization processes in open, complex, non-equilibrium thermodynamic systems. Central to this investigation is the principle of Maximum Entropy Production (MEP). This principle suggests that such systems evolve toward states that optimize entropy production, leading to the formation of structured environments. It is hypothesized that, guided by the least action principle, open thermodynamic systems identify and follow the shortest paths to transmit energy and matter, resulting in maximal entropy production, internal structure formation, and a decrease in internal entropy. Concurrently, it is predicted that system information will increase, as more information is required to describe the developing structure. To test this, an agent-based model is developed simulating an ant colony's formation of a path between a food source and its nest. Utilizing the NetLogo software for modeling and Python for data analysis and visualization, self-organization is quantified by calculating the decrease in system entropy based on the potential states and distribution of the ants within the simulated environment. External entropy production is also evaluated for information increase and efficiency improvements in the system's action. Simulations demonstrated that the system begins at maximal entropy, which decreases as the ants form paths over time. A range of system behaviors contingent upon the number of ants is observed. Notably, no path formation occurred with fewer than five ants, whereas clear paths were established by 200 ants, and saturation of path formation and entropy state was reached at populations exceeding 1000 ants. This analytical approach identified the inflection point marking the transition from disorder to order and computed the slope at this point. Combined with extrapolation to the final path entropy, these parameters yield important insights into the eventual entropy state of the system and the timeframe for its establishment, enabling the estimation of the self-organization rate. This study provides a novel perspective on the exploration of self-organization in thermodynamic systems, establishing a correlation between the internal entropy decrease rate and the external entropy production rate. Moreover, it presents a flexible framework for assessing the impact of external factors such as changes in world size, path obstacles, and friction. Overall, this research offers a robust, replicable model for studying self-organization processes in any open thermodynamic system. As such, it provides a foundation for further in-depth exploration of the complex behaviors of these systems and contributes to the development of more efficient self-organizing systems across various scientific fields.
Keywords: complexity, self-organization, agent-based modelling, efficiency
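As a rough illustration of the kind of entropy bookkeeping described above, the following Python sketch computes the Shannon entropy of the ants' spatial distribution on a discretized grid; a falling value over time would indicate self-organization. The grid size, function name, and example populations are illustrative assumptions, not the study's actual NetLogo/Python code.

```python
import numpy as np

def positional_entropy(ant_positions, grid_shape):
    """Shannon entropy (in bits) of the ants' spatial distribution.

    ant_positions: iterable of (x, y) integer cells occupied by ants.
    grid_shape:    (width, height) of the discretized world.

    A uniform spread of ants gives maximal entropy; concentration of the
    ants along a trail gives a lower value, so a decreasing entropy time
    series indicates self-organization.
    """
    counts = np.zeros(grid_shape)
    for x, y in ant_positions:
        counts[x, y] += 1
    p = counts[counts > 0] / counts.sum()      # occupation probabilities
    return float(-(p * np.log2(p)).sum())

# Example: 1000 ants scattered at random vs. confined to a short trail
rng = np.random.default_rng(0)
world = (50, 50)
scattered = list(zip(rng.integers(0, 50, 1000), rng.integers(0, 50, 1000)))
trail = [(25, y % 5) for y in range(1000)]
print(positional_entropy(scattered, world))    # high: ants spread over many cells
print(positional_entropy(trail, world))        # low: ants concentrated on a path
```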
Procedia PDF Downloads 69
1697 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference
Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade
Abstract:
In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood; this includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is being attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory
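The following Python sketch illustrates the upper-quantile computation described above for the minimum of jointly Gaussian GIC statistics, with scipy's multivariate normal CDF standing in for the role the R package "mvtnorm" plays in the paper. The candidate-model means, covariance matrix, and function name are purely illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import brentq

def min_gic_upper_quantile(mu, sigma, level=0.95):
    """Upper `level` quantile of min_i X_i for X ~ N(mu, sigma).

    Uses P(min_i X_i > q) = P(-X_i < -q for all i), so the survival
    function of the minimum is a single multivariate normal orthant
    probability, evaluated here with scipy's Gaussian-integration routine.
    """
    neg = multivariate_normal(mean=-np.asarray(mu), cov=sigma)  # distribution of -X

    def excess(q):
        # P(min_i X_i <= q) - level
        return (1.0 - neg.cdf(-q * np.ones(len(mu)))) - level

    spread = 10 * np.sqrt(np.max(np.diag(sigma)))
    return brentq(excess, min(mu) - spread, max(mu) + spread)

# Toy example: three candidate models with correlated GIC statistics
mu = [1.0, 1.5, 2.0]
sigma = np.array([[1.0, 0.6, 0.3],
                  [0.6, 1.0, 0.6],
                  [0.3, 0.6, 1.0]])
print(min_gic_upper_quantile(mu, sigma, level=0.95))
```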
Procedia PDF Downloads 90
1696 Fire Safety Assessment of At-Risk Groups
Authors: Naser Kazemi Eilaki, Carolyn Ahmer, Ilona Heldal, Bjarne Christian Hagen
Abstract:
Older people and people with disabilities are recognized as at-risk groups when it comes to egress and travel from a hazard zone to a safe place. A disability can negatively influence a person's escape time, and this becomes even more important when people from this target group live alone. This research addresses the fire safety of such people's dwellings by means of probabilistic methods. For this purpose, fire safety is assessed by modeling the egress of the target group from a hazardous zone to a safe zone. A common type of detached house with a prevalent floor plan has been chosen for the safety analysis, and a limit state function has been developed according to the time-line evacuation model, which is based on a two-zone smoke development model. An analytical computer model (B-Risk) is used to simulate smoke development. Since most of the parameters involved in the fire development model are uncertain, an appropriate probability distribution function has been assigned to each variable of indeterministic nature. To evaluate the safety and reliability of the at-risk groups, the fire safety index method has been chosen to determine the probability of failure (casualties) and the safety index (beta index). An improved harmony search meta-heuristic optimization algorithm has been used to compute the beta index. A sensitivity analysis has been carried out to identify the most important and effective parameters for the fire safety of the at-risk group. Results showed that the area of openings and the distance to egress exits are among the most important parameters, and that safety improves as the dimensions of the occupant space (building) increase. Fire growth is more critical than other parameters in a home without a detector and fire-extinguishing system, but in a home equipped with these facilities, it is less important. The type of disability has a great effect on the safety level of people who live in the same home layout, and people with a visual impairment face a higher risk of being trapped than those with movement disabilities.
Keywords: fire safety, at-risk groups, zone model, egress time, uncertainty
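A minimal Monte Carlo sketch of a time-line limit state of this kind is given below: failure occurs when the required safe egress time exceeds the available safe egress time, and a beta index is recovered from the estimated failure probability. All distributions and parameter values are illustrative assumptions, not the study's calibrated B-Risk inputs, and no harmony search optimization is performed here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical lognormal inputs (illustrative only):
# ASET: available safe egress time (s) from the two-zone smoke model
# RSET components: detection + pre-movement + travel time (s),
# with a long pre-movement time to reflect at-risk occupants
aset         = rng.lognormal(mean=np.log(180), sigma=0.25, size=n)
detection    = rng.lognormal(mean=np.log(30),  sigma=0.30, size=n)
pre_movement = rng.lognormal(mean=np.log(60),  sigma=0.40, size=n)
travel       = rng.lognormal(mean=np.log(40),  sigma=0.30, size=n)

# Limit state g = ASET - RSET: failure (potential casualty) when g < 0
g = aset - (detection + pre_movement + travel)
p_f = np.mean(g < 0)                 # Monte Carlo estimate of the failure probability
beta = -norm.ppf(p_f)                # generalized reliability (beta) index
print(f"P_f = {p_f:.4f}, beta = {beta:.2f}")
```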
Procedia PDF Downloads 104
1695 The Implications of Technological Advancements on the Constitutional Principles of Contract Law
Authors: Laura Çami (Vorpsi), Xhon Skënderi
Abstract:
In today's rapidly evolving technological landscape, the traditional principles of contract law are facing significant challenges. The emergence of new technologies, such as electronic signatures, smart contracts, and online dispute resolution mechanisms, is transforming the way contracts are formed, interpreted, and enforced. This paper examines the implications of these technological advancements on the constitutional principles of contract law. One of the fundamental principles of contract law is freedom of contract, which ensures that parties have the autonomy to negotiate and enter into contracts as they see fit. However, the use of technology in the contracting process has the potential to disrupt this principle. For example, online platforms and marketplaces often offer standard-form contracts, which may not reflect the specific needs or interests of individual parties. This raises questions about the equality of bargaining power between parties and the extent to which parties are truly free to negotiate the terms of their contracts. Another important principle of contract law is the requirement of consideration, which requires that each party receives something of value in exchange for their promise. The use of digital assets, such as cryptocurrencies, has created new challenges in determining what constitutes valuable consideration in a contract. Due to the ambiguity in this area, disagreements about the legality and enforceability of such contracts may arise. Furthermore, the use of technology in dispute resolution mechanisms, such as online arbitration and mediation, may raise concerns about due process and access to justice. The use of algorithms and artificial intelligence to determine the outcome of disputes may also raise questions about the impartiality and fairness of the process. Finally, it should be noted that technological advancements have many different and complex effects on the fundamental constitutional foundations of contract law. As technology continues to evolve, it will be important for policymakers and legal practitioners to consider the potential impacts on contract law and to ensure that the principles of fairness, equality, and access to justice are preserved in the contracting process.
Keywords: technological advancements, constitutional principles, contract law, smart contracts, online dispute resolution, freedom of contract
Procedia PDF Downloads 152
1694 Existence Solutions for Three Point Boundary Value Problem for Differential Equations
Authors: Mohamed Houas, Maamar Benbachir
Abstract:
In this paper, under weak assumptions, we study the existence and uniqueness of solutions for a nonlinear fractional boundary value problem. New existence and uniqueness results are established using the Banach contraction principle. Further existence results are obtained using Schaefer's and Krasnoselskii's fixed point theorems. At the end, some illustrative examples are presented.
Keywords: Caputo derivative, boundary value problem, fixed point theorem, local conditions
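For reference, the standard statement of the Banach contraction principle invoked for the uniqueness result is sketched below in LaTeX; the usual approach (assumed here, not spelled out in the abstract) is to recast the boundary value problem as a fixed point equation Tu = u for an equivalent integral operator T.

```latex
% Banach contraction principle (standard statement):
% Let (X, d) be a complete metric space and T : X -> X a contraction, i.e.
%   d(Tx, Ty) <= k d(x, y) for all x, y in X and some constant 0 <= k < 1.
% Then T has a unique fixed point x*, and Picard iteration converges to it:
\[
  \exists!\, x^{*} \in X \ \text{with}\ T x^{*} = x^{*},
  \qquad
  d\!\left(T^{n}x_{0}, x^{*}\right) \;\le\; \frac{k^{n}}{1-k}\, d\!\left(T x_{0}, x_{0}\right)
  \;\longrightarrow\; 0 \quad (n \to \infty), \ \ \forall x_{0} \in X .
\]
```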
Procedia PDF Downloads 430
1693 Coping with the Stress and Negative Emotions of Care-Giving by Using Techniques from Seneca, Epictetus, and Marcus Aurelius
Authors: Arsalan Memon
Abstract:
There are many challenges that a caregiver faces in everyday life. One such challenge is coping with the stress and negative emotions of caregiving. The Stoics (i.e., Lucius Annaeus Seneca [4 B.C.E. - 65 C.E.], Epictetus [50-135 C.E.], and Marcus Aurelius [121-180 C.E.]) provided coping techniques that are useful for dealing with stress and negative emotions. This paper lists and explains some of the fundamental coping techniques provided by the Stoics. For instance, some Stoic coping techniques are as follows (the list is far from exhaustive): a) mindfulness: to the best of one's ability, constantly being aware of one's thoughts, habits, desires, norms, memories, likes/dislikes, beliefs, and values, and of everything outside oneself in the world; b) constantly adjusting one's expectations in accordance with reality; c) memento mori: constantly reminding oneself that death is inevitable and that death is not to be seen as evil; and d) praemeditatio malorum: constantly detaching oneself from everything that is so dear to one, so that the least amount of suffering follows from the loss, damage, or ceasing-to-be of such things. All coping techniques will be extracted from the following original texts by the Stoics: Seneca's Letters to Lucilius, Epictetus' Discourses and the Encheiridion, and Marcus Aurelius' Meditations. One major finding is that the usefulness of each Stoic coping technique can be empirically tested by anyone, in the sense of applying it to one's own life, especially when facing real-life challenges. Another major finding is that all of the Stoic coping techniques are predicated upon, and follow from, one fundamental principle: constantly differentiate between what is and what is not in one's control. After making this differentiation, one should constantly habituate oneself to not trying to control things that are beyond one's control. For example, the following things are beyond one's control (all things being equal): death, certain illnesses, being born into a particular socio-economic family, etc. The conclusion is that if one habituates oneself by practicing, to the best of one's ability, both the fundamental Stoic principle and the Stoic coping techniques, then such habitual practice can eventually decrease the stress and negative emotions that one experiences as a caregiver.
Keywords: care-giving, coping techniques, negative emotions, Stoicism, stress
Procedia PDF Downloads 141