Search results for: tax complexity
434 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial
Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs
Abstract:
Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of a cross-over design over a conventional parallel design is its flexibility: every subject becomes its own control, thereby reducing confounding effects. Jones & Kenward discuss in detail more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model, proposed by Mwangi et al. (in press), for its first application in the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models used to estimate the treatment effect in randomized cross-over trials, namely (1) the Grizzle model and (2) the Jones & Kenward model. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probabilities were also the highest of the three models for both small and large sample sizes. A piecewise linear mixed-effects model is therefore a better estimator of the treatment effect than its two competitors (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-Treatments x 2-Periods cross-over design. Its application is extendible to more complex cross-over designs with multiple treatments and periods.
In addition, it is important to note that, even for single-response models, adding more random effects increases the complexity of the model and may thus make it difficult or impossible to fit in some cases.
Keywords: Evaluation, Grizzle model, Jones & Kenward model, Performance measures, Simulation
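The two performance measures can be illustrated with a small Monte Carlo sketch. The snippet below is not the authors' model; it uses a simple paired-difference estimator on simulated 2x2 cross-over data (all parameter values are assumptions) to show how MSE and coverage probability are computed over repeated simulations:

```python
import math
import random
import statistics

def simulate_crossover(n_subjects, tau=1.0, sigma_b=1.0, sigma_e=0.5, rng=random):
    """Simulate a 2x2 cross-over: one response per period per subject.
    The subject random effect cancels in the within-subject difference."""
    diffs = []
    for _ in range(n_subjects):
        b = rng.gauss(0.0, sigma_b)                 # subject random effect
        y_treat = b + tau + rng.gauss(0.0, sigma_e)  # treatment period
        y_ref = b + rng.gauss(0.0, sigma_e)          # reference period
        diffs.append(y_treat - y_ref)
    return diffs

def mse_and_coverage(n_subjects=10, tau=1.0, n_sim=2000, z=1.96, seed=42):
    """Estimate MSE and 95% CI coverage of the treatment-effect estimate."""
    rng = random.Random(seed)
    sq_errors, covered = [], 0
    for _ in range(n_sim):
        diffs = simulate_crossover(n_subjects, tau=tau, rng=rng)
        est = statistics.mean(diffs)
        se = statistics.stdev(diffs) / math.sqrt(len(diffs))
        sq_errors.append((est - tau) ** 2)
        if est - z * se <= tau <= est + z * se:
            covered += 1
    return statistics.mean(sq_errors), covered / n_sim
```

A better estimator, as in the paper, is one with a lower MSE and a coverage probability closer to the nominal 95%.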
Procedia PDF Downloads 120
433 Advancing Urban Sustainability through the Integration of Planning Evaluation Methodologies
Authors: Natalie Rosales
Abstract:
Based on an ethical vision which recognizes the vital role of human rights, shared values, social responsibility and justice, and environmental ethics, planning may be interpreted as a process aimed at reducing inequalities and overcoming marginality. Seen from this sustainability perspective, planning evaluation must utilize critical-evaluative and narrative-receptive models which assist different stakeholders in their understanding of the urban fabric while triggering reflexive processes that catalyze wider transformations. In this paper, this approach serves as a guide for the evaluation of Mexico's urban planning systems, and a framework is postulated to better integrate sustainability notions into planning evaluation. The paper is introduced by an overview of the current debate on evaluation in urban planning. The state of the art presented includes the different perspectives and paradigms of planning evaluation and their fundamentals and scope, which have focused on three main aspects: goal attainment (did planning instruments do what they were supposed to?); performance and effectiveness of planning (retrospective analysis of the planning process and policy analysis assessment); and the effects of process, considering decision problems and contexts rather than techniques and methods, as well as methodological innovations and improvements in planning evaluation. This comprehensive literature review provides the background for the authors' proposal of a set of general principles for evaluating urban planning, grounded in a sustainability perspective. In the second part, a description of the shortcomings of the approaches used to evaluate urban planning in Mexico sets the basis for highlighting the need for regulatory and instrumental, but also explorative and collaborative, approaches.
Such combined approaches respond to the inability of isolated methods to capture planning complexity, and they strengthen the usefulness of the evaluation process in improving the coherence and internal consistency of planning practice itself. In the third section, the general proposal for evaluating planning is described in its main aspects. It presents an innovative methodology for establishing a more holistic and integrated assessment which considers the interdependence between values, levels, roles and methods, and incorporates different stakeholders in the evaluation process. By doing so, this piece of work sheds light on how to advance urban sustainability through the integration of evaluation methodologies into planning.
Keywords: urban planning, evaluation methodologies, urban sustainability, innovative approaches
Procedia PDF Downloads 473
432 An Investigation on Interactions between Social Security with Police Operation and Economics in the Field of Tourism
Authors: Mohammad Mahdi Namdari, Hosein Torki
Abstract:
Security, as an abstract concept, has concerned human beings from the beginning of creation to the present, and certainly will into the future. Accordingly, battles, conflicts, challenges, legal proceedings, crimes and all issues related to humankind are associated with this concept. Today, when people are interviewed about their lives, the security of their societies and social crime come up as well. Alongside security as an infrastructural and vital concept, the economy and its related issues, e.g. welfare, per capita income, total government revenue, exports, imports, etc., constitute another infrastructural and vital concept. These two vital concepts (security and economy) are linked in complex and significant ways. The present study employs an analytical-descriptive research method using documents and statistics from official sources. Discovering and explaining this mutual connection requires profound and extensive research, so managing, developing and reforming the systems and relationships within the scope of these two concepts is complex and difficult. Tourism, and its position in today's economy, is one of the main pillars of the economy of the 21st century, and it may be associated with security and social crime more than the other pillars. Like all human activities, the economies of societies, and tourism in particular, depend on security, especially public and social security. On the other hand, true economic development (in general) and the growth of the tourism industry (in particular) generate and support security, because a dynamic economic infrastructure prevents the formation of centers of crime and illegal activity by providing a context for socio-economic development for all segments of society in a fair and humane manner. This relationship captures the complexity of the link between the two concepts of economy and security.
The police, as a visible, people-oriented organization in the field of security, are directly linked with the economy of a community and are very influential with respect to the tourism industry. The relationship between security, national crime indices, and economic indicators, especially those related to tourism, confirms the discussion above, which is notable. Accordingly, understanding the processes around security and the economy as two key and vital concepts is necessary and significant for the sovereignty of governments.
Keywords: economic, police, tourism, social security
Procedia PDF Downloads 321
431 Synthesis and Two-Photon Polymerization of a Cytocompatible Tyramine-Functionalized Hyaluronic Acid Hydrogel That Mimics the Chemical, Mechanical, and Structural Characteristics of Spinal Cord Tissue
Authors: James Britton, Vijaya Krishna, Manus Biggs, Abhay Pandit
Abstract:
Regeneration of the spinal cord after injury remains a great challenge due to the complexity of this organ. Inflammation and gliosis at the injury site hinder the outgrowth of axons and hence prevent synaptic reconnection and reinnervation. Hyaluronic acid (HA) is the main component of the spinal cord extracellular matrix and plays a vital role in cell proliferation and axonal guidance. In this study, we have synthesized and characterized a photo-cross-linkable HA-tyramine (tyr) hydrogel from a chemical, mechanical, electrical, biological and structural perspective. From our experimentation, we have found that HA-tyr can be synthesized with controllable degrees of tyramine substitution using click chemistry. The complex modulus (G*) of HA-tyr can be tuned to mimic the mechanical properties of the native spinal cord via optimization of the photo-initiator concentration and UV exposure. We have examined the degree of tyramine-tyramine covalent bonding (polymerization) as a function of UV exposure and photo-initiator use via photo- and nuclear magnetic resonance spectroscopy. Both swelling and enzymatic degradation assays were conducted to examine the resilience of our 3D-printed hydrogel constructs in vitro. Using a femtosecond 780 nm laser, the two-photon polymerization of the HA-tyr hydrogel in the presence of a riboflavin photoinitiator was optimized. A laser power of 50 mW and a scan speed of 30,000 μm/s produced high-resolution spatial patterning within the hydrogel with sustained mechanical integrity. Using dorsal root ganglion explants, the cytocompatibility of photo-crosslinked HA-tyr was assessed. Using potentiometry, the electrical conductivity of photo-crosslinked HA-tyr was assessed and compared to that of native spinal cord tissue as a function of frequency.
In conclusion, we have developed a biocompatible hydrogel that can be used for photolithographic 3D printing to fabricate tissue-engineered constructs for neural tissue regeneration applications.
Keywords: 3D printing, hyaluronic acid, photolithography, spinal cord injury
Procedia PDF Downloads 151
430 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a large volume of data that is often not used to inform the final customer. Some of these data, if fittingly identified and used, can benefit a single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, the world has seen a modification of buying models: customers are attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability and food information needs. The Internet of Things (IoT) and analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies engaged in a traceability path, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by different actors along the agri-food supply chain. IoT technologies for data collection and analytics techniques for data processing supply information useful for increasing intra-company efficiency and competitiveness in the market.
All of the recovered information can be presented through IT solutions and mobile applications, making it accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
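As a minimal sketch of the kind of product-data record such a framework manages, the snippet below chains hypothetical supply-chain events into a traceability history that could be exposed to the consumer; the actor roles, activities, and lot identifier are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TraceEvent:
    actor: str       # hypothetical supply-chain role, e.g. grower or processor
    activity: str    # what was done to the product at this step
    timestamp: str   # ISO 8601 time the event was recorded

@dataclass
class ProductTrace:
    """Append-only record of what happened to one product lot."""
    product_id: str
    events: list = field(default_factory=list)

    def record(self, actor, activity):
        self.events.append(
            TraceEvent(actor, activity, datetime.now(timezone.utc).isoformat())
        )

    def history(self):
        """Compact (actor, activity) view, e.g. for a consumer-facing app."""
        return [(e.actor, e.activity) for e in self.events]

trace = ProductTrace("LOT-2021-001")
trace.record("grower", "harvest")
trace.record("processor", "washing and packaging")
trace.record("distributor", "cold-chain transport")
```

In a real deployment the `record` calls would be driven by IoT sensor readings rather than manual entries.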
Procedia PDF Downloads 146
429 Literature Review and Approach for the Use of Digital Factory Models in an Augmented Reality Application for Decision Making in Restructuring Processes
Authors: Rene Hellmuth, Jorg Frohnmayer
Abstract:
The requirements of factory planning and of the buildings concerned have changed in recent years. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is gaining importance in order to maintain the competitiveness of a factory. Even today, the methods and process models used in factory planning are predominantly based on the classical planning principles of Schmigalla, Aggteleky and Kettner, which, however, are not specifically designed for reorganization. In addition, they are designed for a largely static environmental situation and a manageable planning complexity, as well as for medium- to long-term planning cycles with low variability of the factory. Existing approaches already regard factory planning as a continuous process that makes it possible to react quickly to adaptation requirements. However, digital factory models are not yet used as a source of information for building data. Approaches which consider building information modeling (BIM) or digital factory models in general either do not refer to factory conversions or do not yet go beyond a concept. This deficit can be further substantiated: a method for factory conversion planning using a current digital building model is lacking. A corresponding approach must take into account both the existing approaches to factory planning and the use of digital factory models in practice. A literature review is conducted first, examining approaches to classic factory planning and to conversion planning, and investigating which approaches already incorporate digital factory models. In the second step, an approach is presented for using digital factory models based on building information modeling as a basis for augmented reality tablet applications.
This application is suitable for construction sites and provides information on the costs and time required for conversion variants, thus supporting fast decision-making. In summary, the paper provides an overview of existing factory planning approaches and critically examines the use of digital tools. Based on this preliminary work, an approach is presented which suggests the sensible use of digital factory models for decision support when evaluating conversion variants of the factory building. The augmented reality application is designed to summarize the most important information for decision-makers during a reconstruction process.
Keywords: augmented reality, digital factory model, factory planning, restructuring
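A minimal sketch of the cost-and-time decision support such a tablet application could offer is shown below; the conversion variants, costs, and durations are invented for illustration, and a real implementation would derive these figures from the digital factory model:

```python
def rank_variants(variants, max_duration_days=None):
    """Rank conversion variants by estimated cost, optionally keeping
    only those that fit within a maximum conversion duration."""
    feasible = [
        v for v in variants
        if max_duration_days is None or v["duration_days"] <= max_duration_days
    ]
    return sorted(feasible, key=lambda v: v["cost_eur"])

# Hypothetical conversion variants for one factory building
variants = [
    {"name": "variant A", "cost_eur": 120_000, "duration_days": 30},
    {"name": "variant B", "cost_eur": 95_000, "duration_days": 45},
    {"name": "variant C", "cost_eur": 140_000, "duration_days": 20},
]

# Cheapest variant that still meets a 40-day deadline
best = rank_variants(variants, max_duration_days=40)[0]
```

The augmented reality layer would then overlay `best` (and the runners-up) on the building model for the decision-makers.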
Procedia PDF Downloads 137
428 Rendering Cognition Based Learning in Coherence with Development within the Context of PostgreSQL
Authors: Manuela Nayantara Jeyaraj, Senuri Sucharitharathna, Chathurika Senarath, Yasanthy Kanagaraj, Indraka Udayakumara
Abstract:
PostgreSQL is an Object-Relational Database Management System (ORDBMS) that has been in existence for a long while. Despite the superior features it packages for managing databases and data, the database community has not fully realized the importance and advantages of PostgreSQL. Hence, this research focuses on providing a better development environment for PostgreSQL in order to encourage its utilization and elucidate its importance. PostgreSQL is also known as the world's most advanced SQL-compliant open-source ORDBMS. However, users have not yet turned to PostgreSQL, because its workings remain hidden under layers and its persistently textual environment is complex for an introductory user. Simply stated, there is a dire need for an easy way of helping users comprehend the procedures and standards by which databases are created, tables and the relationships among them are defined, and queries are manipulated and their flow controlled based on conditions in PostgreSQL, so that the community turns to PostgreSQL at an increased rate. Hence, this research first identifies the dominant features provided by PostgreSQL over its competitors. Following the identified merits, an analysis of why the database community is hesitant to migrate to PostgreSQL's environment is carried out. These findings are modulated and tailored based on the scope and the constraints discovered. As a result, the research proposes a system that serves both as a design platform and as a learning tool, providing an interactive method of learning via a visual editor mode and incorporating a textual editor for well-versed users. The study is based on conjuring viable solutions that analyze a user's cognitive perception in comprehending human-computer interfaces and the behavioural processing of design elements.
By providing a visually draggable and manipulable environment for working with PostgreSQL databases and table queries, it is expected to highlight the elementary features offered by PostgreSQL over existing systems, in order to convey the importance and simplicity it offers to a hesitant user.
Keywords: cognition, database, PostgreSQL, text-editor, visual-editor
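As an illustration of what a visual editor mode might generate behind the scenes, the sketch below renders PostgreSQL CREATE TABLE DDL from a simple table description; the helper function and the example schema are hypothetical, not part of the proposed system:

```python
def create_table_sql(table, columns, foreign_keys=()):
    """Render PostgreSQL CREATE TABLE DDL from a visual-editor model.
    `columns` is a sequence of (name, type) pairs;
    `foreign_keys` holds (column, ref_table, ref_column) triples."""
    parts = [f"{name} {ctype}" for name, ctype in columns]
    parts += [
        f"FOREIGN KEY ({col}) REFERENCES {ref_table} ({ref_col})"
        for col, ref_table, ref_col in foreign_keys
    ]
    return f"CREATE TABLE {table} (\n    " + ",\n    ".join(parts) + "\n);"

# A table and relationship a user might assemble by drag-and-drop
ddl = create_table_sql(
    "orders",
    [("id", "serial PRIMARY KEY"),
     ("customer_id", "integer"),
     ("total", "numeric(10,2)")],
    foreign_keys=[("customer_id", "customers", "id")],
)
```

Showing the generated DDL alongside the diagram is one way a visual editor can teach the underlying textual standard rather than hide it.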
Procedia PDF Downloads 282
427 Environmental Conditions Simulation Device for Evaluating Fungal Growth on Wooden Surfaces
Authors: Riccardo Cacciotti, Jiri Frankl, Benjamin Wolf, Michael Machacek
Abstract:
Moisture fluctuations govern the occurrence of fungi-related problems in buildings, which may impose significant health risks for users and even lead to structural failures. Several numerical engineering models attempt to capture the complexity of mold growth on building materials. From real-life observations, in cases with suppressed daily variations of boundary conditions, e.g. in crawlspaces, mold growth model predictions correspond well with the observed mold growth. On the other hand, in cases with substantial diurnal variations of boundary conditions, e.g. in the ventilated cavity of a cold flat roof, mold growth predicted by the models is significantly overestimated. This study, funded by the Grant Agency of the Czech Republic (GAČR 20-12941S), aims at gaining a better understanding of mold growth behavior on solid wood under varying boundary conditions. In particular, the experimental investigation focuses on the response of mold to changing conditions in the boundary layer and its influence on heat and moisture transfer across the surface. The main result is the design and construction, at the facilities of ITAM (Prague, Czech Republic), of an innovative device allowing for the simulation of changing environmental conditions in buildings. It consists of a closed circuit of square section with rough outer dimensions of 200 × 180 cm and a cross section of roughly 30 × 30 cm. The circuit is thermally insulated and equipped with an electric fan to control the air flow inside the tunnel and a heat and humidity exchange unit to control the internal relative humidity and variations in temperature. Several measuring points, including an anemometer, temperature and humidity sensors, and a load cell in the test section for recording mass changes, are provided to monitor the variations of these parameters during the experiments. The research is ongoing, and the final results of the experimental investigation are expected at the end of 2022.
Keywords: moisture, mold growth, testing, wood
Procedia PDF Downloads 129
426 The MicroRNA-2110 Suppressed Cell Proliferation and Migration Capacity in Hepatocellular Carcinoma Cells
Authors: Pelin Balcik Ercin
Abstract:
Introduction: ZEB2, a member of the ZEB transcription factor family, has a role in the epithelial-to-mesenchymal transition during development and metastasis. Altered expression of circulating extracellular miRNAs is observed in disease, and extracellular miRNAs play an important role in the cancer cell microenvironment. In a ChIP-Seq study, the expression of miR-2110 was found to be regulated by ZEB2. In this study, the effects of miR-2110 on the proliferation and migration of hepatocellular carcinoma (HCC) cells were examined. Material and Methods: SNU398 cells were transfected with mimic miR-2110 (20 nM) (HMI0375, Sigma-Aldrich) or negative control miR (HMC0002, Sigma-Aldrich). MicroRNA isolation was accomplished with the miRVANA isolation kit according to the manufacturer's instructions. cDNA synthesis was then performed, and expression was calibrated against the Ct values of the controls. The real-time quantitative PCR (RT-qPCR) reaction was performed using the TaqMan Fast Advanced Master Mix (Thermo Sci.). Ct values of miR-2110 were normalized to miR-186-5p and miR-16-5p as intracellular reference genes. Cell proliferation was analyzed with the xCELLigence RTCA System. The wound healing assay was analyzed with the ImageJ program, and relative fold changes were calculated. Results: The mimic-miR-2110-transfected SNU398 cells expressed nearly nine-fold (log2) more miR-2110 than the negative-control-transfected cells. Proliferation of the mimic-miR-2110-transfected HCC cells was significantly inhibited compared to the negative control cells. Furthermore, the migration capacity of miR-2110-SNU398 cells decreased approximately four-fold relative to negative-control-miR-SNU398 cells. Conclusion: Our results suggest that miR-2110 inhibited cell proliferation and negatively affected cell migration compared to the control groups in HCC cells. These data illustrate the complexity of the regulation between microRNAs and EMT transcription factors.
These initial results point to the potential of miR-2110 as a predictive biomarker in HCC.
Keywords: epithelial to mesenchymal transition, EMT, hepatocellular carcinoma cells, micro-RNA-2110, ZEB2
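The reported expression changes follow the standard 2^-ΔΔCt normalization: the target Ct is normalized to the mean Ct of the reference miRNAs and then calibrated against the control cells. The sketch below illustrates the arithmetic with invented Ct values (the study's raw readings are not given here):

```python
import statistics

def log2_fold_change(ct_target, ct_refs, ct_target_ctrl, ct_refs_ctrl):
    """Relative expression by the 2^-ddCt method.
    The target Ct is normalized to the mean Ct of the reference miRNAs
    (in the study, miR-186-5p and miR-16-5p) and calibrated against the
    negative-control cells. Returns the log2 fold change."""
    d_ct = ct_target - statistics.mean(ct_refs)                  # sample dCt
    d_ct_ctrl = ct_target_ctrl - statistics.mean(ct_refs_ctrl)   # calibrator dCt
    return -(d_ct - d_ct_ctrl)                                   # log2(2^-ddCt)

# Hypothetical Ct values: mimic-transfected vs negative-control cells
lfc = log2_fold_change(ct_target=18.0, ct_refs=[22.0, 23.0],
                       ct_target_ctrl=27.0, ct_refs_ctrl=[22.0, 23.0])
```

With these invented numbers the function returns a log2 fold change of 9, matching the scale of the roughly nine-fold (log2) overexpression reported for the mimic-transfected cells.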
Procedia PDF Downloads 123
425 Women and Terrorism in Nigeria: Policy Templates for Addressing Complex Challenges in a Changing Democratic State
Authors: Godiya Pius Atsiya
Abstract:
One of the most devastating impacts of terrorism on the Nigerian state is the danger it has posed to women, children and other vulnerable groups. The complexity of terrorism in Nigeria, especially in most parts of Northern Nigeria, has entrenched unprecedented security challenges such as a refugee crisis, kidnapping, food shortages, increased death tolls, malnutrition, fear, rape and several other psychological burdens. Of particular interest in this paper, as it relates to terrorism, is the high number of Internally Displaced Persons (IDPs), with women, children and the aged being the most affected. Empirical evidence arising from recent developments in Nigeria's North-East geo-political zone shows that the number of refugees fleeing the Boko Haram attacks has doubled. The attendant consequence of this mass exodus of people in the affected areas is that the victims now suffer untold and unwarranted economic hardship. In another dimension, recent findings show that many powerless women and young teenage girls have been forcefully conscripted into Islamic extremist groups and used as shields. In some respects, these groups of people have been used as available tools for suicide bombing and other criminal acts, the result of which can be detrimental to social cohesion and integration. This work is a theoretical insight into terrorism discourses; hence, the paper relies on the existing works of scholars in carrying out the research. The paper argues that the implications of terrorism for women have profound effects on the moral psyche of women, who are supposed to be home managers and custodians of morality in society. The burden of terrorism and all it tends to propagate has literally upturned social lives, and hence Nigeria is gradually being plunged into the Hobbesian state of nature.
To resolve this social malaise, the paper submits that government and, indeed, all stakeholders in the nation's democratic project must expedite action to nip this trend in the bud. The paper sums up with a conclusion and alternative policy measures to mitigate the challenges of terrorism in Nigeria.
Keywords: changing democratic state, policy measures, terrorism, women
Procedia PDF Downloads 230
424 Dosimetry in Interventional Radiology Examinations for Occupational Exposure Monitoring
Authors: Ava Zarif Sanayei, Sedigheh Sina
Abstract:
Interventional radiology (IR) uses imaging guidance, including X-rays and CT scans, to deliver therapy precisely. Most IR procedures are performed under local anesthesia and start with a small needle being inserted through the skin, which is why they may be called pinhole surgery or image-guided surgery. There is increasing concern about radiation exposure during interventional radiology procedures due to procedure complexity. The basic aim of optimizing radiation protection, as outlined in ICRP 139, is to strike a balance between image quality and radiation dose while maximizing benefits, ensuring that diagnostic interpretation remains satisfactory. This study aims to estimate the equivalent doses to the main trunk of the body for the interventional radiologist and the superintendent using LiF:Mg,Ti (TLD-100) chips at the IR department of a hospital in Shiraz, Iran. In the initial stage, the dosimeters were calibrated with the use of various phantoms. Afterward, a group of dosimeters was prepared and then worn for three months. To measure the personal equivalent dose to the body, three TLD chips were put in a tissue-equivalent badge and worn under a protective lead apron. At the end of this period, the TLDs were read out by a TLD reader. The results revealed that these individuals received equivalent doses of 387.39 and 145.11 µSv, respectively. The findings of this investigation revealed that the total radiation exposure of the staff was less than the annual limit for occupational exposure. However, it is imperative to implement appropriate radiation protection measures. Although the dose received by the interventional radiologist is noticeable, this may be due to the use of conventional equipment with over-couch X-ray tubes for interventional procedures.
It is therefore important to use dedicated equipment and protective means such as glasses and screens, whenever compatible with the intervention, when they are available, or to have them fitted to the equipment if they are not present. Based on the results, staff positioning contributed to the dose received by the radiologist. Manufacturing and installing movable lead curtains with a thickness of 0.25 millimeters can effectively minimize the radiation dose to the body. Providing adequate training on radiation safety principles, particularly for technologists, can be an optimal approach to further decreasing exposure.
Keywords: interventional radiology, personal monitoring, radiation protection, thermoluminescence dosimetry
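The conversion from TLD chip readings to an equivalent dose, and the comparison against the annual occupational limit, can be sketched as follows; the readings and calibration factor below are illustrative assumptions, not the study's calibration data:

```python
def equivalent_dose_usv(tld_readings, calibration_factor, background_usv=0.0):
    """Convert raw TLD chip readings to an equivalent dose in microsieverts.
    The chips worn together are averaged, then scaled by a phantom-derived
    calibration factor; an optional background dose is subtracted."""
    mean_reading = sum(tld_readings) / len(tld_readings)
    return mean_reading * calibration_factor - background_usv

# Hypothetical readings from the three chips in one tissue-equivalent badge
dose = equivalent_dose_usv([128.5, 130.1, 129.7], calibration_factor=3.0)

# ICRP occupational dose limit: 20 mSv per year (averaged over 5 years)
annual_limit_usv = 20_000
below_limit = dose < annual_limit_usv
```

Scaled to a full year, a quarterly badge dose of a few hundred microsieverts remains far below the occupational limit, consistent with the study's conclusion.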
Procedia PDF Downloads 61
423 Effectiveness with Respect to Time-To-Market and the Impacts of Late-Stage Design Changes in Rapid Development Life Cycles
Authors: Parth Shah
Abstract:
The author examines the recent trend in which business organizations significantly reduce their development cycle times to stay competitive in today's global marketplace. The author proposes a rapid systems engineering framework to address late design changes and allow for flexibility (i.e., to react to unexpected or late changes and their impacts) during the product development cycle using a systems engineering approach. A systems engineering approach is crucial in today's product development for delivering complex products to the marketplace. Design changes can occur due to shortened timelines and also based on initial consumer feedback once a product or service is in the marketplace. The ability to react to change and address customer expectations in a responsive and cost-efficient manner is crucial for any organization to succeed. Past literature, research, and methods such as concurrent development, simultaneous engineering, knowledge management, component sharing, rapid product integration, tailored systems engineering processes, and studies on reducing product development cycles all suggest that a research gap exists in specifically addressing late design changes due to the shortening of life cycles in increasingly competitive markets. The author's research suggests that 1) product development cycle times are now measured in months instead of years, 2) more and more products have interdependent systems and environments that are fast-paced and resource-critical, 3) product obsolescence is higher and more organizations are releasing products and services frequently, and 4) increasingly competitive markets are leading to customization based on consumer feedback.
The author quantifies effectiveness with respect to success factors such as time-to-market, return on investment, life cycle time, and flexibility in late design changes, as a function of the complexity of the product or service, the number of late changes, and the ability to react to and reduce late design changes.
Keywords: product development, rapid systems engineering, scalability, systems engineering, systems integration, systems life cycle
Procedia PDF Downloads 203
422 SynKit: An Event-Driven and Scalable Microservices-Based Kitting System
Authors: Bruno Nascimento, Cristina Wanzeller, Jorge Silva, João A. Dias, André Barbosa, José Ribeiro
Abstract:
The increasing complexity of logistics operations stems from evolving business needs, such as the shift from mass production to mass customization, which demands greater efficiency and flexibility. In response, Industry 4.0 and 5.0 technologies provide improved solutions to enhance operational agility and better meet market demands. The management of kitting zones, combined with the use of Autonomous Mobile Robots (AMRs), faces challenges related to coordination, resource optimization, and rapid response to fluctuations in customer demand. Additionally, implementing lean manufacturing practices in this context must be carefully orchestrated by intelligent systems and human operators to maximize efficiency without sacrificing the agility required in an advanced production environment. This paper proposes and implements a microservices-based architecture integrating principles from Industry 4.0 and 5.0 with lean manufacturing practices. The architecture enhances communication and coordination between autonomous vehicles and kitting management systems, allowing more efficient resource utilization and increased scalability. The proposed architecture focuses on the modularity and flexibility of operations, enabling seamless adaptation to changing demands and the efficient allocation of resources in real time. This approach is expected to significantly improve the efficiency and scalability of logistics operations by reducing waste and optimizing resource use while improving responsiveness to demand changes. The implementation of this architecture provides a robust foundation for the continuous evolution of kitting management and process optimization. It is designed to adapt to dynamic environments marked by rapid shifts in production demands and real-time decision-making.
It also ensures seamless integration with automated systems, aligning with Industry 4.0 and 5.0 needs while reinforcing lean manufacturing principles.
Keywords: microservices, event-driven, kitting, AMR, lean manufacturing, industry 4.0, industry 5.0
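A minimal sketch of event-driven coordination between a kitting service and AMR dispatch is shown below; the topic names and payloads are hypothetical, and a production system would use a message broker between microservices rather than an in-memory bus:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory publish/subscribe bus. In a real microservices
    deployment this role is played by a broker such as Kafka or RabbitMQ."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

dispatched = []
bus = EventBus()

# The AMR-dispatch service reacts to kit-request events from the kitting zone
bus.subscribe("kit.requested",
              lambda p: dispatched.append(("amr_dispatch", p["kit_id"])))

# A workstation emits a demand event; the subscriber reacts asynchronously
bus.publish("kit.requested", {"kit_id": "KIT-042", "station": "assembly-3"})
```

Decoupling producers from consumers through topics is what lets new services (e.g. a waste-monitoring service supporting lean practices) be added without changing the publishers.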
Procedia PDF Downloads 20
421 Cognitive Dissonance in Robots: A Computational Architecture for Emotional Influence on the Belief System
Authors: Nicolas M. Beleski, Gustavo A. G. Lugo
Abstract:
Robotic agents are taking on more and increasingly important roles in society. In order to make these robots and agents more autonomous and efficient, their systems have grown considerably complex and convoluted. This growth in complexity has led recent researchers to investigate ways to explain the AI behavior behind these systems in search of more trustworthy interactions. A current problem in explainable AI is the inner workings of the logic inference process and how to conduct a sensitivity analysis of the process of valuation and alteration of beliefs. In a social HRI (human-robot interaction) setup, theory of mind is crucial to ease the intentionality gap, and to achieve that, we should be able to infer over observed human behaviors, such as cases of cognitive dissonance. One specific case inspired by human cognition is the role emotions play in our belief system and the effects caused when observed behavior does not match the expected outcome. In such scenarios, emotions can make a person wrongly assume the antecedent P for an observed consequent Q and, as a result, incorrectly assert that P is true. This form of cognitive dissonance, where an unproven cause is taken as truth, induces changes in the belief base which can directly affect future decisions and actions. If we aim to be inspired by human thought in order to apply levels of theory of mind to these artificial agents, we must find the conditions to replicate these observable cognitive mechanisms. To achieve this, a computational architecture is proposed to model the modulating effect emotions have on the belief system, how it affects the logic inference process, and, consequently, the decision making of an agent. To validate the model, an experiment based on the prisoner's dilemma is currently under development.
The hypothesis to be tested involves two main points: how emotions, modeled as internal argument strength modulators, can alter inference outcomes, and how explainable outcomes can be produced under specific forms of cognitive dissonance. Keywords: cognitive architecture, cognitive dissonance, explainable AI, sensitivity analysis, theory of mind
Procedia PDF Downloads 130
420 The Basin Management Methodology for Integrated Water Resources Management and Development
Authors: Julio Jesus Salazar, Max Jesus De Lama
Abstract:
The challenges of water management are aggravated by global change, which implies high complexity and associated uncertainty; water management is difficult because water networks cross domains (natural, societal, and political), scales (space, time, jurisdictional, institutional, knowledge, etc.) and levels (area: patches to global; knowledge: a specific case to generalized principles). In this context, we need to apply both natural and non-natural measures to manage water and soil. The Basin Management Methodology considers multifunctional measures of natural water retention, erosion control, and soil formation to protect water resources and address the challenges related to the recovery or conservation of the ecosystem, as well as the natural characteristics of water bodies, in order to improve the quantitative status of water bodies and reduce vulnerability to floods and droughts. This method of water management focuses on positive impacts on the chemical and ecological status of water bodies and the restoration of ecosystem functioning and its natural services, thus contributing to both adaptation and mitigation of climate change. This methodology was applied in 7 interventions in the sub-basin of the Shullcas River in Huancayo-Junín-Peru, obtaining great benefits in the framework of the participation of alliances of actors and integrated planning scenarios. To implement the methodology in the sub-basin of the Shullcas River, a process called Climate Smart Territories (CST) was used, with which the variables were characterized in a highly complex space. The diagnosis was then developed using risk management and adaptation to climate change. Finally, the process concluded with the selection of alternatives and projects of this type.
Therefore, the CST approach and process face the challenges of climate change through integrated, systematic, interdisciplinary and collective responses at different scales that fit the needs of ecosystems and their services that are vital to human well-being. This methodology is now being replicated at the level of the Mantaro river basin, improving with other initiatives that lead to the model of a resilient basin. Keywords: climate-smart territories, climate change, ecosystem services, natural measures, Climate Smart Territories (CST) approach
Procedia PDF Downloads 149
419 An Investigative Study into Good Governance in the Non-Profit Sector in South Africa: A Systems Approach Perspective
Authors: Frederick M. Dumisani Xaba, Nokuthula G. Khanyile
Abstract:
There is a growing demand for greater accountability, transparency and ethical conduct based on sound governance principles in the developing world. Funders, donors and sponsors are increasingly demanding more transparency, better value for money and adherence to good governance standards. The drive towards improved governance measures is largely influenced by the need to ‘plug the leaks’, deal with malfeasance, engender greater levels of accountability and good governance, and ultimately attract further funding or investment. This is the case with Non-Profit Organizations (NPOs) in South Africa in general, and in the province of KwaZulu-Natal in particular. The paper draws from good governance theory, stakeholder theory and systems thinking to critically examine the requirements for good governance in the NPO sector from a theoretical and legislative point of view and to systematically look at the contours of governance currently found among NPOs. The paper did this through a rigorous examination of vignettes of cases of governance among selected NPOs based in KwaZulu-Natal. The study used qualitative and quantitative research methodologies through document analysis, literature review, semi-structured interviews, focus groups and statistical analysis of various primary and secondary sources. It found some cases of good governance but also alarming levels of poor governance. There was exponential growth in NPOs registered during the period under review; equally, there was an increase in cases of non-compliance with good governance practices. NPOs operate in an increasingly complex environment. There is contestation for influence and access to resources. Stakeholder management is poorly conceptualized and executed.
Recognizing that the NPO sector operates in an environment characterized by complexity, constant change, unpredictability, contestation, diversity and the divergent views of different stakeholders, there is a need to apply legislative and systems thinking approaches to strengthen governance to withstand this turbulence through a capacity development model that recognizes these contextual and environmental challenges. Keywords: good governance, non-profit organizations, stakeholder theory, systems theory
Procedia PDF Downloads 120
418 Constructivism and Situational Analysis as Background for Researching Complex Phenomena: Example of Inclusion
Authors: Radim Sip, Denisa Denglerova
Abstract:
It is impossible to capture complex phenomena, such as inclusion, with reductionism. The most common form of reductionism is the objectivist approach, where processes and relationships are reduced to entities and clearly outlined phases, with a consequent search for relationships between them. Constructivism as a paradigm and situational analysis as a methodological research portfolio represent a way to avoid the dominant objectivist approach. They work with a situation, i.e., with the essential blending of actors and their environment. Primary transactions take place between actors and their surroundings. Researchers create constructs based on their need to solve a problem. Concepts therefore do not describe reality, but rather a complex of real needs in relation to the available options for how such needs can be met. For the examination of a complex problem, corresponding methodological tools and an overall research design are necessary. Using original research on inclusion in the Czech Republic as an example, this contribution demonstrates that inclusion is not a substance easily described, but rather a relationship field changing its forms in response to its actors’ behaviour and current circumstances. Inclusion consists of a dynamic relationship between an ideal, real circumstances, and ways to achieve that ideal under the given circumstances. Such achievement takes many shapes and thus cannot be captured by a description of objects. It can be expressed in relationships in a situation defined by time and space. Situational analysis offers tools to examine such phenomena. It understands a situation as a complex of dynamically changing aspects and prefers relationships and positions in the given situation over a clear and final definition of actors, entities, etc. Situational analysis assumes the creation of constructs as a tool for solving the problem at hand.
It emphasizes the meanings that arise in the process of coordinating human actions, and the discourses through which these meanings are negotiated. Finally, it offers “cartographic tools” (situational maps, social worlds/arenas maps, positional maps) that are able to capture complexity in other than linear-analytical ways. This approach allows inclusion to be described as a complex of phenomena taking place with a certain historical preference, a complex that can be overlooked if analyzed with a more traditional approach. Keywords: constructivism, situational analysis, objective realism, reductionism, inclusion
Procedia PDF Downloads 146
417 New Insights into Ethylene and Auxin Interplay during Tomato Ripening
Authors: Bruna Lima Gomes, Vanessa Caroline De Barros Bonato, Luciano Freschi, Eduardo Purgatto
Abstract:
Plant hormones have long been known to be tightly associated with fruit development and are involved in controlling various aspects of fruit ripening. For fleshy fruits, ripening is characterized by changes in texture, color, aroma and other parameters that markedly contribute to quality. Ethylene is one of the major players regulating ripening-related processes, but emerging evidence suggests that auxin is also part of this dynamic control. Thus, the aim of this study was to provide new insights into the role of auxin during ripening and the hormonal interplay between auxin and ethylene. For that, tomato fruits (Micro-Tom) were collected at the mature green stage and separated into four groups: one for indole-3-acetic acid (IAA) treatment, one for ethylene, one for a combination of IAA and ethylene, and one for control. Hormone solution was injected through the stylar apex, while mock samples were injected with buffer only. For ethylene treatments, fruits were exposed to the gaseous hormone. The fruits were then left to ripen under standard conditions; to assess ripening development, the hue angle was recorded as a color indicator, and ethylene production was measured by gas chromatography. The transcript levels of three ripening-related ethylene receptors (LeETR3, LeETR4 and LeETR6) were evaluated by RT-qPCR. Results showed that ethylene treatment induced ripening, stimulated ethylene production, accelerated color changes and induced receptor expression, as expected. Nonetheless, auxin treatment showed the opposite effect, since fruits remained green for a longer time than the control group and ethylene perception changed, taking into account the reduced levels of receptor transcripts. Further, treatment with both hormones revealed that the auxin effect in delaying ripening was predominant, even with higher levels of ethylene. Altogether, the data suggest that auxin modulates several aspects of tomato fruit ripening by modifying ethylene perception.
Knowledge of the hormonal control of fruit development will help in designing new strategies for the effective manipulation of ripening with regard to fruit quality, and it brings a new level of complexity to the regulation of fruit ripening. Keywords: ethylene, auxin, fruit ripening, hormonal crosstalk
Procedia PDF Downloads 458
416 A Case Study on an Integrated Analysis of Well Control and Blowout Accident
Authors: Yasir Memon
Abstract:
The complexity and challenges of the offshore industry are greater than in the past, and the oil and gas industry expands every day by meeting these challenges. More challenging wells, longer and deeper, are being drilled in today’s environment. Blowout prevention holds worthy importance in the oil and gas world. In the early years, when the oil and gas industry was growing, drilling operations were extremely dangerous: there was no technology to determine reservoir pressure, and drilling was hence a blind operation. A blowout arises when uncontrolled reservoir pressure enters the wellbore. The potential of a blowout in the oil industry is a danger to both the environment and human life; blowouts result in environmental damage, penalties from state/country regulators, and loss of capital investment. There are many cases of blowout in the oil and gas industry that have caused damage to both humans and the environment. Huge capital investments are being made all over the world to stop blowouts from happening and to keep damage at the lowest possible level. The objective of this study is to promote safety and good resources to assure safety and environmental integrity in all operations during drilling. This study shows that human error and management failure are the main causes of blowout; therefore, proper management with the wise use of precautions, prevention methods or controlling techniques can reduce the probability of blowout to a minimum level. It also discusses basic procedures, concepts and equipment involved in well control methods and the various steps used under various conditions. Furthermore, another aim of this work is to highlight the role of management in oil and gas operations.
Moreover, this study analyzes the causes of the blowout of the Macondo well, which occurred in the Gulf of Mexico on April 20, 2010. It delivers recommendations and analysis of various aspects of well control methods and provides a list of the mistakes and compromises that British Petroleum and its partners made during drilling and well completion, showing that the Macondo well disaster happened due to violations of various safety and development rules. This case study concludes that the Macondo well blowout disaster could have been avoided with proper management of personnel and communication between them, and that by following safety rules and laws, environmental damage could have been kept to a minimum. Keywords: energy, environment, oil and gas industry, Macondo well accident
Procedia PDF Downloads 185
415 Investigating the Relationship of Moral Hazard and Corporate Governance with Earning Forecast Quality in the Tehran Stock Exchange
Authors: Fatemeh Rouhi, Hadi Nassiri
Abstract:
Earning forecasts are a key element in economic decisions, but situations such as conflicts of interest in financial reporting, complexity, and lack of direct access to information have led to the phenomenon of information asymmetry between individuals within the organization and external investors and creditors. This asymmetry gives rise to adverse selection and moral hazard in investors' decisions and makes it difficult for users to assess the data directly. In this regard, the role of corporate governance disclosure is crystallized: it comprises controls and procedures that ensure management does not move in its own interest but in the direction of maximizing shareholder and company value. This study was therefore an attempt to establish the relationship of moral hazard and corporate governance with the earning forecast quality of companies operating in the capital market, and to assess their impact on the quality of the earnings forecasts those companies issue. Drawing on the theoretical basis of the research, two main hypotheses and sub-hypotheses are presented in this study, which have been examined on the basis of available models with the use of the panel data method; conclusions were drawn at the 95% assurance level according to the significance of the model and each independent variable. In examining the models, the Chow test was first used to specify whether the panel data method or the pooled method should be used; following that, the Hausman test was applied to choose between random effects and fixed effects. The findings show that, because most of the moral hazard variables are positively associated with earnings forecast quality, the earning forecast quality of companies listed on the Tehran Stock Exchange increases with increasing moral hazard.
Among the variables related to corporate governance, the board independence variable has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant. Keywords: corporate governance, earning forecast quality, moral hazard, financial sciences
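The pooled-versus-panel model-selection step the abstract describes can be illustrated with a Chow test: compare the residual sum of squares (RSS) of one pooled OLS fit against separate fits on the two sub-samples. This is a hedged sketch on synthetic data, not the authors' code or data:

```python
import numpy as np

# Chow test sketch: a large F statistic indicates a structural break
# between the groups, favoring separate (panel-style) models over pooling.
# The regression data below are synthetic, invented for the example.

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def chow_f(X1, y1, X2, y2):
    k = X1.shape[1]
    rss_pooled = rss(np.vstack([X1, X2]), np.concatenate([y1, y2]))
    rss_1, rss_2 = rss(X1, y1), rss(X2, y2)
    n = len(y1) + len(y2)
    return ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 1))
X = np.hstack([np.ones((50, 1)), x])
y1 = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=50)  # slope 2
y2 = X @ np.array([1.0, 5.0]) + rng.normal(scale=0.1, size=50)  # slope 5
f_stat = chow_f(X, y1, X, y2)   # large: the groups need separate models
```

The F statistic is then compared with the F(k, n − 2k) critical value; the Hausman test for fixed versus random effects follows the same fit-and-compare pattern but contrasts the two panel estimators' coefficient vectors.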
Procedia PDF Downloads 321
414 Genotyping and Phylogeny of Phaeomoniella Genus Associated with Grapevine Trunk Diseases in Algeria
Authors: A. Berraf-Tebbal, Z. Bouznad, A.J.L. Phillips
Abstract:
Phaeomoniella is a fungal genus in the mitosporic Ascomycota which includes the species Phaeomoniella chlamydospora, associated with two declining diseases of grapevine (Vitis vinifera), namely Petri disease and esca. Recent studies have shown that several Phaeomoniella species also cause disease on many other woody crops, such as forest trees and woody ornamentals. Two new species, Phaeomoniella zymoides and Phaeomoniella pinifoliorum H.B. Lee, J.Y. Park, R.C. Summerbell et H.S. Jung, were isolated from the needle surface of Pinus densiflora Sieb. et Zucc. in Korea. The identification of species in the genus Phaeomoniella can be a difficult task if based solely on morphological and cultural characters. In this respect, the application of molecular methods, particularly PCR-based techniques, may provide an important contribution. MSP-PCR (microsatellite-primed PCR) fingerprinting has proven useful in the molecular typing of fungal strains. The high discriminatory potential of this method is particularly useful when dealing with closely related or cryptic species. In the present study, PCR fingerprinting was performed using the microsatellite primer M13 for the purpose of species identification and strain typing of 84 Phaeomoniella-like isolates collected from grapevines with typical symptoms of dieback. The bands produced by the MSP-PCR profiles divided the strains into 3 clusters and 5 singletons at a reproducibility level of 80%. Representative isolates from each group and, when possible, isolates from Eutypa dieback and esca symptoms were selected for sequencing of the ITS region. The ITS sequences of the 16 isolates selected from the MSP-PCR profiles were combined and aligned with sequences of 18 isolates retrieved from GenBank, representing a selection of all known Phaeomoniella species. DNA sequences were compared with those available in GenBank using neighbor-joining (NJ) and maximum-parsimony (MP) analyses.
The phylogenetic trees of the ITS region revealed that the Phaeomoniella isolates clustered with Phaeomoniella chlamydospora reference sequences with a bootstrap support of 100%. The complexity of the vine trunk disease pathosystems clearly shows the need to identify the fungal component unambiguously in order to allow a better understanding of the etiology of these diseases and to justify the establishment of control strategies against these fungal agents. Keywords: genotyping, MSP-PCR, ITS, phylogeny, trunk diseases
Procedia PDF Downloads 476
413 Virtual Approach to Simulating Geotechnical Problems under Both Static and Dynamic Conditions
Authors: Varvara Roubtsova, Mohamed Chekired
Abstract:
Recent studies on the numerical simulation of geotechnical problems show the importance of considering the soil micro-structure. At this scale, soil is a discrete particle medium whose particles can interact with each other and with water flow under external forces, structural loads or natural events. This paper presents research conducted in a virtual laboratory named SiGran, developed at IREQ (Institut de recherche d’Hydro-Québec) for the purpose of investigating a broad range of problems encountered in geotechnics. Using the Discrete Element Method (DEM), SiGran simulates granular materials directly by applying Newton’s laws to each particle. The water flow is simulated by using the Marker and Cell (MAC) method to solve the full form of the Navier-Stokes equations for an incompressible viscous liquid. In this paper, examples of numerical simulations and their comparisons with real experiments have been selected to show the complexity of geotechnical research at the micro level. These examples describe transient flows into a porous medium, the interaction of particles in a viscous flow, the compacting of saturated and unsaturated soils, and the phenomenon of liquefaction under seismic load. They also provide an opportunity to present SiGran’s capacity to compute the distribution and evolution of energy by type (particle kinetic energy, particle internal elastic energy, energy dissipated by friction or as a result of viscous interaction with the flow, and so on). This work also includes first attempts to apply micro-scale discrete results at a macro continuum level, where the Smoothed Particle Hydrodynamics (SPH) method was used to solve the system of governing equations. The material behavior equation is based on the results of simulations carried out at the micro level. The possibility of combining the three methods (DEM, MAC and SPH) is discussed. Keywords: discrete element method, marker and cell method, numerical simulation, multi-scale simulations, smoothed particle hydrodynamics
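The DEM idea of applying Newton's laws directly to each particle can be shown in a minimal 1-D sketch: two equal particles interacting through a linear spring-dashpot contact law, integrated with a semi-implicit Euler scheme. This is illustrative only, with assumed parameter values, not the SiGran implementation:

```python
import numpy as np

# Minimal 1-D DEM sketch (assumed stiffness/damping values, not from the
# paper): two particles approach, collide through a spring-dashpot
# contact, and rebound with some energy dissipated in the dashpot.

def dem_two_particles(x, v, r=0.5, m=1.0, k=1e4, c=5.0, dt=1e-4, steps=20000):
    x, v = np.asarray(x, float), np.asarray(v, float)
    for _ in range(steps):
        overlap = 2.0 * r - (x[1] - x[0])        # > 0 while in contact
        f = k * overlap + c * (v[0] - v[1]) if overlap > 0 else 0.0
        a = np.array([-f, f]) / m                # equal and opposite forces
        v += a * dt                              # Newton's second law
        x += v * dt
    return x, v

# head-on collision: particles at 0 m and 2 m moving toward each other
x, v = dem_two_particles(x=[0.0, 2.0], v=[1.0, -1.0])
```

A production DEM code applies the same force-sum-and-integrate loop to millions of particles in 3-D with rotation, friction, and neighbor search; coupling to a MAC fluid grid adds a drag force term per particle.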
Procedia PDF Downloads 300
412 Building a Blockchain-based Internet of Things
Authors: Rob van den Dam
Abstract:
Today’s Internet of Things (IoT) comprises more than a billion intelligent devices, connected via wired/wireless communications. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the communications industry. Yet, we found that the IoT architectures and solutions that currently work for billions of devices won’t necessarily scale to tomorrow’s hundreds of billions of devices because of high cost, lack of privacy, lack of future-proofing, lack of functional value, and broken business models. As the IoT scales exponentially, decentralized networks have the potential to reduce infrastructure and maintenance costs for manufacturers. Decentralization also promises increased robustness by removing single points of failure that could exist in traditional centralized networks. By shifting power in the network from the center to the edges, devices gain greater autonomy and can become points of transactions and economic value creation for owners and users. To validate the underlying technology vision, IBM jointly developed with Samsung Electronics an autonomous decentralized peer-to-peer proof-of-concept (PoC). The primary objective of this PoC was to establish a foundation on which to demonstrate several capabilities that are fundamental to building a decentralized IoT. Though many commercial systems in the future will exist as hybrid centralized-decentralized models, the PoC demonstrated a fully distributed proof. The PoC (a) validated the future vision for decentralized systems to extensively augment today’s centralized solutions, (b) demonstrated foundational IoT tasks without the use of centralized control, and (c) proved that empowered devices can engage autonomously in marketplace transactions.
The PoC opens the door for the communications and electronics industry to further explore the challenges and opportunities of potential hybrid models that can address the complexity and variety of requirements posed by an internet that continues to scale. Contents: (a) the new approach for an IoT that will be secure and scalable, (b) the three foundational technologies that are key for the future IoT, (c) the related business models and user experiences, (d) how such an IoT will create an 'Economy of Things', (e) the role of users, devices, and industries in the IoT future, (f) the winners in the IoT economy. Keywords: IoT, internet, wired, wireless
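The tamper-evidence property a blockchain lends to device transactions can be shown with a toy hash-chained ledger. This is illustrative only: the IBM/Samsung PoC is a full peer-to-peer stack, and the device and event names below are invented for the example:

```python
import hashlib
import json

# Toy hash-chained ledger of device "transactions". Each block commits to
# its data and to the previous block's hash, so altering any block breaks
# the chain's verifiability.

def block_hash(data, prev_hash):
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev": prev_hash,
            "hash": block_hash(data, prev_hash)}

def valid_chain(chain):
    links_ok = all(c["prev"] == p["hash"] for p, c in zip(chain, chain[1:]))
    hashes_ok = all(b["hash"] == block_hash(b["data"], b["prev"])
                    for b in chain)
    return links_ok and hashes_ok

chain = [make_block({"device": "washer-01", "event": "init"}, "0" * 64)]
chain.append(make_block({"device": "washer-01", "event": "order_supplies"},
                        chain[-1]["hash"]))
```

A real decentralized IoT adds consensus among peers over which chain is authoritative; the hash linking above is only the integrity layer.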
Procedia PDF Downloads 335
411 Characterization of Complex Gold Ores for Preliminary Process Selection: The Case of Kapanda, Ibindi, Mawemeru, and Itumbi in Tanzania
Authors: Sospeter P. Maganga, Alphonce Wikedzi, Mussa D. Budeba, Samwel V. Manyele
Abstract:
This study characterizes complex gold ores (elemental and mineralogical composition, gold distribution, ore grindability, and mineral liberation) for preliminary process selection. About 200 kg of ore samples were collected from each location using systematic sampling by mass interval. Ores were dried, crushed, milled, and split into representative sub-samples (about 1 kg) for elemental and mineralogical composition analyses using X-ray fluorescence (XRF), fire assay finished with Atomic Absorption Spectrometer (AAS), and X-ray Diffraction (XRD) methods, respectively. The gold distribution was studied on size-by-size fractions, while ore grindability was determined using the standard Bond test. The mineral liberation analysis was conducted using ThermoFisher Scientific Mineral Liberation Analyzer (MLA) 650, where unsieved polished grain mounts (80% passing 700 µm) were used as MLA feed. Two MLA measurement modes, X-ray modal analysis (XMOD) and sparse phase liberation-grain X-ray mapping analysis (SPL-GXMAP), were employed. At least two cyanide consumers (Cu, Fe, Pb, and Zn) and kinetics impeders (Mn, S, As, and Bi) were present in all locations investigated. Copper content at Kapanda (0.77% Cu) and Ibindi (7.48% Cu) exceeded the recommended threshold of 0.5% Cu for direct cyanidation. The gold ore at Ibindi indicated a higher rate of grinding compared to other locations. This could be explained by the highest grindability (2.119 g/rev.) and lowest Bond work index (10.213 kWh/t) values. The pyrite-marcasite, chalcopyrite, galena, and siderite were identified as major gold, copper, lead, and iron-bearing minerals, respectively, with potential for economic extraction. However, only gold and copper can be recovered under conventional milling because of grain size issues (galena is exposed by 10%) and process complexity (difficult to concentrate and smelt iron from siderite). 
Therefore, the preliminary process selection is copper flotation followed by gold cyanidation for the Kapanda and Ibindi ores, whereas gold cyanidation with additives such as glycine or ammonia is selected for the Mawemeru and Itumbi ores because of the low concentrations of Cu, Pb, Fe, and Zn minerals. Keywords: complex gold ores, mineral liberation, ore characterization, ore grindability
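The Bond work index quoted above (10.213 kWh/t for the Ibindi ore) plugs into Bond's standard third-law equation for specific grinding energy, W = 10·Wi·(1/√P80 − 1/√F80) with the 80%-passing sizes in micrometres. A small sketch; the F80/P80 values below are assumptions for illustration, not data from the study:

```python
from math import sqrt

# Bond's third law of comminution: specific energy (kWh/t) needed to
# grind a feed with 80%-passing size F80 down to a product P80.

def bond_energy(wi_kwh_t, f80_um, p80_um):
    """Wi in kWh/t; F80 and P80 in micrometres; returns kWh/t."""
    return 10.0 * wi_kwh_t * (1.0 / sqrt(p80_um) - 1.0 / sqrt(f80_um))

# e.g. the Ibindi ore (Wi = 10.213 kWh/t) ground from an assumed
# F80 = 2000 um down to P80 = 106 um: roughly 7.6 kWh/t
w = bond_energy(10.213, f80_um=2000.0, p80_um=106.0)
```

A lower Wi (higher grindability, as reported for Ibindi) translates directly into lower mill energy per tonne for the same size reduction.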
Procedia PDF Downloads 72
410 National Assessment for Schools in Saudi Arabia: Score Reliability and Plausible Values
Authors: Dimiter M. Dimitrov, Abdullah Sadaawi
Abstract:
The National Assessment for Schools (NAFS) in Saudi Arabia consists of standardized tests in Mathematics, Reading, and Science for school grade levels 3, 6, and 9. One main goal is to classify students into four categories of NAFS performance (minimal, basic, proficient, and advanced) by school and for the entire national sample. The NAFS scoring and equating are performed on a bounded scale (D-scale, ranging from 0 to 1) in the framework of the recently developed “D-scoring method of measurement.” The specificity of the NAFS measurement framework and the complexity of the data presented both challenges and opportunities for (a) the estimation of score reliability for schools, (b) setting cut-scores for the classification of students into categories of performance, and (c) generating plausible values for distributions of student performance on the D-scale. The estimation of score reliability at the school level was performed in the framework of generalizability theory (GT), with students “nested” within schools and test items “nested” within test forms. The GT design was executed via multilevel modeling syntax code in R. Cut-scores (on the D-scale) for the classification of students into performance categories were derived via a recently developed method of standard setting referred to as the “Response Vector for Mastery” (RVM) method. For each school, the classification of students into categories of NAFS performance was based on distributions of plausible values of the students’ scores on the NAFS tests by grade level (3, 6, and 9) and subject (Mathematics, Reading, and Science). Plausible values (on the D-scale) for each individual student were generated via random selection from a statistical logit-normal distribution with parameters derived from the student’s D-score and its conditional standard error, SE(D).
All procedures related to D-scoring, equating, generating plausible values, and classifying students into performance levels were executed via a computer program in R developed for the purpose of NAFS data analysis. Keywords: large-scale assessment, reliability, generalizability theory, plausible values
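The logit-normal draw described above can be sketched as follows. This is a hedged illustration, not the NAFS program (which is in R), and the delta-method conversion of SE(D) to a logit-scale spread is an assumption made for the example:

```python
import math
import random

# Sketch of drawing plausible values on the bounded (0, 1) D-scale:
# sample on the logit scale around logit(D), then map back through the
# inverse logit so every draw stays inside (0, 1) by construction.

def plausible_values(d, se_d, n=5, seed=42):
    rng = random.Random(seed)
    mu = math.log(d / (1.0 - d))            # logit of the D-score
    sigma = se_d / (d * (1.0 - d))          # assumed delta-method spread
    draws = (mu + rng.gauss(0.0, sigma) for _ in range(n))
    return [1.0 / (1.0 + math.exp(-z)) for z in draws]

# five plausible values for a student with D = 0.62, SE(D) = 0.04
pvs = plausible_values(d=0.62, se_d=0.04)
```

Pooling such draws over all students in a school yields the score distribution against which the RVM cut-scores classify performance.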
Procedia PDF Downloads 17
409 TEA and Its Working Methodology in the Biomass Estimation of Poplar Species
Authors: Pratima Poudel, Austin Himes, Heidi Renninger, Eric McConnel
Abstract:
Populus spp. (poplar) are the fastest-growing trees in North America, making them ideal for a range of applications, as they can achieve high yields on short rotations and regenerate by coppice. Furthermore, poplar undergoes biochemical conversion to fuels without complexity, making it one of the most promising purpose-grown, woody perennial energy sources. Employing wood-based biomass for bioenergy offers numerous benefits, including reduced greenhouse gas (GHG) emissions compared to non-renewable traditional fuels, the preservation of robust forest ecosystems, and economic prospects for rural communities. In order to gain a better understanding of the potential use of poplar as a biomass feedstock for biofuel in the southeastern US, we conducted a techno-economic assessment (TEA). This assessment is an analytical approach that integrates the technical and economic factors of a production system to evaluate its economic viability. The TEA specifically focused on a short-rotation coppice system employing a single-pass cut-and-chip harvesting method for poplar. It encompassed all the costs associated with establishing dedicated poplar plantations, including land rent, site preparation, planting, fertilizers, and herbicides. Additionally, we performed a sensitivity analysis to evaluate how different costs can affect the economic performance of the poplar cropping system. This analysis aimed to determine the minimum average delivered selling price for one metric ton of biomass necessary to achieve a desired rate of return over the cropping period. To inform the TEA, data on establishment, crop care activities, and crop yields were derived from a field study conducted at the Mississippi Agricultural and Forestry Experiment Station's Bearden Dairy Research Center in Oktibbeha County and the Pontotoc Ridge-Flatwood Branch Experiment Station in Pontotoc County. Keywords: biomass, populus species, sensitivity analysis, technoeconomic analysis
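The minimum-selling-price calculation at the heart of such a TEA can be sketched as the delivered price per metric ton at which discounted revenue exactly covers discounted costs at the target rate of return. All cost, yield, and rate figures below are placeholders for illustration, not results from the field study:

```python
# Hedged TEA sketch: discount every year's cost and harvested tonnage
# back to year 0, then take the ratio. At this price the project's net
# present value is zero, i.e. the grower earns exactly the target return.

def min_selling_price(costs_by_year, yield_t_by_year, rate):
    disc_cost = sum(c / (1.0 + rate) ** t
                    for t, c in enumerate(costs_by_year))
    disc_yield = sum(q / (1.0 + rate) ** t
                     for t, q in enumerate(yield_t_by_year))
    return disc_cost / disc_yield   # currency units per metric ton

# a hypothetical 4-year coppice cycle: establishment cost up front,
# maintenance in between, one 20 t harvest in year 3, 8% target return
price = min_selling_price([900.0, 100.0, 100.0, 150.0],
                          [0.0, 0.0, 0.0, 20.0], rate=0.08)
```

The sensitivity analysis then reruns this calculation while varying one input at a time (land rent, yield, discount rate) to see which assumption moves the price most.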
Procedia PDF Downloads 82
408 Relationships Between the Petrophysical and Mechanical Properties of Rocks and Shear Wave Velocity
Authors: Anamika Sahu
Abstract:
The Himalayas, like many mountainous regions, are susceptible to multiple hazards, and the frequency of such disasters has been increasing with extreme weather phenomena. These natural hazards are responsible for irreparable human and economic losses. The Indian Himalayas have repeatedly been ruptured by great earthquakes in the past and, lying within a seismic gap, have the potential for a future large seismic event. Damage caused by earthquakes differs from locality to locality. It is well known that, during earthquakes, damage to structures is associated with subsurface conditions and the quality of construction materials. For sustainable mountain development, prior site characterization is therefore valuable for planning and construction and for efficient mitigation of seismic risk. Both geotechnical and geophysical investigations of the subsurface are required to describe its complexity. In mountainous regions, geophysical methods are gaining popularity because areas can be studied without disturbing the ground surface and because these methods are time- and cost-effective. The MASW method is used to calculate Vs30, the average shear wave velocity of the top 30 m of soil. Shear wave velocity is considered the best stiffness indicator, and the average shear wave velocity down to 30 m is used in the National Earthquake Hazards Reduction Program (NEHRP) provisions (BSSC, 1994) and the Uniform Building Code (UBC, 1997) site classifications. Parameters obtained through geotechnical investigation were integrated with the findings of the subsurface geophysical survey. Joint interpretation was used to establish inter-relationships among mineral constituents, various textural parameters, and unconfined compressive strength (UCS) with shear wave velocity. Results obtained through the MASW method fitted well with the laboratory tests. In both conditions, mineral constituents and textural parameters (grain size, grain shape, grain orientation, and degree of interlocking) control the petrophysical and mechanical properties of the rocks and the behavior of shear wave velocity.
Keywords: MASW, mechanical, petrophysical, site characterization
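The Vs30 used in the NEHRP and UBC classifications is a time-averaged velocity (total depth divided by vertical shear wave travel time), not an arithmetic mean of layer velocities. A minimal sketch of the standard calculation follows; the layer values are illustrative assumptions, not data from this study:

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear wave velocity over the top 30 m:
    Vs30 = 30 / sum(h_i / v_i), where h_i are layer thicknesses
    (summing to 30 m) and v_i the layer shear wave velocities."""
    if abs(sum(thicknesses_m) - 30.0) > 1e-9:
        raise ValueError("layer thicknesses must sum to 30 m")
    travel_time = sum(h / v for h, v in zip(thicknesses_m, velocities_mps))
    return 30.0 / travel_time

# Two hypothetical layers: 15 m of soft soil over 15 m of stiffer material.
site_vs30 = vs30([15.0, 15.0], [200.0, 600.0])
```

Note that the slow upper layer dominates: this profile averages to 300 m/s (NEHRP site class D, 180-360 m/s), well below the 400 m/s arithmetic mean.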
Procedia PDF Downloads 83407 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
In biomedical research and randomized clinical trials, the outcomes of most interest are time-to-event, so-called survival, data. Robust models matter in this context because they compare the effects of randomly controlled experimental groups in a way that carries a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Because of its high flexibility and robustness, the semiparametric transformation model applied in this paper has received considerable attention for estimating causal effects when modeling left-truncated and right-censored survival data. Despite its wide application and popularity, the maximum likelihood estimation technique is complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we proposed modified estimating equations. After outlining the estimation procedures, the consistency and asymptotic properties of the estimators were derived, and the finite-sample performance of the proposed model was illustrated via simulation studies and the Stanford heart transplant data. The bias due to truncation was adjusted by estimating the density function of the truncation variable, which was also incorporated into the model as a covariate in order to relax the assumption of independence between failure time and truncation time. Moreover, an expectation-maximization (EM) algorithm was described for the iterative estimation of the unknown parameters and the unspecified transformation function. In addition, the causal effect was derived as the ratio of the cumulative hazard functions of the active and passive (control) experiments, after adjusting for the bias introduced into the model by the truncation variable.
Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate
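The final step, a causal contrast expressed as a ratio of cumulative hazards, can be illustrated with the simple nonparametric Nelson-Aalen estimate of a cumulative hazard function. This sketch deliberately ignores the paper's left-truncation adjustment and time-varying covariates and is not the authors' estimator:

```python
def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard: at each distinct event time s,
    add d_s / n_s, where d_s is the number of events at s and n_s the
    number still at risk. events[i] is 1 for an event, 0 for censoring."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk, H, curve, i = len(times), 0.0, [], 0
    while i < len(order):
        t = times[order[i]]
        d = removed = 0
        while i < len(order) and times[order[i]] == t:  # group tied times
            d += events[order[i]]
            removed += 1
            i += 1
        if d:
            H += d / n_at_risk
            curve.append((t, H))
        n_at_risk -= removed
    return curve
```

Applied separately to the treated and control samples, the causal contrast is then the ratio H_treated(t) / H_control(t) at a time of interest.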
Procedia PDF Downloads 123406 Rheometer Enabled Study of Tissue/biomaterial Frequency-Dependent Properties
Authors: Polina Prokopovich
Abstract:
Despite the well-established dependence of cartilage mechanical properties on the frequency of the applied load, most research in the field is carried out in either load-free or constant-load conditions because of the complexity of the equipment required to determine time-dependent properties. These simpler analyses provide a limited representation of cartilage properties, greatly reducing the impact of the information gathered and hindering understanding of the mechanisms involved in this tissue's replacement, development, and pathology. More complex techniques could be better investigative methods, but their uptake in cartilage research is limited by the highly specialised training required and the cost of the equipment. There is, therefore, a clear need for alternative experimental approaches to cartilage testing that can be deployed in research and clinical settings using more user-friendly and financially accessible devices. Frequency-dependent material properties can be determined through rheometry, which is easy to use and requires a relatively inexpensive device; we present how a commercial rheometer can be adapted to determine the viscoelastic properties of articular cartilage. Frequency-sweep tests were run at various applied normal loads on immature, mature, and trypsinised (as a model of osteoarthritis) cartilage samples to determine the dynamic shear moduli (G*, G′, G″) of the tissues. Moduli increased with increasing frequency and applied load; mature cartilage generally had the highest moduli and GAG-depleted samples the lowest. Hydraulic permeability (KH) was estimated from the rheological data and decreased with applied load; GAG-depleted cartilage exhibited higher hydraulic permeability than either immature or mature tissue. The rheometer-based methodology was validated by the close agreement of the rheometer-obtained cartilage characteristics (G*, G′, G″, KH) with results obtained using the more complex testing techniques available in the literature. Rheometry is comparatively simple, does not require capital-intensive machinery, and staff training is more accessible; the use of a rheometer would therefore represent a cost-effective approach for determining the frequency-dependent properties of cartilage, yielding more comprehensive and impactful results for both healthcare professionals and R&D.
Keywords: tissue, rheometer, biomaterial, cartilage
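The dynamic shear moduli reported from such a frequency sweep are related in a standard way: the storage modulus G′ (elastic part), the loss modulus G″ (viscous part), and the complex modulus magnitude |G*| = sqrt(G′² + G″²). A minimal sketch of those relations follows; the numerical values are illustrative, not measurements from this work:

```python
import math

def complex_modulus(g_storage, g_loss):
    """From an oscillatory frequency-sweep test: the complex shear
    modulus magnitude |G*| = sqrt(G'^2 + G''^2) and the loss tangent
    tan(delta) = G''/G' (ratio of dissipated to stored energy)."""
    return math.hypot(g_storage, g_loss), g_loss / g_storage

# Hypothetical cartilage-like moduli (MPa) at a single frequency.
g_star, tan_delta = complex_modulus(1.2, 0.3)
```

A loss tangent well below 1, as in this hypothetical pair, indicates predominantly elastic behaviour at that frequency.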
Procedia PDF Downloads 79405 Control of Lymphatic Remodelling by miR-132
Authors: Valeria Arcucci, Musarat Ishaq, Steven A. Stacker, Greg J. Goodall, Marc G. Achen
Abstract:
Metastasis is the lethal aspect of cancer for most patients. Remodelling of the lymphatic vessels associated with a tumour is a key initial step in metastasis because it facilitates the entry of cancer cells into the lymphatic vasculature and their spread to lymph nodes and distant organs. Although it is clear that vascular endothelial growth factors (VEGFs), such as VEGF-C and VEGF-D, are key drivers of lymphatic remodelling, the means by which the many signaling pathways in endothelial cells are coordinately regulated to drive the growth and remodelling of lymphatics in cancer is not understood. We seek to understand the broader molecular mechanisms that control cancer metastasis and are focusing on microRNAs, which coordinately regulate the signaling pathways involved in complex biological responses in health and disease. Here, using small RNA sequencing, we found that a specific microRNA, miR-132, is upregulated in lymphatic endothelial cells (LECs) in response to lymphangiogenic growth factors. Interestingly, ectopic expression of miR-132 in LECs in vitro stimulated proliferation and tube formation of these cells. Moreover, miR-132 is expressed in the lymphatic vessels of a subset of human breast tumours previously found, by immunohistochemical analysis of tumour tissue microarrays, to express high levels of VEGF-D. To dissect the complexity of regulation by miR-132 in lymphatic biology, we performed Argonaute HITS-CLIP, which allowed us to identify the miR-132-mRNA interactome in LECs. We found that this microRNA controls many different pathways in LECs, mainly those involved in cell proliferation and in the regulation of the extracellular matrix and cell-cell junctions. We are now exploring the functional significance of miR-132 targets in the biology of LECs using biochemical techniques, functional in vitro cell assays, and in vivo lymphangiogenesis assays. This project will ultimately define the molecular regulation of lymphatic remodelling by miR-132 and thereby identify potential therapeutic targets for drugs designed to restrict the growth and remodelling of tumour lymphatics that result in metastatic spread.
Keywords: Argonaute HITS-CLIP, cancer, lymphatic remodelling, miR-132, VEGF
Procedia PDF Downloads 126