32 Study of the Diaphragm Flexibility Effect on the Inelastic Seismic Response of Thin Wall Reinforced Concrete Buildings (TWRCB): A Purpose to Reduce the Uncertainty in the Vulnerability Estimation
Authors: A. Zapata, Orlando Arroyo, R. Bonett
Abstract:
Over the last two decades, the growing demand for housing in Latin American countries has led to the development of construction projects based on low- and medium-rise buildings with thin reinforced concrete walls. This system, known as Thin Wall Reinforced Concrete Buildings (TWRCB), uses walls with thicknesses from 100 to 150 millimetres, with flexural reinforcement formed by welded wire mesh (WWM) with diameters between 5 and 7 millimetres, arranged in one or two layers. These walls often have irregular structural configurations, including combinations of rectangular shapes. Experimental and numerical research conducted in regions where this structural system is commonplace indicates inherent weaknesses, such as limited ductility due to the WWM reinforcement and the thin element dimensions. Because of the system's complexity, numerical analyses have relied on two-dimensional models that do not explicitly account for the floor system, even though it plays a crucial role in distributing seismic forces among the resisting elements; instead, these analyses assume a rigid diaphragm. To study the effect of diaphragm flexibility, two case-study buildings were selected: a low-rise and a mid-rise building representative of TWRCB in Colombia. The buildings were analyzed in OpenSees using MVLEM-3D elements for the walls and shell elements for the slabs, so that the coupling effect of the diaphragm is included in the nonlinear behaviour. Three cases are considered: a) models without a slab, b) models with rigid slabs, and c) models with flexible slabs. Incremental static (pushover) and nonlinear dynamic analyses were carried out using the set of 44 far-field ground motions of FEMA P-695, scaled by factors of 1.0 and 1.5 to assess the probability of collapse at the design basis earthquake (DBE) and maximum considered earthquake (MCE) levels, according to the locations and hazard zones of the archetypes in the Colombian NSR-10.
Base shear capacity, maximum roof displacement, individual wall base shear demands, and probabilities of collapse were calculated to evaluate the effect of absent, rigid, and flexible slabs on the nonlinear behaviour of the archetype buildings. The pushover results show that the buildings exhibit an overstrength between 1.1 and 2 when the slab is considered explicitly, depending on the plan configuration of the structural walls; additionally, the nonlinear behaviour is more conservative when no slab is modelled than when the slab is represented. Including the flexible slab in the analysis highlights the importance of accounting for the slab contribution to the distribution of shear forces among structural elements according to their design resistance and rigidity. The dynamic analysis revealed that including the slab reduces the collapse probability of this system because displacements and deformations are lower, enhancing the safety of residents and the seismic performance. Including the slab in the model is therefore important to capture the real effect of coupling on the distribution of shear forces among walls, to estimate the correct nonlinear behaviour of this system, and to proportion the resistance and rigidity of the elements in design so as to reduce the possibility of damage during an earthquake.
Keywords: thin wall reinforced concrete buildings, coupling slab, rigid diaphragm, flexible diaphragm
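The overstrength factor reported in the pushover results is the ratio of the maximum base shear sustained by the structure to its design base shear. A minimal sketch of that calculation, using entirely hypothetical pushover-curve values (the displacements, shears, and design base shear below are illustrative, not values from the study):

```python
import numpy as np

# Hypothetical pushover curve: roof displacement (mm) vs. base shear (kN)
displacement = np.array([0, 10, 20, 40, 60, 80, 100])
base_shear = np.array([0, 800, 1400, 1900, 2100, 2050, 1900])

V_design = 1200.0          # assumed design base shear (kN)
V_max = base_shear.max()   # peak capacity from the pushover curve

overstrength = V_max / V_design
print(f"Overstrength factor: {overstrength:.2f}")
```

With these placeholder numbers the factor is 2100/1200 = 1.75, inside the 1.1-2 range the study reports for models with an explicit slab.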
Procedia PDF Downloads 72
31 Official Game Account Analysis: Factors Influencing Users' Judgments in Limited-Word Posts
Authors: Shanhua Hu
Abstract:
Social media, as a critical form of promotion for films, video games, and digital products, has received substantial research attention, but several critical barriers remain: (1) few studies explore the internal and external connections of a product as part of the multimodal context that gives rise to readability and commercial return; (2) multimodal analysis of game publishers' official product accounts and their impact on user behaviors, including purchase intention, social media engagement, and playing time, is lacking; (3) no standardized, ecologically valid data varying by game type are available to study the complexity of an official account's postings within a time period. This proposed research helps to tackle these limitations in order to develop a model of readability study that is more ecologically valid, robust, and thorough. To accomplish this objective, this paper provides a more diverse dataset comprising different visual elements and messages collected from the official Twitter accounts of the top 20 best-selling games of 2021. Video game companies target potential users through social media; a popular approach is to set up an official account to maintain exposure. Typically, major game publishers create an official account on Twitter months before the game's release date to post updates on the game's development, announce collaborations, and reveal spoilers. Analyses of tweets from these official Twitter accounts would assist publishers and marketers in identifying how to efficiently and precisely deploy advertising to increase game sales. The purpose of this research is to determine how official game accounts use Twitter to attract new customers, and specifically which types of messages are most effective at increasing sales.
The dataset includes the number of days until the actual release date for each Twitter post, the readability of the post (Flesch Reading Ease Score, FRES), the number of emojis used, the number of hashtags, the number of followers of the mentioned users, the categorization of the posts (i.e., spoilers, collaborations, promotions), and the number of video views. The timeline of Twitter postings from official accounts will be compared to the history of pre-orders and sales figures to determine the potential impact of social media posts. This study aims to determine how the above-mentioned characteristics of official accounts' Twitter postings influence the sales of the game and to examine the possible causes of this influence. The outcome will provide researchers with a list of potential aspects that could influence people's judgments in limited-word posts. With increased average time spent online, users adapt more quickly than before to online information exchange and reading habits, such as word choice, sentence length, and the use of emojis or hashtags. The study of the promotion of official game accounts will not only enable publishers to create more effective promotion techniques in the future but also provide ideas for future research on the influence of social media posts with a limited number of words on consumers' purchasing decisions. Future research can focus on more specific linguistic aspects, such as precise word choice in advertising.
Keywords: engagement, official account, promotion, twitter, video game
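The Flesch Reading Ease Score used in the dataset is computed as 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). A rough sketch of scoring a short post, using a naive vowel-group syllable counter (production implementations use dictionary-based syllabification; the example tweet is invented):

```python
import re

def count_syllables(word):
    # Naive heuristic: one syllable per run of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = len(words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

tweet = "Pre-orders are live now. Play the beta this weekend!"
print(f"FRES = {flesch_reading_ease(tweet):.1f}")
```

Higher scores indicate easier text, which is one way short promotional posts can be compared across accounts.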
Procedia PDF Downloads 74
30 An Analysis of Economic Drivers and Technical Challenges for Large-Scale Biohydrogen Deployment
Authors: Rouzbeh Jafari, Joe Nava
Abstract:
This study includes learnings from engineering practice normally performed on large-scale biohydrogen processes. If scale-up is done properly, biohydrogen can be a reliable pathway for biowaste valorization. Most studies on biohydrogen process development have used model feedstock to investigate process key performance indicators (KPIs). This study does not intend to compare different technologies on model feedstock; rather, it reports economic drivers and technical challenges that help in developing a road map for expanding biohydrogen deployment in Canada. BBA is a consulting firm responsible for the design of hydrogen production projects. Through executing these projects, work has been performed to identify, register, and mitigate technical drawbacks of large-scale hydrogen production. Those learnings have been applied in this study to the biohydrogen process. Using data collected through a comprehensive literature review, a base case was established as a reference and several case studies were performed. Critical parameters of the process were identified, and through common engineering practice (process design, simulation, cost estimation, and life cycle assessment) the impact of these parameters on the commercialization risk matrix and class 5 cost estimates was reported. The process considered in this study is dark fermentation of food waste and woody biomass. To propose a reliable road map for developing a sustainable biohydrogen production process, the impact of critical parameters was studied on the end-to-end process. These parameters were 1) feedstock composition, 2) feedstock pre-treatment, 3) unit operation selection, and 4) the multi-product concept. Several emerging technologies were also assessed, such as photo-fermentation, integrated dark fermentation, and the use of ultrasound and microwaves to break down the feedstock's complex matrix and increase overall hydrogen yield.
To properly report the impact of each parameter, KPIs were identified as 1) hydrogen yield, 2) energy consumption, 3) secondary waste generated, 4) CO2 footprint, 5) product profile, 6) $/kg-H2, and 7) environmental impact. The feedstock is the main parameter defining the economic viability of biohydrogen production. Through parametric studies, it was found that biohydrogen production favors feedstocks with higher carbohydrate content. The feedstock composition was varied by increasing one critical element (such as carbohydrate) and monitoring the evolution of the KPIs. Different cases were studied with diverse feedstocks, such as energy crops, wastewater sludge, and lignocellulosic waste. The base case process was applied to obtain reference KPI values, and modifications such as pretreatment and feedstock mix-and-match were implemented to investigate KPI changes. The complexity of the feedstock is the main bottleneck for successful commercial deployment of the biohydrogen process as a reliable pathway for waste valorization. Hydrogen yield, reaction kinetics, and the performance of key unit operations are highly impacted as feedstock composition fluctuates during the lifetime of the process or from one case to another. In this context, the multi-product concept becomes more reliable: the process is not designed to produce only one target product, such as biohydrogen, but two or more products (biohydrogen and biomethane, or biochemicals). This new approach is being investigated by the BBA team, and the results will be shared in another scientific contribution.
Keywords: biohydrogen, process scale-up, economic evaluation, commercialization uncertainties, hydrogen economy
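The $/kg-H2 KPI is typically a levelized cost: annualized capital cost plus operating and feedstock costs, divided by annual hydrogen output. A minimal sketch with entirely assumed numbers (the capex, discount rate, lifetime, and throughput below are placeholders, not figures from the study):

```python
# Hypothetical annualized cost model for the $/kg-H2 KPI
capex = 50_000_000.0         # capital cost, USD (assumed)
rate, years = 0.08, 20       # discount rate and plant life (assumed)
crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)  # capital recovery factor

opex = 3_000_000.0           # annual operating cost, USD/yr (assumed)
feedstock_cost = 1_500_000.0 # annual feedstock cost, USD/yr (assumed)
h2_per_year = 2_000_000.0    # hydrogen output, kg H2/yr (assumed)

lcoh = (capex * crf + opex + feedstock_cost) / h2_per_year
print(f"Levelized cost: {lcoh:.2f} $/kg-H2")
```

A parametric study of the kind described above would sweep the assumed inputs (e.g., feedstock cost or yield) and track how the levelized cost responds.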
Procedia PDF Downloads 108
29 Redox-labeled Electrochemical Aptasensor Array for Single-cell Detection
Authors: Shuo Li, Yannick Coffinier, Chann Lagadec, Fabrizio Cleri, Katsuhiko Nishiguchi, Akira Fujiwara, Soo Hyeon Kim, Nicolas Clément
Abstract:
The need for single-cell detection and analysis techniques has increased in the past decades because of the heterogeneity of individual living cells, which increases the complexity of the pathogenesis of malignant tumors. In the search for early cancer detection and high-precision medicine and therapy, the technologies most used today for sensitive detection of target analytes and monitoring of their variation mainly include two types. One is based on the identification of molecular differences at the single-cell level, such as flow cytometry, fluorescence-activated cell sorting, next-generation proteomics, and lipidomic studies; the other is based on capturing or detecting single tumor cells from fresh or fixed primary tumors and metastatic tissues, or rare circulating tumor cells (CTCs) from blood or bone marrow, for example by dielectrophoresis, microfluidic micropost-based chips, or electrochemical (EC) approaches. Compared to other methods, EC sensors have the merits of easy operation, high sensitivity, and portability. However, despite various demonstrations of low limits of detection (LOD), including aptamer sensors, arrayed EC sensors for detecting single cells have not been demonstrated. This work introduces a new technique based on a 20-nm-thick nanopillar array that supports cells and keeps them at the ideal recognition distance for redox-labeled aptamers grafted on the surface. The key advantages of this technology are not only to suppress the false positive signal arising from the pressure exerted by all (including non-target) cells pushing down on the aptamers, but also to stabilize the aptamer in the ideal hairpin configuration thanks to a confinement effect. With the first implementation of this technique, a LOD of 13 cells (with 5.4 μL of cell suspension) was estimated.
Subsequently, the nanosupported cell technology using redox-labeled aptasensors has been pushed forward and fully integrated into a single-cell electrochemical aptasensor array. To reach this goal, the LOD was reduced by more than one order of magnitude by suppressing parasitic capacitive electrochemical signals, minimizing the sensor area, and localizing the cells. Statistical analysis at the single-cell level is demonstrated for the recognition of cancer cells. The future of this technology is discussed, and the potential for scaling over millions of electrodes, thus pushing integration further to the sub-cellular level, is highlighted. Despite several demonstrations of electrochemical devices with a LOD of 1 cell/mL, the implementation of single-cell bioelectrochemical sensor arrays has remained elusive due to their challenging implementation at large scale. Here, the introduced nanopillar array technology combined with redox-labeled aptamers targeting the epithelial cell adhesion molecule (EpCAM) is perfectly suited for such implementation. By combining nanopillar arrays with microwells designed for single-cell trapping directly on the sensor surface, single target cells are successfully detected and analyzed. This first implementation of a single-cell electrochemical aptasensor array based on Brownian-fluctuating redox species opens new opportunities for large-scale implementation and statistical analysis of early cancer diagnosis and cancer therapy in clinical settings.
Keywords: bioelectrochemistry, aptasensors, single-cell, nanopillars
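For context, a common way to estimate a limit of detection from a calibration curve is LOD = 3.3·σ_blank/slope. A sketch with invented calibration data (not the paper's measurements), chosen so the result lands near the reported ~13-cell LOD:

```python
import numpy as np

# Hypothetical calibration: cell count vs. electrochemical signal (nA)
cells  = np.array([0, 10, 25, 50, 100])
signal = np.array([0.5, 2.6, 5.8, 11.2, 21.9])

# Linear least-squares fit of the calibration curve
slope, intercept = np.polyfit(cells, signal, 1)
sigma_blank = 0.85  # assumed std. dev. of blank measurements (nA)

lod = 3.3 * sigma_blank / slope  # common 3.3*sigma/slope criterion
print(f"LOD ≈ {lod:.1f} cells")
```

The 3.3σ/slope criterion is one standard convention; actual sensor LODs depend on how blank noise and sensitivity are characterized.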
Procedia PDF Downloads 114
28 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task
Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes
Abstract:
For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests is available. In some of these tests, linguistic processing - both oral and written - is an important factor. Language disturbances might serve as a strong indicator of an underlying neurodegenerative disorder like AD. However, the current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, our aim is to describe and test differences between cognitively healthy and impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables are mainly related to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). A questionnaire was also used to collect socio-demographic information (age, gender, education) about the subjects as well as details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks, participants had to copy (type) characters, words, and sentences from a screen, whereas the picture description tasks each consisted of an image they had to describe in a few sentences.
Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analysis already showed some promising results concerning pause times before, within, and after words. For all variables, mixed-effects models were used that included participants as a random effect and MMSE scores, GDS scores, and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between participant group (cognitively impaired or healthy elderly) and word category. However, pause times within words did show an interaction effect, which indicates that pause times within certain word categories differ significantly between patients and healthy elderly.
Keywords: Alzheimer's disease, keystroke logging, matching, writing process
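As a simplified illustration of the kind of group comparison underlying these results (the study itself used mixed-effects models with participant as a random effect; here, simulated pause times and a plain Welch t statistic stand in for that analysis):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated pause times before words (ms); pause durations are roughly
# log-normal, and the impaired group is given a longer typical pause
healthy  = rng.lognormal(mean=6.0, sigma=0.4, size=200)
impaired = rng.lognormal(mean=6.4, sigma=0.4, size=200)

# Welch's t statistic for unequal-variance group comparison
m1, m2 = healthy.mean(), impaired.mean()
v1, v2 = healthy.var(ddof=1), impaired.var(ddof=1)
t = (m2 - m1) / np.sqrt(v1 / len(healthy) + v2 / len(impaired))
print(f"impaired mean {m2:.0f} ms vs healthy {m1:.0f} ms, t = {t:.1f}")
```

A full replication would fit a mixed-effects model (pause time ~ group × word category, with a per-participant random intercept) rather than pooling observations as done here.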
Procedia PDF Downloads 365
27 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain
Authors: Zachary Blanks, Solomon Sonya
Abstract:
Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task of adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable by the user to help in learning how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area, etc.) interact with each other, in hopes of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks, one that is both easy to train and performs well compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set. Specifically, we apply these methods to improve the accuracy of the adopted prediction models (logistic regression, support vector machine, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching.
This research introduces ensemble methods (random forests and stochastic gradient boosting) and applies them to real-world poaching data gathered from the Ugandan rain forest park rangers. Second, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable where a large number of observations are missing. Third, we provide an alternate approach to predicting the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using stochastic gradient boosting to predict observations for non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire season, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous prediction schedules based on entire seasons.
Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection
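A minimal sketch of fitting the two ensemble methods named above and scoring them by area under the ROC curve, on synthetic data standing in for the poaching observations (scikit-learn is assumed available; every dataset parameter below is invented, not drawn from the study):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for poaching data: imbalanced binary outcome,
# features playing the role of animal density, terrain, patrol effort, ...
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

aucs = {}
for model in (RandomForestClassifier(n_estimators=200, random_state=0),
              GradientBoostingClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    aucs[type(model).__name__] = roc_auc_score(
        y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, round(aucs[type(model).__name__], 3))
```

Comparing models by held-out AUC mirrors the evaluation used in the paper, where the reported ~3% AUC gain is measured on the same metric.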
Procedia PDF Downloads 290
26 The Roots of Amazonia’s Droughts and Floods: Complex Interactions of Pacific and Atlantic Sea-Surface Temperatures
Authors: Rosimeire Araújo Silva, Philip Martin Fearnside
Abstract:
Extreme droughts and floods in the Amazon have serious consequences for natural ecosystems and the human population of the region. The frequency of these events has increased in recent years, and projections of climate change predict greater frequency and intensity. Understanding the links between these extreme events and different patterns of sea surface temperature in the Atlantic and Pacific Oceans is essential, both to improve the modeling of climate change and its consequences and to support adaptation efforts in the region. The relationship between sea temperatures and events in the Amazon is much more complex than is usually assumed in climatic models. Warming and cooling of different parts of the oceans, as well as the interaction between simultaneous temperature changes in different parts of each ocean and between the two oceans, have specific consequences for the Amazon, with effects on precipitation that vary across the region. Simplistic generalities, such as the association between El Niño events and droughts in the Amazon, do not capture this complexity. We investigated the variability of sea surface temperature (SST) in the tropical Pacific Ocean during the period 1950-2022 using Empirical Orthogonal Functions (EOF), spectral analysis, and wavelet coherence and phase analysis. Two main modes of variability were identified, which explain about 53.9% and 13.3%, respectively, of the total variance of the data. The spectral, coherence, and wavelet phase analyses showed that the first mode represents warming in the central part of the Pacific Ocean (the “Central El Niño”), while the second mode represents warming in the eastern part of the Pacific (the “Eastern El Niño”). Although both the 1982-1983 and 1976-1977 El Niño events were characterized by an increase in sea surface temperatures in the equatorial Pacific, their impacts on rainfall in the Amazon were distinct.
In the rainy season, from December to March, the sub-basins of the Japurá, Jutaí, Jatapu, Tapajós, Trombetas, and Xingu rivers were the regions that showed the greatest reductions in rainfall associated with the Central El Niño (1982-1983), while the sub-basins of the Javari, Purus, Negro, and Madeira rivers had the most pronounced reductions in the year of the Eastern El Niño (1976-1977). In the transition to the dry season, in April, the greatest reductions were associated with the Eastern El Niño year for the majority of the study region, with the exception of the sub-basins of the Madeira, Trombetas, and Xingu rivers, whose reductions were associated with the Central El Niño. In the dry season, from July to September, the sub-basins of the Japurá, Jutaí, Jatapu, Javari, Trombetas, and Madeira rivers showed the greatest reductions in rainfall associated with the Central El Niño, while the sub-basins of the Tapajós, Purus, Negro, and Xingu rivers had the most pronounced reductions in the Eastern El Niño year in this season. It is thus possible to conclude that the Central (Eastern) El Niño controlled the reductions in soil moisture in the dry (rainy) season for all sub-basins examined in this study. Extreme drought events associated with these meteorological phenomena can lead to a significant increase in the occurrence of forest fires. These fires have a devastating impact on Amazonian vegetation, resulting in the irreparable loss of biodiversity and the release of large amounts of carbon stored in the forest, contributing to the increase in the greenhouse effect and global climate change.
Keywords: sea surface temperature, variability, climate, Amazon
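The EOF decomposition described above can be sketched with a singular value decomposition of the SST anomaly matrix: the right singular vectors are the spatial modes, and the squared singular values give each mode's share of the total variance. Everything below (field size, patterns, noise level) is synthetic, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic SST anomaly field: 876 months (1950-2022) x 500 grid points
n_time, n_space = 876, 500
pattern1 = np.sin(np.linspace(0, np.pi, n_space))    # stand-in "Central" mode
pattern2 = np.cos(np.linspace(0, 2 * np.pi, n_space))  # stand-in "Eastern" mode
pc1 = 3.0 * rng.normal(size=n_time)   # principal-component time series
pc2 = 1.5 * rng.normal(size=n_time)
anom = (np.outer(pc1, pattern1) + np.outer(pc2, pattern2)
        + rng.normal(scale=0.5, size=(n_time, n_space)))
anom -= anom.mean(axis=0)  # remove the time mean at each grid point

# EOFs are the right singular vectors of the anomaly matrix
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
var_explained = s**2 / (s**2).sum()
print(f"Mode 1: {var_explained[0]:.1%}, Mode 2: {var_explained[1]:.1%}")
```

On real gridded SST, each grid point is usually area-weighted (e.g., by the square root of the cosine of latitude) before the decomposition; that step is omitted here.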
Procedia PDF Downloads 63
25 Lack of Regulation Leads to Complexity: A Case Study of the Free Range Chicken Meat Sector in the Western Cape, South Africa
Authors: A. Coetzee, C. F. Kelly, E. Even-Zahav
Abstract:
Dominant approaches to livestock production are harmful to the environment, human health and animal welfare, yet global meat consumption is rising. Sustainable alternative production approaches are therefore urgently required, and ‘free range’ is the main alternative for chicken meat offered in South Africa (and globally). Although the South African Poultry Association provides non-binding guidelines, there is a lack of formal definition and regulation of free range chicken production, meaning it is unclear what this alternative entails and whether it is consistently practised (a trend observed globally). The objective of this exploratory qualitative case study is therefore to investigate who and what determines free range chicken. The case study, conducted from a social constructivist worldview, uses semi-structured interviews, photographs and document analysis to collect data. Interviews are conducted with those involved in bringing free range chicken to the market - farmers, chefs, retailers, and regulators. Data is analysed using thematic analysis to establish dominant patterns in the data. The five major themes identified (based on prevalence in the data and on achieving the research objective) are: 1) free range means a bird reared with good animal welfare in mind, 2) free range means quality meat, 3) free range means a profitable business, 4) free range is determined by decision makers or by access to markets, and 5) free range is coupled with concerns about the lack of regulation. Unpacking the findings in the context of the literature reveals who and what determines free range. The research uncovers wide-ranging interpretations of ‘free range’, driven by the absence of formal regulation of free range chicken practices and the lack of independent private certification. This means that the term ‘free range’ is socially constructed, and thus varied and complex.
The case study also shows that whether chicken meat is free range is generally determined by those who have access to markets. Large retailers claim adherence to the internationally recognised Five Freedoms, also included in the South African Poultry Association Code of Good Practice, which others in the sector say are too broad to be meaningful. Producers describe animal welfare concerns as the main driver of how they practise and view free range production, yet these interpretations vary. An additional driver is a focus on human health, which participants pursue mainly through the use of antibiotic-free feed, resulting in what participants regard as higher quality meat. The participants are also strongly driven by business imperatives, with most stating that free range chicken should carry a higher price than conventionally reared chicken due to increased production costs. Recommendations from this study focus on, inter alia, the need to understand consumers' perspectives on free range chicken, given that those in the sector claim they are responding to consumer demand, and on conducting environmental research such as life cycle assessment studies to establish the true (environmental) sustainability of free range production. At present, it seems the sector mostly responds to social sustainability: human health and animal welfare.
Keywords: chicken meat production, free range, socially constructed, sustainability
Procedia PDF Downloads 155
24 Design Challenges for Severely Skewed Steel Bridges
Authors: Muna Mitchell, Akshay Parchure, Krishna Singaraju
Abstract:
There is an increasing need for medium- to long-span steel bridges with complex geometry due to site restrictions in developed areas. One solution for grade separations in congested areas is to use longer spans on skewed supports that avoid at-grade obstructions, limiting impacts to the foundation. Where vertical clearances are also a constraint, continuous steel girders can be used to reduce superstructure depths. Combining long continuous steel spans with severe skews can resolve these constraints, at a cost: the behavior of skewed girders is challenging to analyze and design, with subsequent complexity during fabrication and construction. As part of a corridor improvement project, Walter P Moore designed two 1700-foot side-by-side bridges carrying four lanes of traffic in each direction over a railroad track. The bridges consist of prestressed concrete girder approach spans and three-span continuous steel plate girder units. The roadway design added complex geometry to the bridges, with horizontal and vertical curves combined with superelevation transitions within the plate girder units. The substructure at the steel units was skewed approximately 56 degrees to satisfy the existing railroad right-of-way requirements. A horizontal point of curvature (PC) near the end of the steel units required the use of flared girders and chorded slab edges. Due to the flared girder geometry, the cross-frame spacing in each bay is unique. Staggered cross frames were provided based on AASHTO LRFD and NCHRP guidelines for high-skew steel bridges. Skewed steel bridges develop significant forces in the cross frames and rotation in the girder webs due to differential displacements along the girders under dead and live loads. In addition, under thermal loads, skewed steel bridges expand and contract not along the alignment parallel to the girders but along the diagonal connecting the acute corners, resulting in horizontal displacement both along and perpendicular to the girders.
AASHTO LRFD recommends a 95 degree Fahrenheit temperature differential for the design of joints and bearings. The live and thermal loads resulted in significant horizontal forces and rotations at the bearings that necessitated the use of high-load multi-rotational (HLMR) bearings. A unique bearing layout was selected to minimize the effect of thermal forces. The span length, width, skew, and roadway geometry of the bridges also required modular bridge joint systems (MBJS) with inverted-T bent caps to accommodate movement in the steel units. 2D and 3D finite element analysis models were developed to accurately determine the forces and rotations in the girders, cross frames, and bearings and to estimate thermal displacements at the joints. This paper covers the decision-making process for developing the framing plan, bearing configurations, joint type, and analysis models involved in the design of these high-skew three-span continuous steel plate girder bridges.
Keywords: complex geometry, continuous steel plate girders, finite element structural analysis, high skew, HLMR bearings, modular joint
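The thermal movement that drives the joint and bearing design can be sketched from the 95 °F differential: total expansion is α·ΔT·L, and on a severe skew that movement resolves into components along and perpendicular to the girders. The expansion length and the simple resolution below are illustrative assumptions, not the bridge's actual dimensions or analysis:

```python
import math

# Hypothetical thermal movement check for a skewed steel unit
alpha = 6.5e-6        # steel coefficient of thermal expansion (1/degF)
delta_T = 95.0        # AASHTO LRFD design temperature differential (degF)
span_length = 850.0   # assumed expansion length, ft

delta_L = alpha * delta_T * span_length * 12.0  # total movement, inches

skew = math.radians(56)  # skew angle of the supports
# Resolve the movement along and perpendicular to the girder lines
along = delta_L * math.cos(skew)
perp = delta_L * math.sin(skew)
print(f"Total: {delta_L:.2f} in, along girders: {along:.2f} in, "
      f"perpendicular: {perp:.2f} in")
```

At a 56-degree skew the perpendicular component exceeds the component along the girders, which is why the bearings see large transverse demands and why a movement direction along the acute-corner diagonal, not the girder line, governs the joint design.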
Procedia PDF Downloads 191
23 Unveiling the Dynamics of Preservice Teachers’ Engagement with Mathematical Modeling through Model Eliciting Activities: A Comprehensive Exploration of Acceptance and Resistance Towards Modeling and Its Pedagogy
Authors: Ozgul Kartal, Wade Tillett, Lyn D. English
Abstract:
Despite its global significance in curricula, mathematical modeling encounters persistent disparities in recognition and emphasis within regular mathematics classrooms and teacher education across countries with diverse educational and cultural traditions, including variations in the perceived role of mathematical modeling. Over the past two decades, increased attention has been given to the integration of mathematical modeling into national curriculum standards in the U.S. and other countries. Therefore, the mathematics education research community has dedicated significant efforts to investigate various aspects associated with the teaching and learning of mathematical modeling, primarily focusing on exploring the applicability of modeling in schools and assessing students', teachers', and preservice teachers' (PTs) competencies and engagement in modeling cycles and processes. However, limited attention has been directed toward examining potential resistance hindering teachers and PTs from effectively implementing mathematical modeling. This study focuses on how PTs, without prior modeling experience, resist and/or embrace mathematical modeling and its pedagogy as they learn about models and modeling perspectives, navigate the modeling process, design and implement their modeling activities and lesson plans, and experience the pedagogy enabling modeling. Model eliciting activities (MEAs) were employed due to their high potential to support the development of mathematical modeling pedagogy. The mathematical modeling module was integrated into a mathematics methods course to explore how PTs embraced or resisted mathematical modeling and its pedagogy. The module design included reading, reflecting, engaging in modeling, assessing models, creating a modeling task (MEA), and designing a modeling lesson employing an MEA. Twelve senior undergraduate students participated, and data collection involved video recordings, written prompts, lesson plans, and reflections. 
An open coding analysis revealed acceptance and resistance toward teaching mathematical modeling. The study identified four overarching themes, including both acceptance and resistance: pedagogy, affordance of modeling (tasks), modeling actions, and adjusting modeling. In the category of pedagogy, PTs displayed acceptance based on potential pedagogical benefits and resistance due to various concerns. The affordance of modeling (tasks) category emerged from instances when PTs showed acceptance or resistance while discussing the nature and quality of modeling tasks, often debating whether modeling is considered mathematics. PTs demonstrated both acceptance and resistance in their modeling actions, engaging in modeling cycles as students and designing/implementing MEAs as teachers. The adjusting modeling category captured instances where PTs accepted or resisted maintaining the qualities and nature of the modeling experience or converted modeling into a typical structured mathematics experience for students. While PTs displayed a mix of acceptance and resistance in their modeling actions, limitations were observed in embracing complexity and adhering to model principles. The study provides valuable insights into the challenges and opportunities of integrating mathematical modeling into teacher education, emphasizing the importance of addressing pedagogical concerns and providing support for effective implementation. In conclusion, this research offers a comprehensive understanding of PTs' engagement with modeling, advocating for a more focused discussion on the distinct nature and significance of mathematical modeling in the broader curriculum to establish a foundation for effective teacher education programs.
Keywords: mathematical modeling, model eliciting activities, modeling pedagogy, secondary teacher education
Procedia PDF Downloads 63
22 Enhancing the Implementation Strategy of Simultaneous Operations (SIMOPS) for the Major Turnaround at Pertamina Plaju Refinery
Authors: Fahrur Rozi, Daniswara Krisna Prabatha, Latief Zulfikar Chusaini
Abstract:
Amidst the backdrop of Pertamina Plaju Refinery, which stands as the oldest and historically least technologically advanced among Pertamina's refineries, lies a unique challenge. Originally integrating facilities established by Shell in 1904 and Stanvac (originally Standard Oil) in 1926, the primary challenge at Plaju Refinery does not solely revolve around complexity; instead, it lies in ensuring reliability, considering its operational history of over a century. In all that time, Plaju Refinery has never undergone a comprehensive major turnaround encompassing all its units. The usual practice involves partial turnarounds conducted sequentially across its primary, secondary, and tertiary units (utilities and offsite). However, a significant shift is on the horizon. In the fourth quarter of 2023, the refinery embarks on its first-ever major turnaround since its establishment. This decision was driven by the alignment of maintenance timelines across various units. Plaju Refinery's major turnaround was scheduled for October-November 2023, spanning 45 calendar days, with the objective of enhancing the operational reliability of all refinery units. The extensive job list for this turnaround encompasses 1583 tasks across 18 units/areas, involving approximately 9000 contracted workers. In this context, the strategy of Simultaneous Operations (SIMOPS) execution emerges as a pivotal tool to optimize time efficiency and ensure safety. A Hazard Effect Management Process (HEMP) has been employed to assess the risk ratings of each task within the turnaround. Of the tasks assessed, 22 are deemed high-risk and necessitate mitigation. The SIMOPS approach serves as a preventive measure against potential incidents. It is noteworthy that every turnaround period at Pertamina Plaju Refinery involves SIMOPS-related tasks. In this context, enhancing the implementation strategy of Simultaneous Operations becomes imperative to minimize the occurrence of incidents. 
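As an illustration of how a HEMP-style screening can flag tasks for SIMOPS mitigation, the sketch below scores tasks on a simple 5x5 likelihood-severity matrix. The scales, band thresholds, and task names are hypothetical assumptions for illustration; they are not the refinery's actual HEMP criteria.

```python
def risk_rating(likelihood, severity):
    """Qualitative risk matrix: likelihood and severity on 1-5 scales.

    The band thresholds below are illustrative assumptions, not the
    actual HEMP scales used at the refinery.
    """
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("likelihood and severity must be 1-5")
    score = likelihood * severity
    if score >= 15:
        return "high"      # requires mitigation before SIMOPS execution
    if score >= 6:
        return "medium"    # mitigation recommended, supervisor sign-off
    return "low"           # manageable with routine controls

# Screen a (hypothetical) job list and collect the tasks needing mitigation.
tasks = [("hot work near live flare line", 4, 5),
         ("scaffolding erection", 3, 3),
         ("routine valve greasing", 1, 2)]
high_risk = [name for name, l, s in tasks if risk_rating(l, s) == "high"]
```

In the actual turnaround, the 22 tasks assessed as high-risk under HEMP would fall in the top band of a screening like this one.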
At least four improvements have been introduced in the enhancement process for the major turnaround at Plaju Refinery. The first improvement involves conducting systematic risk assessments and potential-hazard mitigation studies for SIMOPS tasks before task execution, as opposed to the previous on-site approach. The second improvement includes the completion of SIMOPS Job Mitigation and Work Matrices Sheets, which was often neglected in the past. The third improvement emphasizes building comprehensive awareness among workers/contractors regarding potential hazards and mitigation strategies for SIMOPS tasks before and during the major turnaround. The final improvement is the introduction of a daily program for inspecting and observing work in progress on SIMOPS tasks. Prior to these improvements, there was no established program for monitoring ongoing SIMOPS-related activities during the turnaround. This study elucidates the steps taken to enhance SIMOPS within Pertamina, drawing on the experience of Plaju Refinery as a guide. An actual case study from our experience in the operational unit is provided. In conclusion, these efforts are essential for the success of the first-ever major turnaround at Plaju Refinery, with the SIMOPS strategy serving as a central component. Based on these experiences, enhancements have been made to Pertamina's official Internal Guidelines for Executing SIMOPS Risk Mitigation, benefiting all Pertamina units.
Keywords: process safety management, turnaround, oil refinery, risk assessment
Procedia PDF Downloads 72
21 Design and Fabrication of AI-Driven Kinetic Facades with Soft Robotics for Optimized Building Energy Performance
Authors: Mohammadreza Kashizadeh, Mohammadamin Hashemi
Abstract:
This paper explores a kinetic building facade designed for optimal energy capture and architectural expression. The system integrates photovoltaic panels with soft robotic actuators for precise solar tracking, resulting in enhanced electricity generation compared to static facades. The growing interest in dynamic building envelopes necessitates the exploration of such facade systems. Integrating photovoltaic (PV) panels as kinetic elements offers the potential benefits of increased energy generation and regulation of energy flow within buildings. However, incorporating these technologies into mainstream architecture presents challenges due to the complexity of coordinating multiple systems. To address this, the design leverages soft robotic actuators, known for their compliance, resilience, and ease of integration. Additionally, the project investigates the potential for employing Large Language Models (LLMs) to streamline the design process. The research methodology involved design development, material selection, component fabrication, and system assembly. Grasshopper (GH) was employed within the digital design environment for parametric modeling and scripting logic, and an LLM was experimented with to generate Python code for the creation of a random surface with user-defined parameters. Various techniques, including casting, three-dimensional (3D) printing, and laser cutting, were utilized to fabricate physical components. A modular assembly approach was adopted to facilitate installation and maintenance. A case study focusing on the application of this facade system to an existing library building at the Polytechnic University of Milan is presented. The system is divided into sub-frames to optimize solar exposure while maintaining a visually appealing aesthetic. Preliminary structural analyses were conducted using Karamba3D to assess deflection behavior and axial loads within the cable net structure. 
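As an illustration of the kind of script the LLM was prompted to produce, the fragment below generates a random surface point grid from user-defined parameters. All function and parameter names are hypothetical; in practice the resulting points would feed a Grasshopper surface component.

```python
import random

def random_surface(u_count, v_count, width, depth,
                   amplitude=1.0, seed=None):
    """Generate a grid of (x, y, z) control points for a random surface.

    u_count/v_count: grid resolution; width/depth: planar extents;
    amplitude: maximum random height offset. All names are illustrative.
    """
    rng = random.Random(seed)
    points = []
    for i in range(u_count):
        row = []
        for j in range(v_count):
            x = width * i / (u_count - 1)
            y = depth * j / (v_count - 1)
            z = rng.uniform(-amplitude, amplitude)  # random relief
            row.append((x, y, z))
        points.append(row)
    return points

grid = random_surface(u_count=5, v_count=4, width=10.0, depth=8.0,
                      amplitude=0.5, seed=42)
```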
Additionally, Finite Element (FE) simulations were performed in Abaqus to evaluate the mechanical response of the soft robotic actuators under pneumatic pressure. To validate the design, a physical prototype was created using a mold adapted to the limitations of a 3D printer. Sil 15 casting silicone rubber was used for its flexibility and durability. The 3D-printed mold components were assembled, filled with the silicone mixture, and cured. After demolding, nodes and cables were 3D-printed and connected to form the structure, demonstrating the feasibility of the design. This work demonstrates the potential of soft robotics and Artificial Intelligence (AI) for advancements in sustainable building design and construction. The project successfully integrates these technologies to create a dynamic facade system that optimizes energy generation and architectural expression. While limitations exist, this approach paves the way for future advancements in energy-efficient facade design. Continued research efforts will focus on cost reduction, improved system performance, and broader applicability.
Keywords: artificial intelligence, energy efficiency, kinetic photovoltaics, pneumatic control, soft robotics, sustainable building
Procedia PDF Downloads 28
20 Developing an Integrated Clinical Risk Management Model
Authors: Mohammad H. Yarmohammadian, Fatemeh Rezaei
Abstract:
Introduction: Improving patient safety is one of the main priorities in healthcare systems, so clinical risk management in organizations has become increasingly significant. Although several tools have been developed for clinical risk management, each has its own limitations. Aims: This study aims to develop a comprehensive tool that compensates for the limitations of each risk assessment and management tool with the advantages of the others. Methods: The procedure comprised two main stages: development of an initial model through meetings with professors and a literature review, followed by implementation and verification of the final model. Subjects and Methods: This is a quantitative-qualitative study. For the qualitative dimension, the focus group method with an inductive approach was used. To evaluate the results of the qualitative study, a quantitative assessment of the two parts of the fourth phase and of the seven phases of the research was conducted. Purposive and stratified sampling of the various teams responsible for the selected process was conducted in the operating room. The final model was verified in eight phases through the application of activity breakdown structure, failure mode and effects analysis (FMEA), healthcare risk priority number (RPN), root cause analysis (RCA), fault tree (FT), and Eindhoven Classification Model (ECM) tools. The model was applied to patients admitted for surgery to a day-clinic ward of a public hospital from October 2012 to June. Statistical Analysis Used: Qualitative data analysis was done through content analysis, and quantitative analysis through checklists and edited RPN tables. Results: After verification of the final model in eight steps, the patient's admission process for surgery was developed by focus discussion group (FDG) members in five main phases. Then, with the adopted FMEA methodology, 85 failure modes, along with their causes, effects, and preventive capabilities, were set out in the tables. 
The tables developed to calculate the RPN index contain three criteria for severity, two criteria for probability, and two criteria for preventability. Three failure modes were above the determined significant-risk threshold (RPN > 250). After a 3-month period, patient misidentification incidents were the most frequently reported events. Each RPN criterion of the misidentification events was compared, and it was found that the varying RPN numbers for the three reported misidentification events could be checked against the scores predicted in the previous phase. Root causes identified through the fault tree were categorized with the ECM. The wrong-side surgery event was selected by the focus discussion group to propose improvement actions. The most important causes were lack of planning for the number and priority of surgical procedures. After prioritization of the suggested interventions, a computerized registration system in the health information system (HIS) was adopted to prepare the action plan in the final phase. Conclusion: The complexity of the healthcare industry requires risk managers to have a multifaceted vision. Therefore, applying only one retrospective or prospective tool for risk management does not work, and each organization must provide conditions for the potential application of these methods. The results of this study showed that the integrated clinical risk management model can be used in hospitals as an efficient tool to improve clinical governance.
Keywords: failure modes and effects analysis, risk management, root cause analysis, model
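The RPN calculation described above can be sketched as follows. The multiplication of severity, probability, and preventability scores and the RPN > 250 threshold follow the abstract; the averaging of sub-criteria, the 1-10 scales, and the example failure modes and scores are assumptions for illustration only.

```python
def rpn(severity_scores, probability_scores, preventability_scores):
    """Healthcare RPN sketch: average each criterion group (the study
    uses three severity, two probability, and two preventability
    criteria), then multiply the three averages. Averaging is an
    assumed aggregation; each sub-score is taken on a 1-10 scale.
    """
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(severity_scores) * mean(probability_scores)
            * mean(preventability_scores))

def significant(failure_modes, threshold=250):
    """Return the failure modes above the significant-risk threshold
    (RPN > 250, per the study)."""
    return [name for name, s, p, d in failure_modes
            if rpn(s, p, d) > threshold]

# Hypothetical failure modes: (name, severity, probability, preventability).
modes = [("patient misidentification", (9, 8, 9), (6, 7), (7, 6)),
         ("delayed lab result", (4, 3, 4), (5, 4), (3, 4))]
flagged = significant(modes)
```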
Procedia PDF Downloads 248
19 21st-Century Middlebrow Film: A Critical Examination of the Spectator Experience in Malayalam Film
Authors: Anupama A. P.
Abstract:
The Malayalam film industry, known as Mollywood, has a rich tradition of storytelling and cultural significance within Indian cinema. Middlebrow films emerged as a distinct and influential category, particularly in the 1980s, with directors like K.G. George, who engaged with female subjectivity and drew inspiration from the ‘women’s cinema’ of the 1950s and 1960s. In recent decades, particularly post-2010, the industry has transformed significantly, with a new generation of filmmakers diverging from the melodrama and new wave of the past and incorporating advanced technology and modern content. This study examines the evolution and impact of Malayalam middlebrow cinema in the 21st century, focusing on post-2000 films and their influence on contemporary spectator experiences. These films appeal to a wide range of audiences without compromising their artistic integrity, tackling social issues and personal dramas with thematic and narrative complexity. Historically, middlebrow films in Malayalam cinema have portrayed realism and addressed the socio-political climate of Kerala, blending realism with reflexivity and moving away from traditional sentimentality. This shift is evident in the new generation of Malayalam films, which present a global representation of characters and a modern treatment of individuals. To provide a comprehensive understanding of this evolution, the study analyzes a diverse selection of films, such as Kerala Varma Pazhassi Raja (2009), Drishyam (2013), Maheshinte Prathikaaram (2016), Take Off (2017), Thondimuthalum Driksakshiyum (2017), and Virus (2019), illustrating the broad thematic range and innovative narrative techniques characteristic of this genre. These films exemplify how middlebrow cinema continues to evolve, adapting to changing societal contexts and audience expectations. 
This research employs a theoretical methodology, drawing on cultural studies and audience reception theory and utilizing frameworks such as Bordwell’s narrative theory, Deleuze’s concept of deterritorialization, and Hall’s encoding/decoding model to analyze the changes in Malayalam middlebrow cinema and interpret the storytelling methods, spectator experience, and audience reception of these films. The findings indicate that post-2010 Malayalam middlebrow cinema offers a spectator experience that is both intellectually stimulating and broadly appealing. This study highlights the critical role of middlebrow cinema in reflecting and shaping societal values, making it a significant cultural artefact within the broader context of Indian and global cinema. By bridging entertainment with thought-provoking narratives, these films engage audiences and contribute to wider cultural discourse, making them pivotal in contemporary cinematic landscapes. To conclude, this study highlights the importance of Malayalam middlebrow cinema in influencing contemporary cinematic tastes. The nuanced and approachable narratives of post-2010 films are posited to assume an increasingly pivotal role in the future of Malayalam cinema. By providing a deeper understanding of Malayalam middlebrow cinema and its societal implications, this study enriches theoretical discourse, promotes regional cinema, and offers valuable insights into contemporary spectator experiences and the future trajectory of Malayalam cinema.
Keywords: Malayalam cinema, middlebrow cinema, spectator experience, audience reception, deterritorialization
Procedia PDF Downloads 32
18 Structured Cross-System Planning and Control in Modular Production Systems by Using Agent-Based Control Loops
Authors: Simon Komesker, Achim Wagner, Martin Ruskowski
Abstract:
In times of volatile markets with fluctuating demand and uncertain global supply chains, flexible production systems are the key to an efficient implementation of a desired production program. In this publication, the authors present a holistic information concept that takes various influencing factors into account in order to operate towards the global optimum. To this end, a strategy for implementing multi-level planning for a flexible, reconfigurable production system with an alternative production concept in the automotive industry is developed. The main contribution of this work is a system structure mixing central and decentral planning and control, evaluated in a simulation framework. The information system structure in current automotive production systems is rigidly, hierarchically organized in monolithic systems. The production program is created rule-based, with the premise of achieving a uniform cycle time. This program then provides the information basis for execution in subsystems at the station and process-execution level. In today's era of mixed-(car-)model factories, complex conditions and conflicts arise in achieving logistics, quality, and production goals. There is no provision for feedback loops that return results from the process-execution level (resources) and the process-supporting (quality and logistics) systems for reconsideration in the planning systems. To enable a robust production flow, the complexity of production system control is artificially reduced by the line structure, which results, for example, in material-intensive processes (buffers and safety stocks, with a two-container principle even for different variants). The limited degrees of freedom of line production have produced the principle of progress-figure control, which results in one-time sequencing, sequential order release, and relatively inflexible capacity control. 
As a result, modularly structured production systems, such as modular production according to known approaches with more degrees of freedom, are currently difficult to represent in terms of information technology. The remedy is an information concept that supports cross-system and cross-level information processing for centralized and decentralized decision-making. Through an architecture of hierarchically organized but decoupled subsystems, the paradigm of hybrid control is used, and a holonic manufacturing system is offered, which enables flexible information provisioning and processing support. In this way, the influences from quality, logistics, and production processes can be linked holistically with the advantages of mixed centralized and decentralized planning and control. Modular production systems also require modularly networked information systems with semi-autonomous optimization for a robust production flow. Dynamic prioritization of different key figures between subsystems should lead the production system to an overall optimum. The tasks and goals of the quality, logistics, process, resource, and product areas in a cyber-physical production system are designed as an interconnected multi-agent system. The result is an alternative system structure that executes centralized process planning and decentralized processing. An agent-based manufacturing control is used to enable different flexibility and reconfigurability states and manufacturing strategies in order to find optimal partial solutions of subsystems that lead to a near-global optimum for hybrid planning. This allows robust, near-to-plan execution with integrated quality control and intralogistics.
Keywords: holonic manufacturing system, modular production system, planning and control, system structure
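A minimal sketch of the "centralized planning, decentralized execution" idea is a contract-net-style allocation in which resource agents bid on centrally released tasks. The agent model, cost function, and names below are illustrative simplifications, not the authors' architecture:

```python
from dataclasses import dataclass

@dataclass
class ResourceAgent:
    """A semi-autonomous station agent that bids on process steps."""
    name: str
    capabilities: set
    queue_time: float = 0.0  # current backlog, in minutes

    def bid(self, task):
        # A station bids only if it can perform the required process;
        # the bid value is its current backlog (a simple cost model).
        if task["process"] in self.capabilities:
            return self.queue_time
        return None

def allocate(tasks, agents):
    """Centralized plan, decentralized execution: each task goes to the
    cheapest bidding agent, whose backlog then grows accordingly."""
    assignment = {}
    for task in tasks:
        bids = [(a.bid(task), a) for a in agents]
        bids = [(b, a) for b, a in bids if b is not None]
        if not bids:
            continue  # no capable station: task stays unassigned
        _, winner = min(bids, key=lambda x: x[0])
        assignment[task["id"]] = winner.name
        winner.queue_time += task["duration"]
    return assignment

agents = [ResourceAgent("cell-A", {"weld", "bolt"}),
          ResourceAgent("cell-B", {"bolt", "paint"})]
tasks = [{"id": 1, "process": "bolt", "duration": 10.0},
         {"id": 2, "process": "bolt", "duration": 10.0},
         {"id": 3, "process": "paint", "duration": 5.0}]
plan = allocate(tasks, agents)
```

The bidding step is where decentralized key figures (here, only backlog) would be dynamically prioritized in a fuller holonic system.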
Procedia PDF Downloads 168
17 Large-Scale Method to Assess the Seismic Vulnerability of Heritage Buildings: Modal Updating of Numerical Models and Vulnerability Curves
Authors: Claire Limoge Schraen, Philippe Gueguen, Cedric Giry, Cedric Desprez, Frédéric Ragueneau
Abstract:
The Mediterranean area is characterized by numerous monumental or vernacular masonry structures illustrating old ways of building and living. These precious buildings are often poorly documented, present complex shapes and loadings, and are protected by the States, leading to legal constraints. The area also presents moderate to high seismic activity. Even moderate earthquakes can be magnified by local site effects and cause collapse or significant damage. Moreover, the structural resistance of masonry buildings, especially the less famous or those located in rural zones, has generally been lowered by many factors: poor maintenance, unsuitable restoration, ambient pollution, and previous earthquakes. Recent earthquakes prove that any damage to these architectural witnesses to our past is irreversible, leading to the necessity of acting preventively. This means providing preventive assessments for hundreds of structures with no or few documents. In this context, we propose a general method, based on hierarchized numerical models, to provide preliminary structural diagnoses at a regional scale, indicating whether more precise investigations and models are necessary for each building. To this aim, we adapt different tools, some under development, such as photogrammetry, and others yet to be created, such as a preprocessor that builds meshes for FEM software from pictures, in order to allow dynamic studies of the buildings in the panel. We made an inventory of 198 baroque chapels and churches situated in the French Alps. Their structural characteristics were then determined thanks to field surveys and the MicMac photogrammetric software. Using structural criteria, we determined eight types of churches and seven types of chapels. We studied their dynamic behavior with CAST3M, using the EC8 spectrum and accelerograms of the studied zone. This allowed us to quantify the effect of the needed simplifications in the most sensitive zones and to choose the most effective ones. 
We also proposed threshold criteria based on the damage observed in the in situ surveys, old pictures, and the Italian code. They are relevant in linear models. To validate the structural types, we carried out a measurement campaign using ambient vibratory noise and velocimeters. This also allowed us to validate the method on old masonry and to identify the modal characteristics of 20 churches. We then proceeded to a dynamic identification between numerical and experimental modes, updating the linear models through material and geometrical parameters that are often unknown because of the complexity of the structures and materials. The numerically optimized values were verified against the measurements we made on the masonry components in situ and in the laboratory. We are now working on non-linear models redistributing the strains, in order to validate the damage threshold criteria which we use to compute the vulnerability curves of each defined structural type. Our current results show a good correlation between experimental and numerical data, validating the final modeling simplifications and the global method. We now plan to use non-linear analysis in the critical zones in order to test reinforcement solutions.
Keywords: heritage structures, masonry numerical modeling, seismic vulnerability assessment, vibratory measures
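A standard tool for the dynamic identification step, pairing numerical with experimental modes before updating, is the Modal Assurance Criterion (MAC). The sketch below assumes real-valued mode-shape vectors and a common 0.8 acceptance threshold; the abstract does not state which pairing criterion was actually used, so this is an illustration of the general technique, not the authors' procedure.

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors:
    MAC = (phi_a . phi_b)^2 / ((phi_a . phi_a) * (phi_b . phi_b)).
    A MAC close to 1 means the two shapes are consistent."""
    dot = sum(a * b for a, b in zip(phi_a, phi_b))
    return dot * dot / (sum(a * a for a in phi_a) * sum(b * b for b in phi_b))

def pair_modes(numerical, experimental, threshold=0.8):
    """Pair each experimental mode with its best-matching numerical
    mode; the 0.8 acceptance threshold is a common rule of thumb,
    assumed here rather than taken from the study."""
    pairs = []
    for j, phi_e in enumerate(experimental):
        scores = [mac(phi_n, phi_e) for phi_n in numerical]
        best = max(range(len(scores)), key=scores.__getitem__)
        if scores[best] >= threshold:
            pairs.append((best, j, scores[best]))
    return pairs

# Toy mode shapes (three measurement points, two modes each).
num = [[1.0, 2.0, 3.0], [3.0, 0.0, -1.0]]
exp = [[1.1, 1.9, 3.1], [-2.9, 0.1, 1.0]]
pairs = pair_modes(num, exp)
```

Note that the MAC is insensitive to scaling and sign, which is why the second experimental mode still pairs with its (sign-flipped) numerical counterpart.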
Procedia PDF Downloads 492
16 Towards Achieving Total Decent Work: Occupational Safety and Health Issues, Problems and Concerns of Filipino Domestic Workers
Authors: Ronahlee Asuncion
Abstract:
The nature of their work and employment relationship makes domestic workers easy prey to abuse, maltreatment, and exploitation. Considering their plight, this research was conceptualized and examined: a) the level of awareness of Filipino domestic workers on occupational safety and health (OSH); b) their issues/problems/concerns on OSH; c) their intervention strategies at work to address OSH-related issues/problems/concerns; d) the issues/problems/concerns of government, employers, and non-government organizations with regard to the implementation of OSH for Filipino domestic workers; e) the role of government, employers, and non-government organizations in helping Filipino domestic workers address OSH-related issues/problems/concerns; and f) the necessary policy amendments/initiatives/programs to address the OSH-related issues/problems/concerns of Filipino domestic workers. The study conducted a survey using non-probability sampling, two focus group discussions, two group interviews, and fourteen face-to-face interviews. These were further supplemented with an email correspondence with a key informant based in another country. Books, journals, magazines, and relevant websites further substantiated and enriched the data of the research. Findings of the study point to the fact that domestic workers have a low level of awareness of OSH because of a poor information drive, fragmented implementation of the Domestic Workers Act, an inactive campaign at the barangay level, weakened advocacy for domestic workers, the absence of a law on OSH for domestic workers, and a generally low safety culture in the country, among others. 
Filipino domestic workers suffer from insufficient rest, long hours of work, heavy workloads, occupational stress, poor accommodation, insufficient hours of sleep, deprivation of days off, accidents and injuries such as cuts, burns, slipping, stumbling, electrical grounding, and fire, verbal, physical, and sexual abuse, lack of medical assistance, non-provision of personal protective equipment (PPE), absence of knowledge of the proper way of lifting and of working at heights, and insufficient food provision. They also suffer from psychological problems because of separation from one's family, limited mobility in the household where they work, injuries and accidents from using advanced home appliances and taking care of pets, low self-esteem, ergonomic problems, the need to adjust to all household members with their various needs and demands, the inability to voice their complaints, the drudgery of their work, and emotional stress. With regard to illness or health problems, they commonly experience leg pains, back pains, and headaches. In the absence of intervention programs like those offered in the formal employment setup, domestic workers resort to praying; turn to family, relatives, and friends for social and emotional support; connect with them through social media like Facebook, which also serves as a means of entertainment; talk to their employer; and simply try to be optimistic about their situation. Promoting OSH for domestic workers is very challenging and complicated because of interrelated cultural, knowledge, attitudinal, relational, social, resource, economic, political, institutional, and legal problems. This complexity necessitates a holistic and integrated approach, as this is not a problem requiring simple solutions. 
With this recognition comes the full understanding that success involves the action and cooperation of all duty bearers in attaining decent work for domestic workers.
Keywords: decent work, Filipino domestic workers, occupational safety and health, working conditions
Procedia PDF Downloads 260
15 Discovering Causal Structure from Observations: The Relationships between Technophile Attitude, Users' Value and Use Intention of a Mobility Management Travel App
Authors: Aliasghar Mehdizadeh Dastjerdi, Francisco Camara Pereira
Abstract:
The increasing complexity of and demand for transport services strain transportation systems, especially in urban areas with limited possibilities for building new infrastructure. The solution to this challenge requires changes in travel behavior. One of the proposed means to induce such change is multimodal travel apps. This paper describes a study of the intention to use a real-time multimodal travel app aimed at motivating travel behavior change in the Greater Copenhagen Region (Denmark) toward promoting sustainable transport options. The proposed app is a multi-faceted smartphone app including both travel information and persuasive strategies such as health and environmental feedback, tailoring of travel options, self-monitoring, tunneling users toward green behavior, social networking, nudging, and gamification elements. The prospect for mobility management travel apps to stimulate sustainable mobility rests not only on the original and proper employment of behavior change strategies, but also on explicitly anchoring them in established constructs from behavioral theories. The theoretical foundation is important because it positively and significantly influences the effectiveness of the system. However, there is a gap in current knowledge regarding the study of mobility management travel apps with support in behavioral theories, which should be explored further. This study addresses this gap through a social cognitive theory-based examination. However, compared to the conventional approach in technology adoption research, this study adopts a reverse approach in which the associations between theoretical constructs are explored by the Max-Min Hill-Climbing (MMHC) algorithm as a hybrid causal discovery method. A technology-use preference survey was designed to collect data. 
The survey elicited different groups of variables, including (1) three groups of user motives for using the app, namely gain motives (e.g., saving travel time and cost), hedonic motives (e.g., enjoyment), and normative motives (e.g., less travel-related CO2 production), (2) technology-related self-concepts (i.e., technophile attitude), and (3) use intention of the travel app. The questionnaire items provided the input for causal discovery to learn the causal structure of the data. Causal discovery from observational data is a critical challenge, with applications in different research fields. The estimated causal structure shows that the two constructs of gain motives and technophilia have a causal effect on adoption intention. Likewise, there is a causal relationship from technophilia to both gain and hedonic motives. In line with the findings of prior studies, this highlights the importance of the functional value of the travel app as well as technology self-concept as two important variables for adoption intention. Furthermore, the results indicate the effect of technophile attitude on developing gain and hedonic motives. The causal structure shows hierarchical associations between the three groups of user motives. They can be explained by the “frustration-regression” principle of Alderfer's ERG (Existence, Relatedness and Growth) theory of needs, meaning that when a higher-level need remains unfulfilled, a person may regress to lower-level needs that appear easier to satisfy. To conclude, this study shows the capability of causal discovery methods to learn the causal structure of a theoretical model and, accordingly, to interpret established associations.
Keywords: travel app, behavior change, persuasive technology, travel information, causality
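MMHC itself interleaves a constraint-based skeleton search with a score-based hill climb. For a toy illustration with only three constructs (technophilia, gain motives, use intention), an exhaustive BIC search over all DAGs is feasible and serves as a stand-in; the synthetic data and probabilities below are invented for the example and bear no relation to the survey data.

```python
import itertools
import math
import random

def has_cycle(nodes, edges):
    """True if the directed graph contains a cycle."""
    succ = {v: [b for a, b in edges if a == v] for v in nodes}
    def visit(v, stack):
        if v in stack:
            return True
        return any(visit(w, stack | {v}) for w in succ[v])
    return any(visit(v, frozenset()) for v in nodes)

def bic_score(data, parents_of):
    """BIC of a discrete Bayesian network: log-likelihood of each node
    given its parents, minus 0.5*ln(n) per free parameter (one per
    observed parent configuration, for binary variables)."""
    n = len(data)
    score = 0.0
    for node, parents in parents_of.items():
        counts, totals = {}, {}
        for row in data:
            key = tuple(row[p] for p in parents)
            counts[(key, row[node])] = counts.get((key, row[node]), 0) + 1
            totals[key] = totals.get(key, 0) + 1
        for (key, _), c in counts.items():
            score += c * math.log(c / totals[key])
        score -= 0.5 * math.log(n) * len(totals)
    return score

def best_dag(data, nodes):
    """Exhaustive score-based structure search over all DAGs."""
    arcs = [(a, b) for a in nodes for b in nodes if a != b]
    top_score, top_edges = -float("inf"), None
    for mask in itertools.product([0, 1], repeat=len(arcs)):
        edges = [arc for arc, m in zip(arcs, mask) if m]
        if has_cycle(nodes, edges):
            continue
        parents = {v: [a for a, b in edges if b == v] for v in nodes}
        s = bic_score(data, parents)
        if s > top_score:
            top_score, top_edges = s, edges
    return top_edges, top_score

# Synthetic data: technophilia (T) drives gain motives (G); both drive
# use intention (I). All probabilities are invented for the example.
rng = random.Random(0)
data = []
for _ in range(2000):
    t = rng.random() < 0.5
    g = rng.random() < (0.85 if t else 0.15)
    p_i = 0.9 if (t and g) else (0.55 if (t or g) else 0.1)
    data.append({"T": int(t), "G": int(g), "I": int(rng.random() < p_i)})

edges, score = best_dag(data, ["T", "G", "I"])
```

Note that score-based search can only recover a structure up to its Markov equivalence class, so individual edge directions (e.g., T to G versus G to T) may remain ambiguous without the constraint-based phase that MMHC adds.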
Procedia PDF Downloads 141
14 Sustainable Urban Regeneration: The New Vocabulary and the Timeless Grammar of the Urban Tissue
Authors: Ruth Shapira
Abstract:
Introduction: The rapid urbanization of the last century confronts planners, regulatory bodies, developers, and most of all the public with seemingly unsolvable conflicts regarding values, capital, and the wellbeing of the built and un-built urban space. There is an out-of-control change in the scale of the urban form and in the rhythm of urban life, which has seen no significant progress in the last 2-3 decades despite the growing urban population. The objective of this paper is to analyze some of these fundamental issues through the case study of a relatively small town in the center of Israel (Kiryat Ono, 36,000 inhabitants), unfold the deep structure of qualities versus disruptors, present some cures that we have developed to bridge over, and humbly suggest a practice that may bring about a sustainable new urban environment based on timeless values of the past, an approach that can be generic for similar cases. Basic Methodologies: The object, the town of Kiryat Ono, is examined through a series of four action processes: de-composition, re-composition, the centering process, and, finally, controlled structural disintegration. Each stage is based on facts, analysis of previous multidisciplinary interventions on various layers, and the inevitable reaction of the object, leading to conclusions based on innovative theoretical and practical methods that we have developed and that we believe are proper for the open-ended network, setting the rules for contemporary urban society to cluster by, thus a new urban vocabulary based on the old structure of times past. The Study: Kiryat Ono was founded 70 years ago as an agricultural settlement and rapidly turned into an urban entity. In spite of the massive intensification, the original DNA of the old small town is still deeply embedded, mostly in the quality of the public space and in the sense of clustered communities.
In the past 20 years, the growing demand for housing has been addressed at the national level with recent master plans and urban regeneration policies mostly encouraging individual economic initiatives. Unfortunately, due to the obsolete existing planning platform, the present urban renewal is characterized by developer pressure, a dramatic change in building scale, and widespread disintegration of the existing urban and social tissue. Our office was commissioned to conceptualize two master plans for the two contradictory processes of Kiryat Ono's future: intensification and conservation. Following a comprehensive investigation into the deep structures and qualities of the existing town, we developed a new vocabulary of conservation terms, thus redefining the sense of place. The main challenge was to create master plans that offer a regulatory basis for the accelerated and sporadic development, providing for the public good and preserving the characteristics of the place, consisting of a toolbox of design guidelines with the ability to reorganize space along the time axis in a sustainable way. In conclusion: The system of rules that we have developed can generate endless possible patterns, making sure that at each implementation fragment an event is created and a better place is revealed. It takes time and perseverance, but it seems to be the way to provide a healthy and sustainable framework for the accelerated urbanization of our chaotic present.
Keywords: sustainable urban design, intensification, emergent urban patterns, sustainable housing, compact urban neighborhoods, sustainable regeneration, restoration, complexity, uncertainty, need for change, implications of legislation on local planning
Procedia PDF Downloads 388
13 Continuity Through Best Practice: A Case Series of Complex Wounds Managed by a Dedicated Orthopedic Nursing Team
Authors: Siti Rahayu, Khairulniza Mohd Puat, Kesavan R., Mohammad Harris A., Jalila, Kunalan G., Fazir Mohamad
Abstract:
The greatest challenge has been establishing and maintaining the dedicated nursing team. Continuity is served when nurses are assigned exclusively to wound management, where they can continue to build expertise and skills. In addition, there is a growing incidence of chronic wounds and recognition of the complexity involved in caring for these patients. We would like to share four cases with different techniques of wound management. In the 1st case, a 39-year-old gentleman with underlying rheumatoid arthritis and chronic periprosthetic joint infection of a right total knee replacement presented with persistent drainage over the right knee. The patient was counselled for a two-stage revision total knee replacement; however, he agreed only to debridement and retention of the implant. After debridement, the large medial and lateral wounds were treated with instillation negative pressure wound therapy dressings. After several cycles, the wound size reduced, and conventional dressing was applied. In the 2nd case, a 58-year-old gentleman with underlying diabetes presented with right foot necrotizing fasciitis with gangrene of the 5th toe. He underwent extensive debridement of the foot with ray amputation of the 5th toe. Post debridement, the patient was started on instillation negative pressure wound therapy dressings. After several cycles of VAC, the wound bed was prepared, and he underwent split skin grafting over the right foot. In the 3rd case, a 60-year-old gentleman with underlying diabetes mellitus presented with right foot necrotizing soft tissue infection. He underwent ray amputation and extensive wound debridement. Upon stabilization of his general condition, the patient was discharged with regular wound dressing by the same nurse and doctor at each clinic follow-up visit. After 6 months of follow-up, the wound had healed well. In the 4th case, a 38-year-old gentleman sustained a closed fracture of the right tibial plateau in a motor vehicle accident. Open reduction and proximal tibial locking plate fixation were done.
At 2 weeks post-surgery, the patient presented with a warm, erythematous leg and pus discharge from the surgical site. Empirical antibiotics were started, and wound debridement was done. Intraoperatively, 50 cc of pus was evacuated, and unhealthy muscle and tissue were debrided. There was no loosening of the implant. The patient underwent multiple wound debridements. At 2 weeks post debridement the wound was healing well, but the proximal aspect could not be closed immediately, leaving the proximal part of the implant exposed. The patient was then put on VAC dressing for 3 weeks until healthy granulation tissue covered the implant. Meanwhile, antibiotics were changed according to culture and sensitivity. At 6 weeks after the first debridement, the wound was completely closed, and the patient was discharged home well. At 3 months postoperatively, the patient's wound and fracture had healed uneventfully, and he was able to ambulate independently. Complex wounds are too serious to be dealt with casually. Teams managing complex wounds need continuous support through the provision of educational tools for their professional development, engagement with local and international experts, as well as high-quality products that increase efficiency in services.
Keywords: VAC (vacuum assisted closure), empirical (initial) antibiotics, NPWT (negative pressure wound therapy), NF (necrotizing fasciitis), gangrene (blackish discoloration due to poor blood supply)
Procedia PDF Downloads 103
12 A Comparative Evaluation of Cognitive Load Management: Case Study of Postgraduate Business Students
Authors: Kavita Goel, Donald Winchester
Abstract:
In a world of information overload and work complexity, academics often struggle to create an online instructional environment enabling efficient and effective student learning. Research has established that students' learning styles differ; some learn faster when taught using audio and visual methods. Attributes like prior knowledge and mental effort affect their learning. Cognitive load theory posits that learners have limited processing capacity. Cognitive load depends on the learner's prior knowledge, the complexity of content and tasks, and the instructional environment. Hence, the proper allocation of cognitive resources is critical for students' learning. Consequently, a lecturer needs to understand the limits and strengths of human learning processes and the various learning styles of students, and accommodate these requirements while designing online assessments. As acknowledged in the cognitive load theory literature, visual and auditory explanations of worked examples potentially reduce cognitive load (effort) and facilitate learning when compared to conventional sequential text problem solving, helping learners utilize both subcomponents of their working memory. Instructional design changes were introduced at the case site for the delivery of the postgraduate business subjects. To make effective use of auditory and visual modalities, video-recorded lectures and key concept webinars were delivered to students. Videos were prepared to free students' limited working memory from irrelevant mental effort, as all elements on a visual screen can be viewed simultaneously and processed quickly, facilitating greater psychological processing efficiency. Most case study students in the postgraduate programs are adults, working full-time at higher management levels and studying part-time. Their learning styles and needs are different from those of other tertiary students.
The purpose of the audio and visual interventions was to lower the students' cognitive load and provide an online environment supportive of their efficient learning. These changes were expected to favourably impact the students' learning experience, academic performance, and retention. This paper posits that these instructional design changes help students integrate new knowledge into their long-term memory. A mixed-methods case study methodology was used in this investigation. Primary data were collected from interviews and surveys of students and academics. Secondary data were collected from the organisation's databases and reports. Some evidence was found that the academic performance of students improves when the new instructional design changes are introduced, although the improvement was not statistically significant. However, the overall grade distribution of students' academic performance changed and skewed higher, which indicates deeper understanding of the content. Feedback received from students identified that recorded webinars served as better learning aids than material with text alone, especially for more complex content. The recorded webinars on the subject content and assessments give students the flexibility to access this material at any time from repositories, as many times as needed, which suits their learning styles. Visual and audio information enters students' working memory more effectively. Also, as each assessment included the application of the concepts, conceptual knowledge interacted with pre-existing schemas in long-term memory and lowered students' cognitive load.
Keywords: cognitive load theory, learning style, instructional environment, working memory
Procedia PDF Downloads 142
11 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth, from a single transformed cancer cell up to a clinically apparent mass, spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of tumor development. Tumor development prognosis can now be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to computational biology research studies applicable to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. Such models aim to produce scientifically relevant results, in contrast to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to explore up to its computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improved performance by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial data in simulations.
In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for new programming techniques to speed up simulations has only begun to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up achieved by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in Java and C++ on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Poleszczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and overall parallelization technique presented here to solid tumors of specific sites such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
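To make the simulation concept concrete, the following is a minimal serial sketch of a stochastic CA tumor model (the abstract's implementations are in Java and C++; this toy Python version with illustrative parameters is not the Poleszczuk-Enderling model itself). Each occupied lattice site divides into a randomly chosen empty neighbor with some probability per step.

```python
import random

def step(grid, p_divide, rng):
    """One synchronous update of a simple stochastic CA: each occupied
    site divides into a randomly chosen empty Moore neighbour with
    probability p_divide. (Illustrative toy dynamics only.)"""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] and rng.random() < p_divide:
                # Empty neighbours, also excluding sites filled this step.
                empty = [(i + di, j + dj)
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di or dj)
                         and 0 <= i + di < n and 0 <= j + dj < n
                         and not new[i + di][j + dj]]
                if empty:
                    ei, ej = rng.choice(empty)
                    new[ei][ej] = 1
    return new

# Seed one transformed cell in the centre of a 21x21 lattice and grow.
n = 21
grid = [[0] * n for _ in range(n)]
grid[n // 2][n // 2] = 1
rng = random.Random(42)
for _ in range(25):
    grid = step(grid, 0.3, rng)
tumor_size = sum(map(sum, grid))
```

Because each step reads only the previous grid and writes a fresh one, the row loop can be partitioned into blocks and dispatched to worker threads (e.g., a Java executor / thread pool), with a synchronization barrier at the end of each step; that is the kind of parallel construct whose speed-up the paper studies.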
Procedia PDF Downloads 242
10 Family Photos as Catalysts for Writing: A Pedagogical Exercise in Visual Analysis with MA Students
Authors: Susana Barreto
Abstract:
This paper explores a pedagogical exercise that employs family photos as catalysts for teaching visual analysis and inspiring academic writing among MA students. The study aimed to achieve two primary objectives: to impart to students the skills of analyzing images or artifacts and to ignite their writing for research purposes. Conducted at Viana Polytechnic in Portugal, the exercise involved two classes of the Arts Management and Art Education Master's course, comprising approximately twenty students from diverse academic backgrounds, including Economics, Design, Fine Arts, and Sociology, among others. The exploratory exercise involved selecting an old family photo, analyzing its content and context, and deconstructing the chosen images in an intuitive and systematic manner. Students were encouraged to engage in photo elicitation, seeking insights from family and friends to gain multigenerational perspectives on the images. The feedback received from this exercise was consistently positive, largely due to the personal connection students felt with the objects of analysis. Family photos, with their emotional significance, fostered deeper engagement and motivation in the learning process. Furthermore, visually analyzing family photos stimulated critical thinking as students interpreted the composition, subject matter, and potential meanings embedded in the images. This practice enhanced their ability to comprehend complex visual representations and construct compelling visual narratives, thereby facilitating the writing process. The exercise also facilitated the identification of patterns, similarities, and differences by comparing different family photos, leading to a more comprehensive analysis of visual elements and themes. Throughout the exercise, students found analyzing their own photographs both enjoyable and insightful. They progressed through preliminary analysis, explored content and context, and artfully interwove these components.
Additionally, students experimented with various techniques such as converting photos to black and white, altering framing angles, and adjusting sizes to unveil hidden meanings. The methodology employed included observation, documental analysis of written reports, and student interviews. By including students from diverse academic backgrounds, the study enhanced its external validity, enabling a broader range of perspectives and insights during the exercise. Furthermore, encouraging students to seek multigenerational perspectives from family and friends added depth to the analysis, enriching the learning experience and broadening the understanding of the cultural and historical context associated with the family photos. Highlighting the emotional significance of these family photos and the personal connection students felt with the objects of analysis fosters a deeper connection to the subject matter. Moreover, the emphasis on stimulating critical thinking through the analysis of composition, subject matter, and potential meanings in family photos suggests a targeted approach to developing analytical skills, enhancing the overall quality of the exercise. Additionally, the inclusion of a step where students compare different family photos to identify patterns, similarities, and differences further deepens the analysis. This comparative approach adds a layer of complexity to the exercise, ultimately leading to a more comprehensive understanding of visual elements and themes. The expected results of this study will culminate in a set of practical recommendations for implementing this exercise in academic settings.
Keywords: visual analysis, academic writing, pedagogical exercise, family photos
Procedia PDF Downloads 59
9 The Development, Composition, and Implementation of Vocalises as a Method of Technical Training for the Adult Musical Theatre Singer
Authors: Casey Keenan Joiner, Shayna Tayloe
Abstract:
Classical voice training for the novice singer has long relied on the guidance and instruction of vocalise collections, such as those written and compiled by Marchesi, Lütgen, Vaccai, and Lamperti. These vocalise collections purport to encourage healthy vocal habits and instill technical longevity in both aspiring and established singers, though their scope has long been somewhat confined to the classical idiom. For pedagogues and students specializing in other vocal genres, such as musical theatre and CCM (contemporary commercial music), low-impact and pertinent vocal training aids are in short supply, and much of the suggested literature derives from classical methodology. While the tenets of healthy vocal production remain ubiquitous, specific stylistic needs and technical emphases differ from genre to genre and may require a specified extension of vocal acuity. As musical theatre continues to grow in popularity at both the professional and collegiate levels, the need for specialized training grows as well. Pedagogical literature geared specifically towards musical theatre (MT) singing and vocal production, while relatively uncommon, is readily accessible to the contemporary educator. Practitioners such as Norman Spivey, Mary Saunders Barton, Claudia Friedlander, Wendy LeBorgne, and Marci Rosenberg continue to publish relevant research in the field of musical theatre voice pedagogy and have successfully identified many common MT vocal faults, their subsequent diagnoses, and their eventual corrections. Where classical methodology would suggest specific vocalises or training exercises to maintain corrected vocal posture following successful fault diagnosis, musical theatre finds itself without a relevant body of work toward which to transition.
By analyzing the existing vocalise literature by means of a specialized set of parameters, including but not limited to melodic variation, rhythmic complexity, vowel utilization, and technical targeting, we have composed a set of vocalises meant specifically to address the training and conditioning of adult musical theatre voices. These vocalises target many pedagogical tenets of the musical theatre genre, including but not limited to thyroarytenoid-dominant production, twang resonance, lateral vowel formation, and "belt-mix." By implementing these vocalises in the musical theatre voice studio, pedagogues can efficiently communicate proper musical theatre vocal posture and kinesthetic connection to their students, regardless of age or level of experience. The composition of these vocalises serves MT pedagogues on both a technical level and a sociological one. MT is a relative newcomer to the collegiate stage, and the academization of musical theatre methodologies has been a slow and arduous process. The conflation of classical and MT techniques and training methods has long plagued the world of voice pedagogy, and teachers often find themselves in positions of "cross-training," that is, teaching students of both genres in one combined voice studio. As MT continues to establish itself on academic platforms worldwide, genre-specific literature and focused studies are both rare and invaluable. To ensure that modern students receive exacting and definitive training in their chosen fields, it becomes increasingly necessary for genres such as musical theatre to boast specified literature, and a collection of musical theatre-specific vocalises only aids in this effort. This collection of musical theatre vocalises is the first of its kind and provides genre-specific studios with a basis upon which to grow healthy, balanced voices built for the harsh conditions of the modern theatre stage.
Keywords: voice pedagogy, targeted methodology, musical theatre, singing
Procedia PDF Downloads 155
8 The Impact of the Macro-Level: Organizational Communication in Undergraduate Medical Education
Authors: Julie M. Novak, Simone K. Brennan, Lacey Brim
Abstract:
Undergraduate medical education (UME) curriculum notably addresses micro-level communication (e.g., patient-provider, intercultural, inter-professional), yet frequently under-examines the role and impact of organizational communication, a more macro level. Organizational communication, however, functions as a foundation and operates through the systemic structures of an organization, thereby serving as hidden curriculum and influencing learning experiences and outcomes. Yet little research exists that fully examines how students experience organizational communication while in medical school. Extant literature and best practices provide insufficient guidance for UME programs in particular. The purpose of this study was to map and examine current organizational communication systems and processes in a UME program. Employing a phenomenology-grounded and participatory approach, this study sought to understand the organizational communication system from medical students' perspective. The research team consisted of a core team and 13 medical student co-investigators. The research employed multiple methods, including focus groups, individual interviews, and two surveys (one reflective of the focus group questions, the other requesting students to submit examples of communications). To provide context for student responses, nonstudent participants (faculty, administrators, and staff) were sampled, as they too express concerns about communication. Over 400 students across all cohorts and 17 nonstudents participated. Data were iteratively analyzed and checked for triangulation. Findings reveal the complex nature of organizational communication and student-oriented communications. They reveal program-impactful strengths, weaknesses, gaps, and tensions, and speak to the role of organizational communication practices in influencing both climate and culture.
With regard to communications, students receive multiple simultaneous communications from multiple sources and channels, both formal (e.g., official email) and informal (e.g., social media). Students identified organizational strengths, including the desire to improve student voice and message frequency. They also identified weaknesses related to over-reliance on email, numerous platforms with inconsistent utilization, incorrect information, insufficient transparency, assessment/input fatigue, tacit expectations, scheduling/deadlines, responsiveness, and mental health confidentiality concerns. Moreover, they noted gaps related to lack of coordination/organization, ambiguous point-persons, student 'voice-only' input, open communication loops, lack of core centralization and consistency, and mental health bridges. Findings also revealed organizational identity and cultural characteristics as impactful on the medical school experience. Cultural characteristics included program size, diversity, urban setting, student organizations, community engagement, crisis framing, learning for exams, inefficient bureaucracy, and professionalism. Moreover, students identified system structures that do not always leverage cultural strengths or reduce cultural problematics. Based on the results, opportunities for productive change are identified. These include leadership visibly supporting and enacting overall organizational narratives, making greater efforts to consistently 'close the loop', regularly sharing how student input effects change, employing strategies of crisis communication more often, strengthening communication infrastructure, ensuring structures facilitate effective operations and change efforts, and highlighting change efforts in informational communication. Organizational communication and communications are not soft skills or of secondary concern within organizations; rather, they are foundational in nature and serve to educate and inform all stakeholders.
As primary stakeholders, students and their success directly affect the accomplishment of organizational goals. This study demonstrates how inquiry into the ways students navigate their educational experience extends research-based knowledge and provides actionable knowledge for the improvement of organizational operations in UME.
Keywords: medical education programs, organizational communication, participatory research, qualitative mixed methods
Procedia PDF Downloads 112
7 Successful Optimization of a Shallow Marginal Offshore Field and Its Applications
Authors: Kumar Satyam Das, Murali Raghunathan
Abstract:
This note discusses the feasibility of developing a challenging shallow offshore field in South East Asia and how its learnings can be applied to marginal field development across the world, especially in today's low-oil-price environment. The field was found to be economically challenging even during high oil prices, and the project was put on hold. Shell started a development study with the aim of significantly reducing cost through competitive scoping and reviving stranded projects. The proposed strategy involved improving per-platform recovery and reducing CAPEX. Methodology: Based on benchmarking tools such as WoodMac for similar projects in the region, and on economic affordability, a challenging target of a 50% reduction in unit development cost (UDC) was set for the project. Technical scope was defined to the minimum: a wellhead platform with minimum functionality to ensure production. The evaluation of key project decisions, such as well location and count, well design, artificial lift methods, and wellhead platform type, under different development concepts was carried out through an integrated multi-discipline approach. Key elements influencing per-platform recovery were wellhead platform (WHP) location, well count, well reach, and well productivity. Major Findings: The shallow reservoir posed challenges in well design (dog-leg severity, casing size, and the achievable step-out) and in the choice of artificial lift and sand-control methods. An integrated approach amongst the relevant disciplines, with a challenging mindset, enabled an optimized set of development decisions. This led to significant improvement in per-platform recovery. It was concluded that platform recovery largely depended on the reach of the well. The choice of a slim well design enabled high-inclination, better-productivity wells.
However, there is a trade-off between high-inclination gas lift (GL) wells and low-inclination wells in terms of long-term value, operational complexity, well reach, recovery, and uptime. Well design elements like casing size, well completion, artificial lift, and sand control were added successively over the minimum technical scope design, leading to a value and risk staircase. Logical combinations of options (slim well, GL) were competitively screened to achieve a 25% reduction in well cost. Facility cost reduction was achieved by sourcing a standardized low-cost facilities platform in combination with portfolio execution to maximize execution efficiency; this approach is expected to reduce facilities cost by ~23% with respect to the development costs. Further cost reductions were achieved by maximizing the use of existing facilities nearby, changing reliance on existing water injection wells, and utilizing an existing water injector (W.I.) platform for new injectors. Conclusion: The study provides a spectrum of technically feasible options. It also made clear that different drivers lead to different development concepts, and the cost-value trade-off staircase made this very visible. Scoping the project competitively has proven valuable for decision makers by creating a transparent view of value and the associated risks, uncertainties, and trade-offs for difficult choices: elements of a project can be competitive whilst other parts struggle, even though they contribute significant volumes. Reduction in UDC through proper scoping of present projects and benchmarking serves as a learning for the development of marginal fields across the world, especially in this low-oil-price scenario. This way of developing a field yields, on average, a 40% cost reduction for Shell projects.
Keywords: benchmarking, full field development, CAPEX, feasibility
Procedia PDF Downloads 156
6 Prospects of Acellular Organ Scaffolds for Drug Discovery
Authors: Inna Kornienko, Svetlana Guryeva, Natalia Danilova, Elena Petersen
Abstract:
Drug toxicity often goes undetected until clinical trials, the most expensive and dangerous phase of drug development. Both human cell culture and animal studies have limitations that cannot be overcome by improvements in drug testing protocols. Tissue engineering is an emerging alternative approach to creating models of human malignant tumors for experimental oncology, personalized medicine, and drug discovery studies. This new generation of bioengineered tumors provides an opportunity to control and explore the role of every component of the model system, including cell populations, supportive scaffolds, and signaling molecules. An area that could greatly benefit from these models is cancer research. Recent advances in tissue engineering have demonstrated that decellularized tissue is an excellent scaffold for tissue engineering. Decellularization of donor organs such as heart, liver, and lung can provide an acellular, naturally occurring three-dimensional biologic scaffold material that can then be seeded with selected cell populations. Preliminary studies in animal models have provided encouraging proof-of-concept results. Decellularized organs preserve the organ microenvironment, which is critical for cancer metastasis. Utilizing 3D tumor models brings the morphological characteristics of a cell culture closer to its in vivo counterpart and allows more accurate simulation of the processes within a functioning tumor and of its pathogenesis. 3D models also allow the study of migration processes and cell proliferation with higher reliability. Moreover, cancer cells in a 3D model bear closer resemblance to living conditions in terms of gene expression, cell surface receptor expression, and signaling. 2D cell monolayers do not provide the geometrical and mechanical cues of tissues in vivo and are, therefore, not suitable to accurately predict the responses of living organisms.
3D models can provide several levels of complexity, from simple monocultures of cancer cell lines in a liquid environment comprising oxygen and nutrient gradients and cell-cell interaction, to more advanced models that include co-culturing with other cell types, such as endothelial and immune cells. Following this reasoning, spheroids cultivated from one or multiple patient-derived cell lines can be utilized to seed the matrix rather than monolayer cells. This approach furthers the progress towards personalized medicine. As an initial step toward creating a new ex vivo tissue-engineered model of a cancer tumor, optimized protocols have been designed to obtain organ-specific acellular matrices and evaluate their potential as tissue-engineered scaffolds for cultures of normal and tumor cells. Decellularized biomatrix was prepared from animal kidneys, urethra, lungs, heart, and liver by two decellularization methods: perfusion in a bioreactor system and immersion-agitation on an orbital shaker, with the use of various detergents (SDS, Triton X-100) in different concentrations and freezing. Acellular scaffolds and tissue-engineered constructs have been characterized and compared using morphological methods. Models using decellularized matrix have certain advantages, such as maintaining native extracellular matrix properties and a biomimetic microenvironment for cancer cells; compatibility with multiple cell types for cell culture and drug screening; and suitability for culturing patient-derived cells in vitro to evaluate different anticancer therapeutics for developing personalized medicines.
Keywords: 3D models, decellularization, drug discovery, drug toxicity, scaffolds, spheroids, tissue engineering
5 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum β-Lactamases
Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo
Abstract:
β-lactam antibiotics include some of the most frequently used small drug molecules against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended-spectrum β-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible methods for detecting drug-resistant ESBL-producing bacteria to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields double-stranded DNA of several lengths with repetitions of the target DNA sequence as a product. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a long single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E.
coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorer V5. As a result, a target sequence of 200 nucleotides from the CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed from the target sequence in scadnano and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples to reduce costs and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected. The first was a zig-zag flat structure, while the second was a wall-like shape. Given the sequence repetitions in the scaffold sequence, both could be assembled with only six different staples each, ranging between 18 and 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis. The formation of the DNA structures was analyzed using electrophoresis and colorimetry. The modeling of novel detection methods through bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that combines LAMP products and DNA origami to detect ESBL-producing bacterial strains, representing a promising methodology for point-of-care diagnosis.
Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis
4 A Systematic Review of Literature on the Importance of Cultural Humility in Providing Optimal Palliative Care for All Persons
Authors: Roseanne Sharon Borromeo, Mariana Carvalho, Mariia Karizhenskaia
Abstract:
Healthcare providers need to comprehend cultural diversity to deliver optimal patient-centered care, especially near the end of life. Although a universal method for navigating cultural differences would be ideal, culture’s high complexity makes this strategy impossible. Cultural humility, a process of self-reflection aimed at understanding personal and systemic biases and humbly acknowledging oneself as a learner of another’s experience, generates respectful, honest, and trustworthy relationships in palliative care. This study is a systematic review of the literature on cultural humility in palliative care research and best practices. Race, religion, language, values, and beliefs can affect an individual’s access to palliative care, underscoring the importance of culture in palliative care. Cultural influences affect end-of-life care perceptions, impacting bereavement rituals, decision-making, and attitudes toward death. Cultural factors affecting the delivery of care identified in a scoping review of Canadian literature include cultural competency, cultural sensitivity, and cultural accessibility. As populations around the world become increasingly diverse and multicultural, healthcare providers have been encouraged to give culturally competent care at the bedside. Therefore, many organizations have made cultural competence training mandatory to expose professionals to the special needs and vulnerability of diverse populations. Cultural competence is easily standardized, taught, and implemented; however, this theoretically finite form of knowledge can dangerously lead to false assumptions or stereotyping, generating poor communication, loss of bonds and trust, and a poor healthcare provider-patient relationship.
In contrast, cultural humility is a dynamic process that includes self-reflection, personal critique, and growth, allowing healthcare providers to respond to these differences with an open mind, curiosity, and the awareness that one is never truly a “cultural” expert and requires lifelong learning to overcome common biases and ingrained societal influences. Cultural humility concepts include self-awareness and power imbalances. While being culturally competent requires being skilled and knowledgeable in one’s culture, being culturally humble involves the sometimes uncomfortable position of healthcare providers as students of the patient. Incorporating cultural humility emphasizes the need to approach end-of-life care with openness and responsiveness to various cultural perspectives. Thus, healthcare workers need to embrace lifelong learning about individual beliefs and values on suffering, death, and dying. There have been different approaches to this as well. Some adopt strategies for cultural humility, addressing conflicts and challenges through relational and health-system approaches. In practice and research, clinicians and researchers must embrace cultural humility to advance palliative care practices, using qualitative methods to capture culturally nuanced experiences. Cultural diversity significantly impacts patient-centered care, particularly in end-of-life contexts. Cultural factors also shape end-of-life perceptions, impacting rituals, decision-making, and attitudes toward death. Cultural humility encourages openness and acknowledges the limitations of one’s expertise in another’s culture. A consistent self-awareness and a desire to understand patients’ beliefs drive the practice of cultural humility. This dynamic process requires practitioners to learn continuously, fostering empathy and understanding.
Cultural humility enhances palliative care, ensuring it resonates genuinely across cultural backgrounds and enriches patient-provider interactions.
Keywords: cultural competency, cultural diversity, cultural humility, palliative care, self-awareness
3 Modelling Farmer’s Perception and Intention to Join Cashew Marketing Cooperatives: An Expanded Version of the Theory of Planned Behaviour
Authors: Gospel Iyioku, Jana Mazancova, Jiri Hejkrlik
Abstract:
The “Agricultural Promotion Policy (2016–2020)” represents a strategic initiative by the Nigerian government to address domestic food shortages and the challenges in exporting products at the required quality standards. Hindered by an inefficient system for setting and enforcing food quality standards, coupled with a lack of market knowledge, the Federal Ministry of Agriculture and Rural Development (FMARD) aims to enhance support for the production and activities of key crops like cashew. By collaborating with farmers, processors, investors, and stakeholders in the cashew sector, the policy seeks to define and uphold high-quality standards across the cashew value chain. Given the challenges and opportunities faced by Nigerian cashew farmers, active participation in cashew marketing groups becomes imperative. These groups serve as essential platforms for farmers to collectively navigate market intricacies, access resources, share knowledge, improve output quality, and bolster their overall bargaining power. Through engagement in these cooperative initiatives, farmers not only boost their economic prospects but can also contribute significantly to the sustainable growth of the cashew industry, fostering resilience and community development. This study explores the perceptions and intentions of farmers regarding their involvement in cashew marketing cooperatives, utilizing an expanded version of the Theory of Planned Behaviour. Drawing insights from a diverse sample of 321 cashew farmers in Southwest Nigeria, the research sheds light on the factors influencing decision-making in cooperative participation. The demographic analysis reveals a diverse landscape, with a substantial presence of middle-aged individuals in the agricultural sector and with cashew-related activities emerging as the primary income source for a sizeable proportion of respondents (23.99%).
Employing Structural Equation Modelling (SEM) with Maximum Likelihood Robust (MLR) estimation in R, the research elucidates the associations among latent variables. Despite the model’s complexity, the goodness-of-fit indices attest to the validity of the structural model, explaining approximately 40% of the variance in the intention to join cooperatives. Moral norms emerge as a pivotal construct, highlighting the profound influence of ethical considerations in decision-making processes, while perceived behavioural control presents potential challenges in active participation. Attitudes toward joining cooperatives reveal nuanced perspectives, with strong beliefs in enhanced connections with other farmers but varying perceptions on improved access to essential information. The SEM analysis establishes positive and significant effects of moral norms, perceived behavioural control, subjective norms, and attitudes on farmers’ intention to join cooperatives. The knowledge construct positively affects key factors influencing intention, emphasizing the importance of informed decision-making. A supplementary analysis using partial least squares (PLS) SEM corroborates the robustness of our findings, aligning with covariance-based SEM results. This research unveils the determinants of cooperative participation and provides valuable insights for policymakers and practitioners aiming to empower and support this vital demographic in the cashew industry.
Keywords: marketing cooperatives, theory of planned behaviour, structural equation modelling, cashew farmers