Search results for: Kolmogorov complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1665

15 Discovering Causal Structure from Observations: The Relationships between Technophile Attitude, Users' Value and Use Intention of a Mobility Management Travel App

Authors: Aliasghar Mehdizadeh Dastjerdi, Francisco Camara Pereira

Abstract:

The increasing complexity of and demand for transport services strain transportation systems, especially in urban areas with limited possibilities for building new infrastructure. Meeting this challenge requires changes in travel behavior. One proposed means of inducing such change is the multimodal travel app. This paper describes a study of the intention to use a real-time multimodal travel app aimed at motivating travel behavior change toward sustainable transport options in the Greater Copenhagen Region (Denmark). The proposed app is a multi-faceted smartphone app including both travel information and persuasive strategies such as health and environmental feedback, tailored travel options, self-monitoring, tunneling users toward green behavior, social networking, nudging, and gamification elements. The potential of mobility management travel apps to stimulate sustainable mobility rests not only on the original and proper employment of behavior change strategies, but also on explicitly anchoring them in established constructs from behavioral theories. This theoretical foundation is important because it positively and significantly influences the effectiveness of the system. However, there is a gap in current knowledge regarding mobility management travel apps grounded in behavioral theories, which should be explored further. This study addresses this gap through an examination based on social cognitive theory. In contrast to conventional methods in technology adoption research, it adopts a reverse approach in which the associations between theoretical constructs are discovered by the Max-Min Hill-Climbing (MMHC) algorithm, a hybrid causal discovery method. A technology-use preference survey was designed to collect data. The survey elicited several groups of variables: (1) three groups of user motives for using the app, namely gain motives (e.g., saving travel time and cost), hedonic motives (e.g., enjoyment) and normative motives (e.g., lower travel-related CO2 production); (2) technology-related self-concept (i.e., technophile attitude); and (3) use intention of the travel app. The questionnaire items provided the variables for causal discovery, that is, for learning the causal structure of the data. Discovering causal relationships from observational data is a critical challenge with applications in many research fields. The estimated causal structure shows that the two constructs of gain motives and technophilia have a causal effect on adoption intention. Likewise, there is a causal relationship from technophilia to both gain and hedonic motives. In line with prior studies, this highlights the importance of the functional value of the travel app and of technology self-concept as two important variables for adoption intention. Furthermore, the results indicate the effect of technophile attitude on developing gain and hedonic motives. The causal structure shows hierarchical associations between the three groups of user motives. They can be explained by the "frustration-regression" principle of Alderfer's ERG (Existence, Relatedness and Growth) theory of needs: when a higher-level need remains unfulfilled, a person may regress to lower-level needs that appear easier to satisfy. In conclusion, this study shows the capability of causal discovery methods to learn the causal structure of a theoretical model and, accordingly, to interpret the established associations.
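As an illustration of the score-based search that underlies hybrid methods such as MMHC (this is not the authors' implementation, which additionally uses conditional-independence tests to constrain the search), the following self-contained Python sketch greedily adds edges to a DAG while a BIC score for linear-Gaussian local models improves. The construct names and toy data are hypothetical.

```python
# Illustrative greedy hill-climbing over DAG structures, scored by BIC for
# linear-Gaussian local models. Only edge additions are considered, which is a
# simplification of full hill-climbing (no deletions/reversals) and of MMHC
# (no constraint-based skeleton). Data and variable names are hypothetical.
import numpy as np
import pandas as pd
from itertools import permutations

def bic_score(data, child, parents):
    """BIC of the local model child ~ parents (linear-Gaussian)."""
    y = data[child].to_numpy()
    n = len(y)
    X = np.column_stack([np.ones(n)] + [data[p].to_numpy() for p in parents])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = max(resid @ resid / n, 1e-12)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return loglik - 0.5 * (X.shape[1] + 1) * np.log(n)  # parameter penalty

def creates_cycle(edges, u, v):
    """True if adding u -> v closes a directed cycle (a path v ~> u exists)."""
    stack, seen = [v], set()
    while stack:
        node = stack.pop()
        if node == u:
            return True
        seen.add(node)
        stack.extend(w for (x, w) in edges if x == node and w not in seen)
    return False

def hill_climb(data):
    nodes = list(data.columns)
    edges, parents = set(), {n: [] for n in nodes}
    while True:
        best_gain, best_edge = 0.0, None
        for u, v in permutations(nodes, 2):
            if (u, v) in edges or creates_cycle(edges, u, v):
                continue
            gain = bic_score(data, v, parents[v] + [u]) - bic_score(data, v, parents[v])
            if gain > best_gain:
                best_gain, best_edge = gain, (u, v)
        if best_edge is None:
            return edges  # no single edge addition improves the score
        u, v = best_edge
        edges.add(best_edge)
        parents[v].append(u)

# Toy data loosely mimicking the constructs discussed in the abstract.
rng = np.random.default_rng(0)
n = 500
technophilia = rng.normal(size=n)
gain_motive = 0.8 * technophilia + rng.normal(scale=0.5, size=n)
intention = 0.6 * gain_motive + 0.4 * technophilia + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"technophilia": technophilia, "gain_motive": gain_motive,
                   "intention": intention})
print(hill_climb(df))
```

Because a purely score-based search can only identify structure up to Markov equivalence, hybrid methods such as MMHC first restrict the candidate edges with conditional-independence tests and then run this kind of scored search over the restricted skeleton.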

Keywords: travel app, behavior change, persuasive technology, travel information, causality

Procedia PDF Downloads 112
14 Sustainable Urban Regeneration: The New Vocabulary and the Timeless Grammar of the Urban Tissue

Authors: Ruth Shapira

Abstract:

Introduction: The rapid urbanization of the last century confronts planners, regulatory bodies, developers and, most of all, the public with seemingly unsolved conflicts regarding values, capital, and the wellbeing of the built and un-built urban space. The scale of the urban form and the rhythm of urban life are changing out of control, and despite the ever-growing urban population there has been no significant progress on these questions in the last two to three decades. The objective of this paper is to analyze some of these fundamental issues through the case study of a relatively small town in the center of Israel (Kiryat Ono, 36,000 inhabitants), to unfold the deep structure of qualities versus disruptors, to present some of the cures we have developed to bridge the gap, and to suggest, humbly, a practice that may bring about a sustainable new urban environment based on timeless values of the past, an approach that can be generic for similar cases. Basic methodology: The object, the town of Kiryat Ono, is examined through a series of four action processes: de-composition, re-composition, the centering process and, finally, controlled structural disintegration. Each stage is based on facts, on analysis of previous multidisciplinary interventions on various layers, and on the inevitable reaction of the object, leading to conclusions based on innovative theoretical and practical methods that we have developed and that we believe are appropriate for the open-ended network, setting the rules by which contemporary urban society can cluster: a new urban vocabulary based on the old structure of times past. The study: Kiryat Ono was founded 70 years ago as an agricultural settlement and rapidly turned into an urban entity. In spite of the massive intensification, the original DNA of the old small town remained deeply embedded, mostly in the quality of the public space and in the sense of clustered communities. In the past 20 years, the demand for housing has been addressed at the national level with recent master plans and urban regeneration policies that mostly encourage individual economic initiatives. Unfortunately, due to the obsolete existing planning platform, the present urban renewal is characterized by developer pressure, a dramatic change in building scale, and widespread disintegration of the existing urban and social tissue. Our office was commissioned to conceptualize two master plans for the two contradictory processes of Kiryat Ono's future: intensification and conservation. Following a comprehensive investigation into the deep structures and qualities of the existing town, we developed a new vocabulary of conservation terms, thus redefining the sense of place. The main challenge was to create master plans that offer a regulatory basis for the accelerated and sporadic development while providing for the public good and preserving the characteristics of the place: a toolbox of design guidelines with the ability to reorganize space along the time axis in a sustainable way. In conclusion: the system of rules that we have developed can generate endless possible patterns, ensuring that with each implemented fragment an event is created and a better place is revealed. It takes time and perseverance, but it seems to be the way to provide a healthy and sustainable framework for the accelerated urbanization of our chaotic present.

Keywords: sustainable urban design, intensification, emergent urban patterns, sustainable housing, compact urban neighborhoods, sustainable regeneration, restoration, complexity, uncertainty, need for change, implications of legislation on local planning

Procedia PDF Downloads 367
13 Continuity through Best Practice: A Case Series of Complex Wounds Managed by a Dedicated Orthopedic Nursing Team

Authors: Siti Rahayu, Khairulniza Mohd Puat, Kesavan R., Mohammad Harris A., Jalila, Kunalan G., Fazir Mohamad

Abstract:

The greatest challenge has been in establishing and maintaining the dedicated nursing team. Continuity is served when nurses are assigned exclusively to wound management, where they can continue to build expertise and skills. In addition, there is a growing incidence of chronic wounds and increasing recognition of the complexity involved in caring for these patients. We would like to share four cases with different techniques of wound management. In the first case, a 39-year-old gentleman with underlying rheumatoid arthritis and chronic periprosthetic joint infection of a right total knee replacement presented with persistent drainage over the right knee. A two-stage revision total knee replacement was proposed; however, the patient agreed only to debridement and retention of the implant. After debridement, the large medial and lateral wounds were treated with instillation negative pressure wound therapy (NPWT) dressings. After several cycles, the wound size reduced and conventional dressings were applied. In the second case, a 58-year-old gentleman with underlying diabetes presented with right foot necrotizing fasciitis and gangrene of the fifth toe. He underwent extensive debridement of the foot with ray amputation of the fifth toe. Post-debridement, the patient was started on instillation NPWT dressings. After several cycles of VAC, the wound bed was prepared and he underwent a split skin graft over the right foot. In the third case, a 60-year-old gentleman with underlying diabetes mellitus presented with a right foot necrotizing soft tissue infection. He underwent ray amputation and extensive wound debridement. Upon stabilization of his general condition, the patient was discharged with regular wound dressings performed by the same nurse and doctor at each clinic follow-up visit. After 6 months of follow-up, the wound had healed well. In the fourth case, a 38-year-old gentleman was involved in a motor vehicle accident and sustained a closed fracture of the right tibial plateau. Open reduction and fixation with a proximal tibial locking plate were performed. At 2 weeks post-surgery, the patient presented with a warm, erythematous leg and pus discharge from the surgical site. Empirical antibiotics were started and wound debridement was performed. Intraoperatively, 50 cc of pus was evacuated and unhealthy muscle and tissue were debrided; there was no loosening of the implant. The patient underwent multiple wound debridements. At 2 weeks post-debridement the wound was healing well, but the proximal aspect could not be closed immediately, leaving the proximal part of the implant exposed. The patient was then put on VAC dressings for 3 weeks until healthy granulation tissue covered the implant. Meanwhile, antibiotics were changed according to culture and sensitivity results. At 6 weeks after the first debridement, the wound was completely closed and the patient was discharged home well. At 3 months postoperatively, the wound and fracture had healed uneventfully and the patient was able to ambulate independently. Complex wounds are too serious to be managed without dedicated support. Teams managing complex wounds need continuous support through the provision of educational tools for their professional development, engagement with local and international experts, as well as high-quality products that increase efficiency in services.

Keywords: VAC (vacuum assisted closure), empirical (initial) antibiotics, NPWT (negative pressure wound therapy), NF (necrotizing fasciitis), gangrene (blackish discoloration due to poor blood supply)

Procedia PDF Downloads 81
12 A Comparative Evaluation of Cognitive Load Management: Case Study of Postgraduate Business Students

Authors: Kavita Goel, Donald Winchester

Abstract:

In a world of information overload and work complexity, academics often struggle to create an online instructional environment that enables efficient and effective student learning. Research has established that students' learning styles differ; some learn faster when taught using audio and visual methods. Attributes such as prior knowledge and mental effort affect their learning. Cognitive load theory holds that learners have limited processing capacity. Cognitive load depends on the learner's prior knowledge, the complexity of content and tasks, and the instructional environment. Hence, the proper allocation of cognitive resources is critical for students' learning. Consequently, a lecturer needs to understand the limits and strengths of human learning processes and the various learning styles of students, and accommodate these requirements when designing online assessments. As acknowledged in the cognitive load theory literature, visual and auditory explanations of worked examples potentially reduce cognitive load (effort) and facilitate learning when compared with conventional sequential text problem solving, because they let learners use both subcomponents of working memory. Instructional design changes were introduced at the case site for the delivery of postgraduate business subjects. To make effective use of auditory and visual modalities, video-recorded lectures and key-concept webinars were delivered to students. Videos were prepared to free students' limited working memory from irrelevant mental effort, as all elements on a visual screen can be viewed simultaneously and processed quickly, facilitating greater processing efficiency. Most case study students in the postgraduate programs are adults, working full-time at higher management levels and studying part-time. Their learning styles and needs differ from those of other tertiary students. The purpose of the audio and visual interventions was to lower students' cognitive load and provide an online environment supportive of their efficient learning. These changes were expected to favourably impact students' learning experience, academic performance, and retention. This paper posits that these changes to instructional design help students integrate new knowledge into their long-term memory. A mixed-methods case study methodology was used in this investigation. Primary data were collected from interviews and surveys of students and academics. Secondary data were collected from the organisation's databases and reports. Some evidence was found that the academic performance of students improves when the new instructional design changes are introduced, although the improvement was not statistically significant. However, the overall grade distribution shifted higher, which suggests deeper understanding of the content. Feedback from students indicated that recorded webinars served as better learning aids than text-only material, especially for more complex content. The recorded webinars on subject content and assessments give students the flexibility to access the material from repositories at any time and as often as needed, which suits their learning styles. Visual and audio information enters students' working memory more effectively. Also, as each assessment included the application of concepts, conceptual knowledge interacted with pre-existing schemas in long-term memory and lowered students' cognitive load.

Keywords: cognitive load theory, learning style, instructional environment, working memory

Procedia PDF Downloads 118
11 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming

Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero

Abstract:

Tumor growth from a single transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of tumor development. If we develop appropriate CA-based software tools, tumor development prognosis can be made without subjecting patients to tiresome medical examinations or painful invasive procedures. In silico testing mainly refers to computational biology studies applied to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time compared with carrying out experiments in vitro in labs or in vivo with living cells and organisms. Such models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic and stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the research community. Stochastic cellular automata (SCA), whose parallel programming implementations can yield high computational performance, are of much interest to explore up to their computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improved performance through guaranteed efficient memory access or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial data in simulations. In our opinion, these optimizations are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for developing new programming techniques that speed up simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up obtained with specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is validated using Java and C++ implementations on two different platforms: an Intel Core i-X chipset and an HPC cluster at our university. The parallelization proposed here of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific types such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
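As a language-agnostic illustration of the domain decomposition and per-generation synchronization described here (the authors' implementations are in Java and C++ with thread-pool executors; this Python sketch uses a process pool instead, and the growth rule and parameters are simplified assumptions, not the Polesczuk and Enderling model):

```python
# Illustrative parallel stochastic CA: the lattice is split into horizontal
# bands, each band (plus ghost rows) is updated by a worker process, and the
# bands are stitched back together at a per-generation synchronization point.
# The growth rule and parameters below are simplified assumptions.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

P_DIVIDE = 0.2  # per-neighbour colonisation probability (assumed value)

def update_band(args):
    """Return the new state of one band, given the old band plus ghost rows."""
    band_with_halo, seed = args
    rng = np.random.default_rng(seed)
    rows, cols = band_with_halo.shape
    new_band = band_with_halo[1:rows - 1].copy()
    for i in range(1, rows - 1):
        for j in range(cols):
            if band_with_halo[i, j] == 1:
                continue                      # occupied sites stay occupied
            neigh = band_with_halo[i - 1:i + 2, max(j - 1, 0):j + 2]
            occupied = int(neigh.sum())       # centre is empty, so neighbours only
            # an empty site is colonised with probability 1 - (1 - p)^occupied
            if occupied and rng.random() < 1.0 - (1.0 - P_DIVIDE) ** occupied:
                new_band[i - 1, j] = 1
    return new_band

def step(grid, gen, n_workers=4):
    """One synchronous generation computed by a pool of worker processes."""
    bands = np.array_split(np.arange(grid.shape[0]), n_workers)
    padded = np.pad(grid, ((1, 1), (0, 0)))   # ghost rows at top and bottom
    tasks = [(padded[b[0]:b[-1] + 3, :], gen * n_workers + k)
             for k, b in enumerate(bands)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return np.vstack(list(pool.map(update_band, tasks)))

if __name__ == "__main__":
    grid = np.zeros((200, 200), dtype=np.int8)
    grid[100, 100] = 1                        # a single transformed cell
    for gen in range(50):
        grid = step(grid, gen)
    print("tumour cells after 50 generations:", int(grid.sum()))
```

Because each cell's new state depends only on the previous generation, the bands can be updated independently and joined again at the synchronization barrier, which is the same pattern an executor-based Java implementation would exploit.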

Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up

Procedia PDF Downloads 212
10 Family Photos as Catalysts for Writing: A Pedagogical Exercise in Visual Analysis with MA Students

Authors: Susana Barreto

Abstract:

This paper explores a pedagogical exercise that employs family photos as catalysts for teaching visual analysis and inspiring academic writing among MA students. The study had two primary objectives: to equip students with the skills to analyze images or artifacts and to spark their writing for research purposes. Conducted at Viana Polytechnic in Portugal, the exercise involved two classes of the Arts Management and Art Education master's course, comprising approximately twenty students from diverse academic backgrounds, including Economics, Design, Fine Arts, and Sociology, among others. The exploratory exercise involved selecting an old family photo, analyzing its content and context, and deconstructing the chosen images in an intuitive and systematic manner. Students were encouraged to engage in photo elicitation, seeking insights from family and friends to gain multigenerational perspectives on the images. The feedback received from this exercise was consistently positive, largely due to the personal connection students felt with the objects of analysis. Family photos, with their emotional significance, fostered deeper engagement and motivation in the learning process. Furthermore, visually analysing family photos stimulated critical thinking as students interpreted the composition, subject matter, and potential meanings embedded in the images. This practice enhanced their ability to comprehend complex visual representations and construct compelling visual narratives, thereby facilitating the writing process. The exercise also facilitated the identification of patterns, similarities, and differences by comparing different family photos, leading to a more comprehensive analysis of visual elements and themes. Throughout the exercise, students found analyzing their own photographs both enjoyable and insightful. They progressed through preliminary analysis, explored content and context, and artfully interwove these components. Additionally, students experimented with various techniques, such as converting photos to black and white, altering framing angles, and adjusting sizes, to unveil hidden meanings. The methodology employed included observation, documental analysis of written reports, and student interviews. Including students from diverse academic backgrounds enhanced the study's external validity, enabling a broader range of perspectives and insights during the exercise. Furthermore, encouraging students to seek multigenerational perspectives from family and friends added depth to the analysis, enriching the learning experience and broadening understanding of the cultural and historical context associated with the family photos. Highlighting the emotional significance of these photos and the personal connection students felt with the objects of analysis fosters a deeper connection to the subject matter. Moreover, the emphasis on stimulating critical thinking through the analysis of composition, subject matter, and potential meanings reflects a targeted approach to developing analytical skills, specifically critical thinking and visual analysis, enhancing the overall quality of the exercise. Additionally, the step in which students compare different family photos to identify patterns, similarities, and differences adds a layer of complexity to the exercise, ultimately leading to a more comprehensive understanding of visual elements and themes.
The expected results of this study will culminate in a set of practical recommendations for implementing this exercise in academic settings.

Keywords: visual analysis, academic writing, pedagogical exercise, family photos

Procedia PDF Downloads 32
9 The Development, Composition, and Implementation of Vocalises as a Method of Technical Training for the Adult Musical Theatre Singer

Authors: Casey Keenan Joiner, Shayna Tayloe

Abstract:

Classical voice training for the novice singer has long relied on the guidance and instruction of vocalise collections, such as those written and compiled by Marchesi, Lütgen, Vaccai, and Lamperti. These vocalise collections purport to encourage healthy vocal habits and instill technical longevity in both aspiring and established singers, though their scope has long been somewhat confined to the classical idiom. For pedagogues and students specializing in other vocal genres, such as musical theatre and CCM (contemporary commercial music), low-impact and pertinent vocal training aids are in short supply, and much of the suggested literature derives from classical methodology. While the tenets of healthy vocal production remain ubiquitous, specific stylistic needs and technical emphases differ from genre to genre and may require a specified extension of vocal acuity. As musical theatre continues to grow in popularity at both the professional and collegiate levels, the need for specialized training grows as well. Pedagogical literature geared specifically towards musical theatre (MT) singing and vocal production, while relatively uncommon, is readily accessible to the contemporary educator. Practitioners such as Norman Spivey, Mary Saunders Barton, Claudia Friedlander, Wendy Leborgne, and Marci Rosenberg continue to publish relevant research in the field of musical theatre voice pedagogy and have successfully identified many common MT vocal faults, their subsequent diagnoses, and their eventual corrections. Where classical methodology would suggest specific vocalises or training exercises to maintain corrected vocal posture following successful fault diagnosis, musical theatre finds itself without a relevant body of work towards which to transition. By analyzing the existing vocalise literature by means of a specialized set of parameters, including but not limited to melodic variation, rhythmic complexity, vowel utilization, and technical targeting, we have composed a set of vocalises meant specifically to address the training and conditioning of adult musical theatre voices. These vocalises target many pedagogical tenets of the musical theatre genre, including but not limited to thyroarytenoid-dominant production, twang resonance, lateral vowel formation, and "belt-mix." By implementing these vocalises in the musical theatre voice studio, pedagogues can efficiently communicate proper musical theatre vocal posture and kinesthetic connection to their students, regardless of age or level of experience. The composition of these vocalises serves MT pedagogues on both a technical and a sociological level. MT is a relative newcomer on the collegiate stage, and the academization of musical theatre methodologies has been a slow and arduous process. The conflation of classical and MT techniques and training methods has long plagued the world of voice pedagogy, and teachers often find themselves in positions of "cross-training," that is, teaching students of both genres in one combined voice studio. As MT continues to establish itself on academic platforms worldwide, genre-specific literature and focused studies are both rare and invaluable. To ensure that modern students receive exacting and definitive training in their chosen fields, it becomes increasingly necessary for genres such as musical theatre to boast specified literature, and a collection of musical theatre-specific vocalises only aids in this effort.
This collection of musical theatre vocalises is the first of its kind and provides genre-specific studios with a basis upon which to grow healthy, balanced voices built for the harsh conditions of the modern theatre stage.

Keywords: voice pedagogy, targeted methodology, musical theatre, singing

Procedia PDF Downloads 127
8 Successful Optimization of a Shallow Marginal Offshore Field and Its Applications

Authors: Kumar Satyam Das, Murali Raghunathan

Abstract:

This note discusses the feasibility of developing a challenging shallow offshore field in South East Asia and how its learnings can be applied to marginal field development across the world, especially in today's low-oil-price environment. The field was found to be economically challenging even during high oil prices, and the project was put on hold. Shell started a development study with the aim of significantly reducing cost through competitive scoping and reviving stranded projects. The proposed strategy involved improving per-platform recovery and reducing CAPEX. Methodology: Based on benchmarking tools such as Woodmac for similar projects in the region and on economic affordability, a challenging target of a 50% reduction in unit development cost (UDC) was set for the project. The technical scope was defined minimally: a wellhead platform with the minimum functionality needed to ensure production. Key project decisions, such as well location and count, well design, artificial lift method, and wellhead platform type, were evaluated under different development concepts through an integrated multi-discipline approach. The key elements influencing per-platform recovery were wellhead platform (WHP) location, well count, well reach, and well productivity. Major findings: The shallow reservoir posed challenges in well design (dog-leg severity, casing size, and achievable step-out) and in the choice of artificial lift and sand-control methods. An integrated approach across the relevant disciplines, with a challenging mindset, enabled an optimized set of development decisions and led to a significant improvement in per-platform recovery. It was concluded that platform recovery largely depends on the reach of the wells. The choice of a slim well design enabled high-inclination, more productive wells. However, there is a trade-off between high-inclination gas lift (GL) wells and low-inclination wells in terms of long-term value, operational complexity, well reach, recovery, and uptime. Well design elements such as casing size, well completion, artificial lift, and sand control were added successively over the minimum technical scope, creating a value and risk staircase. Logical combinations of options (slim well, GL) were competitively screened to achieve a 25% reduction in well cost. Facility cost reduction was achieved by sourcing a standardized low-cost facilities platform in combination with portfolio execution to maximize execution efficiency; this approach is expected to reduce facilities cost by ~23% with respect to the development costs. Further cost reductions were achieved by maximizing the use of existing facilities nearby, changing the reliance on existing water injection wells, and utilizing the existing water injector (W.I.) platform for new injectors. Conclusion: The study provides a spectrum of technically feasible options. It also made clear that different drivers lead to different development concepts, and the cost-value trade-off staircase made this very visible. Scoping the project competitively has proven valuable for decision makers by creating a transparent view of the value and the associated risks, uncertainties, and trade-offs for difficult choices: some elements of the project can be competitive, whilst other parts will struggle even though they contribute significant volumes. Reducing UDC through proper scoping and benchmarking of present projects serves as a lesson for developing marginal fields across the world, especially in this low-oil-price scenario. This way of developing a field has delivered an average cost reduction of 40% for the Shell projects.

Keywords: benchmarking, full field development, CAPEX, feasibility

Procedia PDF Downloads 127
7 Prospects of Acellular Organ Scaffolds for Drug Discovery

Authors: Inna Kornienko, Svetlana Guryeva, Natalia Danilova, Elena Petersen

Abstract:

Drug toxicity often goes undetected until clinical trials, the most expensive and dangerous phase of drug development. Both human cell culture and animal studies have limitations that cannot be overcome by improvements in drug testing protocols. Tissue engineering is an emerging alternative approach to creating models of human malignant tumors for experimental oncology, personalized medicine, and drug discovery studies. This new generation of bioengineered tumors provides an opportunity to control and explore the role of every component of the model system, including cell populations, supportive scaffolds, and signaling molecules. An area that could greatly benefit from these models is cancer research. Recent advances in tissue engineering have demonstrated that decellularized tissue is an excellent scaffold for tissue engineering. Decellularization of donor organs such as heart, liver, and lung can provide an acellular, naturally occurring three-dimensional biologic scaffold material that can then be seeded with selected cell populations. Preliminary studies in animal models have provided encouraging proof-of-concept results. Decellularized organs preserve the organ microenvironment, which is critical for cancer metastasis. Utilizing 3D tumor models brings the morphological characteristics of the cultured cells closer to their in vivo counterparts and allows more accurate simulation of the processes within a functioning tumor and its pathogenesis. 3D models also allow the study of migration processes and cell proliferation with higher reliability. Moreover, cancer cells in a 3D model bear closer resemblance to living conditions in terms of gene expression, cell surface receptor expression, and signaling. 2D cell monolayers do not provide the geometrical and mechanical cues of tissues in vivo and are, therefore, not suitable for accurately predicting the responses of living organisms. 3D models can provide several levels of complexity, from simple monocultures of cancer cell lines in a liquid environment with oxygen and nutrient gradients and cell-cell interaction, to more advanced models that include co-culturing with other cell types, such as endothelial and immune cells. Following this reasoning, spheroids cultivated from one or multiple patient-derived cell lines can be used to seed the matrix rather than monolayer cells. This approach furthers the progress towards personalized medicine. As an initial step toward creating a new ex vivo tissue-engineered model of a cancer tumor, optimized protocols have been designed to obtain organ-specific acellular matrices and evaluate their potential as tissue-engineered scaffolds for cultures of normal and tumor cells. Decellularized biomatrix was prepared from animal kidneys, urethra, lungs, heart, and liver by two decellularization methods: perfusion in a bioreactor system and immersion-agitation on an orbital shaker, with the use of various detergents (SDS, Triton X-100) in different concentrations and freezing. Acellular scaffolds and tissue-engineered constructs have been characterized and compared using morphological methods. Models using decellularized matrix have certain advantages, such as maintaining native extracellular matrix properties and a biomimetic microenvironment for cancer cells; compatibility with multiple cell types for cell culture and drug screening; and the possibility of culturing patient-derived cells in vitro to evaluate different anticancer therapeutics for developing personalized medicines.

Keywords: 3D models, decellularization, drug discovery, drug toxicity, scaffolds, spheroids, tissue engineering

Procedia PDF Downloads 270
6 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum β-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

The β-lactam group of antibiotics includes some of the most frequently used small-molecule drugs against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended spectrum β-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible detection methods for drug-resistant ESBL-producing bacteria to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment that is not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields, as a product, double-stranded DNA of several lengths containing repetitions of the target DNA sequence. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a long single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorer V5. As a result, a target sequence of 200 nucleotides from the CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed from the target sequence in SDCadnano and analyzed with CanDo to evaluate the stability of the 3D structures. The designs minimized the total number of staples to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected: a zig-zag flat structure and a wall-like shape. Given the sequence repetitions in the scaffold, each could be assembled with only six different staples, ranging from 18 to 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis, and the formation of the DNA structures was analyzed using electrophoresis and colorimetry. Modeling novel detection methods with bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that uses LAMP products and DNA origami in combination to detect ESBL-producing bacterial strains, which represents a promising methodology for point-of-care diagnosis.
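As a purely illustrative aside (not part of the authors' workflow, which relied on SDCadnano and CanDo), the hybridisation idea behind scaffold-staple design can be sketched in a few lines of Python: a staple binds where its reverse complement appears in the scaffold. The sequences below are invented for the example.

```python
# Toy check of scaffold/staple complementarity: a staple binds where its
# reverse complement occurs in the scaffold. Sequences are invented examples.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def longest_binding_site(scaffold: str, staple: str, min_match: int = 8):
    """First position of the longest stretch (>= min_match nt) where the
    staple is reverse-complementary to the scaffold, or None."""
    target = reverse_complement(staple)
    for k in range(len(target), min_match - 1, -1):       # longest stretch first
        for start in range(len(target) - k + 1):
            pos = scaffold.find(target[start:start + k])
            if pos != -1:
                return pos, target[start:start + k]
    return None

scaffold = "ATGCGTACGTTAGCCTAGGCTTAACGGATCCGTACGATCG"   # 40-nt toy fragment
staple = reverse_complement(scaffold[5:23])              # 18-nt candidate staple
print(longest_binding_site(scaffold, staple))            # -> (5, 'TACGTTAGCCTAGGCTTA')
```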

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 185
5 A Systematic Review of Literature on the Importance of Cultural Humility in Providing Optimal Palliative Care for All Persons

Authors: Roseanne Sharon Borromeo, Mariana Carvalho, Mariia Karizhenskaia

Abstract:

Healthcare providers need to understand cultural diversity to deliver optimal patient-centered care, especially near the end of life. Although a universal method for navigating cultural differences would be ideal, the high complexity of culture makes this strategy impossible. Cultural humility, a process of self-reflection to understand personal and systemic biases and to humbly acknowledge oneself as a learner when it comes to understanding another's experience, leads to a meaningful process in palliative care, generating respectful, honest, and trustworthy relationships. This study is a systematic review of the literature on cultural humility in palliative care research and best practices. Race, religion, language, values, and beliefs can affect an individual's access to palliative care, underscoring the importance of culture in palliative care. Cultural influences affect end-of-life care perceptions, impacting bereavement rituals, decision-making, and attitudes toward death. Cultural factors affecting the delivery of care identified in a scoping review of Canadian literature include cultural competency, cultural sensitivity, and cultural accessibility. As different parts of the world become exponentially more diverse and multicultural, healthcare providers have been encouraged to give culturally competent care at the bedside. Therefore, many organizations have made cultural competence training mandatory to expose professionals to the special needs and vulnerability of diverse populations. Cultural competence is easily standardized, taught, and implemented; however, this theoretically finite form of knowledge can dangerously lead to false assumptions or stereotyping, generating poor communication, loss of bonds and trust, and poor healthcare provider-patient relationships. In contrast, cultural humility is a dynamic process that includes self-reflection, personal critique, and growth, allowing healthcare providers to respond to these differences with an open mind, curiosity, and the awareness that one is never truly a "cultural" expert and requires life-long learning to overcome common biases and ingrained societal influences. Cultural humility concepts include self-awareness and power imbalances. While being culturally competent requires being skilled and knowledgeable in one's culture, being culturally humble involves the sometimes uncomfortable position of healthcare providers as students of the patient. Incorporating cultural humility emphasizes the need to approach end-of-life care with openness and responsiveness to various cultural perspectives. Thus, healthcare workers need to embrace lifelong learning about individual beliefs and values on suffering, death, and dying. There have been different approaches to this as well: some adopt strategies for cultural humility, addressing conflicts and challenges through relational and health-system approaches. In practice and research, clinicians and researchers must embrace cultural humility to advance palliative care practices, using qualitative methods to capture culturally nuanced experiences. Cultural diversity significantly impacts patient-centered care, particularly in end-of-life contexts. Cultural factors also shape end-of-life perceptions, impacting rituals, decision-making, and attitudes toward death. Cultural humility encourages openness and acknowledges the limits of expertise in one's own culture. A consistent self-awareness and a desire to understand patients' beliefs drive the practice of cultural humility.
This dynamic process requires practitioners to learn continuously, fostering empathy and understanding. Cultural humility enhances palliative care, ensuring it resonates genuinely across cultural backgrounds and enriches patient-provider interactions.

Keywords: cultural competency, cultural diversity, cultural humility, palliative care, self-awareness

Procedia PDF Downloads 35
4 Experiences and Perceptions of the Barriers and Facilitators of Continence Care Provision in Residential and Nursing Homes for Older Adults: A Systematic Evidence Synthesis and Qualitative Exploration

Authors: Jennifer Wheeldon, Nick de Viggiani, Nikki Cotterill

Abstract:

Background: Urinary and fecal incontinence affect a significant proportion of older adults aged 65 and over who permanently reside in residential and nursing home facilities. Incontinence symptoms have been linked to comorbidities, an increased risk of infection, and reduced quality of life and mental wellbeing of residents. However, continence care provision can often be poor, further compromising the health and wellbeing of this vulnerable population. Objectives: To identify experiences and perceptions of continence care provision in older adult residential care settings and to identify factors that help or hinder good continence care provision. Settings included both residential care homes and nursing homes for older adults. Methods: A qualitative evidence synthesis using systematic review methodology established the current evidence base. Data from 20 qualitative and mixed-methods studies were appraised and synthesized. Following the review process, 10* qualitative interviews with staff working in older adult residential care settings were conducted across six* sites; participants included registered managers, registered nurses, and nursing/care assistants/aides. Purposive sampling recruited individuals from across England. Both the evidence synthesis and the interview data were analyzed thematically, both manually and with NVivo software. Results: The evidence synthesis revealed complex barriers and facilitators for continence care provision at three influencing levels: macro (structural and societal external influences), meso (organizational and institutional influences) and micro (day-to-day actions of individuals impacting service delivery). Macro-level barriers included negative stigmas relating to incontinence, aging and working in the older adult social care sector, restriction of continence care resources such as containment products (i.e. pads), short staffing in care facilities, shortfalls in the professional education and training of care home staff, and the complex health and social care needs of older adult residents. Meso-level barriers included task-centered organizational cultures, ageist institutional perspectives regarding old age and incontinence symptoms, inadequate care home management, and poor communication and teamwork among care staff. Micro-level barriers included poor knowledge and negative attitudes of care home staff and residents regarding incontinence symptoms and their management and treatment. Facilitators at the micro level included proactive and inclusive leadership skills of individuals in management roles. Conclusions: The findings of the evidence synthesis help to outline the complexities of continence care provision in older adult care home facilities. Macro-, meso- and micro-level influences demonstrate problematic and interrelated barriers across international contexts, indicating that improving continence care in this setting is extremely challenging due to the multiple levels at which care provision and services are impacted. International and national older adult social care policy-makers, researchers, and service providers must recognize this complexity, and any intervention seeking to improve continence care in older adult care home settings must be planned accordingly, with appreciation of the complex and interrelated influences. It is anticipated that the findings of the qualitative interviews will shed further light on the national context of continence care provision specific to England; data collection is ongoing*.
* Sample size is envisaged to be between 20-30 participants from multiple sites by Spring 2023.

Keywords: continence care, residential and nursing homes, evidence synthesis, qualitative

Procedia PDF Downloads 55
3 Modelling Farmers' Perception and Intention to Join Cashew Marketing Cooperatives: An Expanded Version of the Theory of Planned Behaviour

Authors: Gospel Iyioku, Jana Mazancova, Jiri Hejkrlik

Abstract:

The “Agricultural Promotion Policy (2016–2020)” represents a strategic initiative by the Nigerian government to address domestic food shortages and the challenges in exporting products at the required quality standards. Hindered by an inefficient system for setting and enforcing food quality standards, coupled with a lack of market knowledge, the Federal Ministry of Agriculture and Rural Development (FMARD) aims to enhance support for the production and activities of key crops like cashew. By collaborating with farmers, processors, investors, and stakeholders in the cashew sector, the policy seeks to define and uphold high-quality standards across the cashew value chain. Given the challenges and opportunities faced by Nigerian cashew farmers, active participation in cashew marketing groups becomes imperative. These groups serve as essential platforms for farmers to collectively navigate market intricacies, access resources, share knowledge, improve output quality, and bolster their overall bargaining power. Through engagement in these cooperative initiatives, farmers not only boost their economic prospects but can also contribute significantly to the sustainable growth of the cashew industry, fostering resilience and community development. This study explores the perceptions and intentions of farmers regarding their involvement in cashew marketing cooperatives, utilizing an expanded version of the Theory of Planned Behaviour. Drawing insights from a diverse sample of 321 cashew farmers in Southwest Nigeria, the research sheds light on the factors influencing decision-making in cooperative participation. The demographic analysis reveals a diverse landscape, with a substantial presence of middle-aged individuals contributing significantly to the agricultural sector and cashew-related activities emerging as a primary income source for a substantial proportion (23.99%). Employing Structural Equation Modelling (SEM) with Maximum Likelihood Robust (MLR) estimation in R, the research elucidates the associations among latent variables. Despite the model’s complexity, the goodness-of-fit indices attest to the validity of the structural model, explaining approximately 40% of the variance in the intention to join cooperatives. Moral norms emerge as a pivotal construct, highlighting the profound influence of ethical considerations in decision-making processes, while perceived behavioural control presents potential challenges in active participation. Attitudes toward joining cooperatives reveal nuanced perspectives, with strong beliefs in enhanced connections with other farmers but varying perceptions on improved access to essential information. The SEM analysis establishes positive and significant effects of moral norms, perceived behavioural control, subjective norms, and attitudes on farmers’ intention to join cooperatives. The knowledge construct positively affects key factors influencing intention, emphasizing the importance of informed decision-making. A supplementary analysis using partial least squares (PLS) SEM corroborates the robustness of our findings, aligning with covariance-based SEM results. This research unveils the determinants of cooperative participation and provides valuable insights for policymakers and practitioners aiming to empower and support this vital demographic in the cashew industry.
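The structural relations reported here (moral norms, perceived behavioural control, subjective norms, and attitudes predicting intention, with knowledge feeding those antecedents) can be illustrated with a simplified path analysis. The sketch below estimates the intention equation by ordinary least squares on synthetic construct scores; it is a toy stand-in for the latent-variable SEM with robust maximum likelihood used in the study, and all variable names, coefficients, and data are hypothetical.

```python
# Toy path-analysis stand-in for the structural part of the TPB model:
# intention regressed on moral norms, attitude, subjective norm, and PBC.
# Construct scores, sample, and path weights are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 321  # sample size reported in the abstract

knowledge = rng.normal(size=n)
moral_norm = 0.4 * knowledge + rng.normal(scale=0.9, size=n)
attitude = 0.5 * knowledge + rng.normal(scale=0.9, size=n)
subj_norm = 0.3 * knowledge + rng.normal(scale=0.95, size=n)
pbc = 0.3 * knowledge + rng.normal(scale=0.95, size=n)
intention = (0.35 * moral_norm + 0.25 * attitude + 0.20 * subj_norm
             + 0.20 * pbc + rng.normal(scale=0.7, size=n))

def ols_paths(y, predictors, names):
    """Ordinary least squares estimates of the path coefficients."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return dict(zip(["intercept"] + names, np.round(beta, 3)))

print(ols_paths(intention, [moral_norm, attitude, subj_norm, pbc],
                ["moral_norm", "attitude", "subjective_norm", "pbc"]))
```

A full SEM additionally models the measurement of each latent construct from its questionnaire items and estimates all equations jointly, which is why the study's MLR and PLS estimates are the authoritative results; this sketch only conveys the shape of the structural model.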

Keywords: marketing cooperatives, theory of planned behaviour, structural equation modelling, cashew farmers

Procedia PDF Downloads 34
2 Full Characterization of Heterogeneous Antibody Samples under Denaturing and Native Conditions on a Hybrid Quadrupole-Orbitrap Mass Spectrometer

Authors: Rowan Moore, Kai Scheffler, Eugen Damoc, Jennifer Sutton, Aaron Bailey, Stephane Houel, Simon Cubbon, Jonathan Josephs

Abstract:

Purpose: MS analysis of monoclonal antibodies (mAbs) at the protein and peptide levels is critical during the development and production of biopharmaceuticals. The compositions of current-generation therapeutic proteins are often complex due to various modifications which may affect efficacy. Intact proteins analyzed by MS are detected in high charge states, which adds complexity to the mass spectra. Protein analysis under native or native-like conditions, with zero or minimal organic solvent and neutral or weakly acidic pH, decreases the charge states, resulting in mAb detection at higher m/z ranges with more spatial resolution. Methods: Three commercially available mAbs were used for all experiments. Intact proteins were desalted online using size exclusion chromatography (SEC) or reversed phase chromatography coupled on-line with a mass spectrometer. For streamlined use of the LC-MS platform we used a single SEC column and alternately selected specific mobile phases to perform separations in either denaturing or native-like conditions: buffer A (20% ACN, 0.1% FA) or buffer B (100 mM ammonium acetate). For peptide analysis, mAbs were proteolytically digested with and without prior reduction and alkylation. The mass spectrometer used for all experiments was a commercially available Thermo Scientific™ hybrid Quadrupole-Orbitrap™ mass spectrometer equipped with the new BioPharma option, which includes a new High Mass Range (HMR) mode that allows for improved high-mass transmission and mass detection up to 8000 m/z. Results: We have analyzed the profiles of three mAbs under reducing and native conditions by direct infusion with offline desalting and with on-line desalting via size exclusion and reversed phase columns. The presence of high salt under denaturing conditions was found to influence the observed charge state envelope and to impact mass accuracy after spectral deconvolution. The significantly lower charge states observed under native conditions improve the spatial resolution of protein signals and have significant benefits for the analysis of antibody mixtures, e.g. lysine variants, degradants or sequence variants. This type of analysis requires the detection of masses beyond the standard mass range, up to 6000 m/z, requiring the extended capabilities available in the new HMR mode. We compared each antibody sample analyzed individually with mixtures in various relative concentrations. For this type of analysis, we observed that apparent native structures persist and that ESI benefits from the addition of low amounts of acetonitrile and formic acid in combination with the ammonium acetate-buffered mobile phase. For analyses at the peptide level we analyzed reduced/alkylated and non-reduced proteolytic digests of the individual antibodies, separated via reversed phase chromatography, aiming to retrieve as much information as possible regarding sequence coverage, disulfide bridges, post-translational modifications such as various glycans, sequence variants, and their relative quantification. All data acquired were submitted to a single software package for analysis, aiming to obtain a complete picture of the molecules analyzed. Here we demonstrate the capabilities of the mass spectrometer to fully characterize homogeneous and heterogeneous therapeutic proteins on a single platform.
Conclusion: Full characterization of heterogeneous intact protein mixtures by improved mass separation on a quadrupole-Orbitrap™ mass spectrometer with extended capabilities has been demonstrated.
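To make the charge-state argument concrete, the short sketch below computes where a roughly 148 kDa intact mAb appears on the m/z axis for typical denaturing versus native ESI charge-state ranges; the antibody mass and the charge ranges are assumed, illustrative values rather than measurements from this work.

```python
# Where does an intact ~148 kDa mAb appear on the m/z axis? m/z = (M + z*H)/z.
# The antibody mass and the typical charge-state ranges are assumed values.
PROTON = 1.00728  # Da

def mz(mass_da: float, charge: int) -> float:
    """m/z of a protonated ion [M + zH]z+."""
    return (mass_da + charge * PROTON) / charge

mab_mass = 148_000.0                      # Da, typical intact IgG (assumption)
conditions = {
    "denaturing ESI": range(40, 61),      # assumed typical charge states
    "native ESI": range(20, 31),
}
for label, charges in conditions.items():
    lo, hi = mz(mab_mass, max(charges)), mz(mab_mass, min(charges))
    print(f"{label:>14}: z = {min(charges)}-{max(charges)}  ->  m/z ~ {lo:,.0f}-{hi:,.0f}")
```

Under the assumed native charge states the signals fall between roughly 4,900 and 7,400 m/z, which is why detection up to 8000 m/z (the HMR mode) is needed, whereas the denatured ions stay well below 4000 m/z.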

Keywords: disulfide bond analysis, intact analysis, native analysis, mass spectrometry, monoclonal antibodies, peptide mapping, post-translational modifications, sequence variants, size exclusion chromatography, therapeutic protein analysis, UHPLC

Procedia PDF Downloads 336
1 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI

Authors: Brennan Lodge

Abstract:

Cybersecurity professionals have long been embroiled in a digital arms race, confronting increasingly sophisticated threats with innovative solutions. The field of cybersecurity is in an unending race against malicious adversaries. As threats evolve in complexity, the tools used to defend against them need to advance even faster. Burdened with a vast arsenal of tools and an expansive scope of threat intelligence, analysts frequently navigate a complex web, trying to discern patterns amidst information overload. Herein lies the potential of Retrieval Augmented Generation (RAG). By combining the capabilities of Large Language Models (LLMs) with a generative AI facet, RAG brings to the table an unparalleled ability for real-time cross-referencing, bridging the gap between raw data and actionable insights. Imagine an analyst named Sarah working at a global Fortune 500 company. Every day, Sarah navigates a maze of diverse knowledge bases, real-time threat intelligence, and her company's vast proprietary data, from network specifics to intricate technical blueprints. One day, she's challenged by a potential breach through a personal device due to the company's global "Bring Your Own Device" policy. With the clock ticking, Sarah has mere minutes to trace the malware's origin, all while considering complex regional regulations. As she races against the benchmark of Mean Time To Resolution (MTTR), she wonders: could "Cozy Bear," with its notorious malware tactic HAMMERTOSS, be behind this? Balancing policy intricacies, global network considerations, and ever-emerging cyber threats, Sarah's role epitomizes the intense challenges faced by today's cybersecurity analysts. While analysts grapple with this array of intricate, time-sensitive challenges, precision and efficiency are key. RAG technology, a cutting-edge advancement in Gen AI, is a promising solution. Designed to assimilate diverse data sources such as cyber advisory notices, phishing email sentiment, secure and insecure code examples, information security policy documentation, and the MITRE ATT&CK framework, RAG equips analysts with real-time querying capabilities through a vector database and a concise, cross-referenced response from a Gen AI model. Traditional relational databases often necessitate a tedious process of filtering through numerous entries. Now, with the synergy of vector databases and Gen AI models, analysts can rapidly access contextually and semantically akin data points. This augmented approach equips analysts with a comprehensive understanding of the prevailing cyber threats, elevating the robustness of cybersecurity defenses and upskilling the analyst and team, too. Vector databases underpin the knowledge translation in Gen AI: they bridge the gap between raw data and meaningful insights, ensuring that analysts are equipped with comprehensive and relevant information. This capability of the RAG framework, with its impressive depth and precision, finds application across a broad spectrum of cybersecurity challenges. Let's delve into some use cases where its potential becomes particularly evident. Phishing email sentiment analysis: Phishing remains a predominant vector for cybersecurity breaches. Leveraging RAG's capabilities, analysts can not only assess the potential malevolence of an email but also understand the context behind it.
By cross-referencing patterns from varied data sources in real time, the detection process evolves from a mere content evaluation to a holistic understanding of attacker tactics, behaviors, and evolving profiles. This allows for the identification of nuanced phishing strategies that might otherwise go undetected. Insecure code analysis: Software vulnerabilities form a critical entry point for cyber adversaries. With RAG, the process of code evaluation undergoes a transformation. Instead of manual code reviews, the system pulls insights from vector databases and historical code snippets marked as insecure, enabling detection of vulnerabilities based on historical patterns, emerging threat vectors, and even predictive threat modeling. This ensures that even the most obfuscated or embedded vulnerabilities are identified, and corrective measures can be promptly implemented. Vulnerability and upskill advisory: In the fast-paced world of cybersecurity, staying updated is paramount. Through RAG's capabilities, analysts are not only made aware of real-time vulnerabilities but are also guided on the skills and tools needed to combat them. By dynamically sourcing data from vulnerability advisories, news on advanced persistent threats, and defensive tactics, RAG ensures that analysts are not only reactive to threats but are also proactively upskilled, thereby bolstering their defense mechanisms. Information security policies for compliance teams: Compliance remains at the heart of many organizational cybersecurity strategies. However, with ever-shifting regulatory landscapes, staying compliant becomes a moving target. RAG's ability to source real-time data ensures that compliance teams always have access to the latest policy changes, guidelines, and best practices. This not only facilitates adherence to current standards but also anticipates future shifts, assists with audits, and ensures that organizations remain ahead of the compliance curve. Fusing a RAG architecture with platforms like Slack amplifies its practical utility. Slack, known for its real-time communication prowess, seamlessly evolves into more than just a messaging platform in this context. Cybersecurity analysts can pose intricate queries within Slack and, almost instantaneously, receive comprehensive feedback powered by the harmonious interplay of RAG and Gen AI. This integration effectively transforms Slack into an AI-augmented, chatbot-like assistant for cybersecurity professionals, always ready to provide informed insights on demand, making it an indispensable ally in the ever-evolving cyber battlefield. Navigating the vast landscape of cybersecurity, analysts often encounter unfamiliar terminologies and techniques; they require tools that not only detect or inform them of threats, such as CISA (U.S. Cybersecurity and Infrastructure Security Agency) advisories, but also interpret and communicate them effectively. Consider a junior cybersecurity analyst named Alex, who comes across the term "Kerberoasting" while reviewing a network log. Unfamiliar with its intricacies, Alex turns to Slack to pose a query: "chat, explain what Kerberoasting is, using CISA." Almost instantaneously, Slack, powered by the harmonious interplay of RAG and Gen AI, provides a detailed response, cross-referencing a recent cyber advisory on the technique. It explains how attackers can exploit the Kerberos Ticket Granting Service to decipher service account passwords, potentially compromising a network.
In this dynamic realm of cybersecurity, the blend of RAG and Generative AI represents more than just a technological leap. It embodies a paradigm shift, promising a future where human expertise and AI-driven precision join forces. As cyber threats continue their relentless advance, this synergy ensures that defenders are equipped with an arsenal that's not just reactive, but also profoundly insightful. No longer should analysts be submerged in a deluge of data without direction. Instead, they should be empowered, to discern, act, and preempt with unparalleled clarity and confidence. By harmoniously intertwining human discernment with AI capabilities, we should chart a path towards a future where cybersecurity is not just about defense, but about achieving a strategic advantage, paving the way for a safer, informed and a more secure digital horizon.
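To ground the retrieval half of the pipeline described above, here is a minimal, self-contained Python sketch: it builds toy bag-of-words vectors in place of learned embeddings, retrieves the most similar advisory snippets by cosine similarity, and assembles the context-stuffed prompt that would be handed to the generative model. The documents, query, and scoring are illustrative assumptions, not the author's system or any particular vector-database API.

```python
# Minimal retrieve-then-generate flow: toy bag-of-words "embeddings", cosine
# similarity retrieval, and prompt assembly. Documents and query are invented;
# no specific vector-database or LLM API is assumed.
import numpy as np

DOCS = [
    "HAMMERTOSS malware attributed to Cozy Bear uses social media accounts for command and control",
    "Kerberoasting abuses the Kerberos Ticket Granting Service to crack service account passwords offline",
    "Bring Your Own Device policy requires mobile device management enrollment and full disk encryption",
]

def build_vocab(docs):
    return {tok: i for i, tok in enumerate(sorted({t for d in docs for t in d.lower().split()}))}

def embed(text, vocab):
    """Toy bag-of-words vector; a real pipeline would use a learned embedding model."""
    vec = np.zeros(len(vocab))
    for tok in text.lower().split():
        if tok in vocab:
            vec[vocab[tok]] += 1.0
    return vec

def retrieve(query, vocab, doc_matrix, docs, k=2):
    """Return the k documents most similar to the query by cosine similarity."""
    q = embed(query, vocab)
    sims = doc_matrix @ q / (np.linalg.norm(doc_matrix, axis=1) * np.linalg.norm(q) + 1e-9)
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

vocab = build_vocab(DOCS)
doc_matrix = np.stack([embed(d, vocab) for d in DOCS])
context = retrieve("explain Kerberoasting", vocab, doc_matrix, DOCS)
prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQ: explain Kerberoasting"
print(prompt)  # this context-stuffed prompt is what would be sent to the Gen AI model
```

In a production pipeline the toy embedding function would be replaced by a learned embedding model and the in-memory matrix by a vector database, but the retrieve-then-generate flow stays the same.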

Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies

Procedia PDF Downloads 47