Search results for: complicated diverticulitis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 790

10 Electronic Raman Scattering Calibration for Quantitative Surface-Enhanced Raman Spectroscopy and Improved Biostatistical Analysis

Authors: Wonil Nam, Xiang Ren, Inyoung Kim, Masoud Agah, Wei Zhou

Abstract:

Despite its ultrasensitive detection capability, surface-enhanced Raman spectroscopy (SERS) faces challenges as a quantitative biochemical analysis tool due to the significant dependence of local field intensity in hotspots on nanoscale geometric variations of plasmonic nanostructures. Therefore, despite enormous progress in plasmonic nanoengineering of high-performance SERS devices, it is still challenging to quantitatively correlate the measured SERS signals with the actual molecule concentrations at hotspots. A significant effort has been devoted to developing SERS calibration methods by introducing internal standards, typically by placing Raman tags at plasmonic hotspots. Raman tags undergo similar SERS enhancement at the same hotspots, and ratiometric SERS signals for analytes of interest can be generated with reduced dependence on geometrical variations. However, using Raman tags still faces challenges for real-world applications, including spatial competition between the analyte and tags in hotspots, spectral interference, and laser-induced degradation/desorption due to plasmon-enhanced photochemical/photothermal effects. We show that electronic Raman scattering (ERS) signals from metallic nanostructures at hotspots can serve as the internal calibration standard to enable quantitative SERS analysis and improve biostatistical analysis. We perform SERS with Au-SiO₂ multilayered metal-insulator-metal nanolaminated plasmonic nanostructures. Since the ERS signal is proportional to the volume density of electron-hole occupation in hotspots, it increases exponentially as the wavenumber approaches zero. By using a long-pass filter, standard in backscattered SERS configurations, to chop the ERS background continuum, we can observe an ERS pseudo-peak, I_ERS. Both ERS and SERS processes experience the same |E|⁴ local enhancement during the excitation and inelastic scattering transitions.
We calibrated I_SERS of 10 μM Rhodamine 6G in solution against I_ERS. The results show that ERS calibration generates a new analytical value, I_SERS/I_ERS, that is insensitive to variations between different hotspots and thus can quantitatively reflect molecular concentration information. Given the calibration capability of ERS signals, we performed label-free SERS analysis of living biological systems using four different normal and cancerous breast cell lines cultured on nanolaminated SERS devices. 2D Raman mapping over 100 μm × 100 μm, containing several cells, was conducted. The SERS spectra were subsequently analyzed by multivariate analysis using partial least squares discriminant analysis (PLS-DA). Remarkably, after ERS calibration, MCF-10A and MCF-7 cells are further separated while the two triple-negative breast cancer cell lines (MDA-MB-231 and HCC-1806) overlap more, in good agreement with the well-known cancer categorization regarding the degree of malignancy. To assess the strength of ERS calibration, we further carried out a drug efficacy study using MDA-MB-231 cells and different concentrations of the anti-cancer drug paclitaxel (PTX). After ERS calibration, we can more clearly segregate the control/low-dosage groups (0 and 1.5 nM), the middle-dosage group (5 nM), and the group treated with the half-maximal inhibitory concentration (IC50, 15 nM). Therefore, we envision that ERS-calibrated SERS can find crucial opportunities in label-free molecular profiling of complicated biological systems.
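
The calibration idea can be illustrated with a toy numerical sketch (all numbers below are made up for illustration, not measured data): because the analyte SERS peak and the ERS pseudo-peak experience the same |E|⁴ hotspot enhancement, their ratio cancels the hotspot-to-hotspot variation that makes the raw signal non-quantitative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: both the analyte SERS signal and the ERS internal
# standard scale with the same |E|^4 enhancement factor G, which varies
# strongly from hotspot to hotspot due to nanoscale geometric variation.
n_hotspots = 1000
G = rng.lognormal(mean=0.0, sigma=0.8, size=n_hotspots)  # geometric variation
concentration = 10e-6  # 10 uM analyte (illustrative units)

I_SERS = G * concentration   # analyte signal: strongly hotspot-dependent
I_ERS = G * 1.0              # internal standard: same enhancement
ratio = I_SERS / I_ERS       # calibrated analytical value

print(np.std(I_SERS) / np.mean(I_SERS))  # large relative spread in raw signal
print(np.std(ratio))                     # near zero: concentration recovered
```

The raw I_SERS values scatter widely across hotspots, while the ratiometric value collapses onto the concentration, which is the behaviour the abstract attributes to ERS calibration.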

Keywords: cancer cell drug efficacy, plasmonics, surface-enhanced Raman spectroscopy (SERS), SERS calibration

Procedia PDF Downloads 138
9 Optimizing the Residential Design Process Using Automated Technologies and AI

Authors: Milena Nanova, Martin Georgiev, Radul Shishkov, Damyan Damov

Abstract:

Modern residential architecture is increasingly influenced by rapid urbanization, technological advancements, and growing investor expectations. The integration of AI and digital tools such as CAD and BIM (Building Information Modelling) is transforming the design process by improving efficiency, accuracy, and speed. However, urban development faces challenges, including high competition for viable sites and the time-consuming nature of traditional investment feasibility studies and architectural planning. Finding and analysing suitable sites for residential development is complicated by intense competition and rising investor demands. Investors require quick assessments of property potential to avoid missing opportunities, while traditional architectural design processes rely on the experience of the team and can be time-consuming, adding pressure to make fast, effective decisions. The widespread use of CAD tools has sped up the drafting process, enhancing both accuracy and efficiency. Digital tools allow designers to manipulate drawings quickly, reducing the time spent on revisions. BIM advances this further by enabling native 3D modelling, where changes to a design in one view are automatically reflected in all others, minimizing errors and saving time. AI is becoming an integral part of architectural design software. While AI is currently being incorporated into existing programs like AutoCAD, Revit, and ArchiCAD, its full potential is reached in parametric modelling. In this process, designers define parameters (e.g., building size, layout, and materials), and the software generates multiple design variations based on those inputs. This method accelerates the design process by automating decisions and enabling quick generation of alternative solutions. The study utilizes generative design, a specific application of parametric modelling that uses AI to explore a wide range of design possibilities based on predefined criteria.
It optimizes designs through iterations, testing many variations to find the best solutions. This process is particularly beneficial in the early stages of design, where multiple options are explored before refining the best ones. AI’s ability to handle complex mathematical tasks allows it to generate unconventional yet effective designs that a human designer might overlook. Residential architecture, with its predictable, typical layouts and modular nature, is especially suitable for generative design. The relationships between rooms and the overall organization of apartment units follow logical patterns, making it an ideal candidate for parametric modelling. Using these tools, architects can quickly explore various apartment configurations, considering factors like apartment sizes, types, and circulation patterns, and identify the most efficient layout for a given site. Parametric modelling and generative design offer significant benefits to residential architecture by streamlining the design process, enabling faster decision-making, and optimizing building layouts. These technologies allow architects and developers to analyse numerous design possibilities, improving outcomes while responding to the challenges of urban development. By integrating AI-driven generative design, the architecture industry can enhance creativity, efficiency, and adaptability in residential projects.
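
The generate-and-score loop described above can be sketched in a few lines. This is a minimal illustration, not the study's actual tooling: the floor area, unit areas, and scoring weights are invented assumptions standing in for the "predefined criteria" that a real generative-design setup would encode.

```python
from itertools import product

# Hypothetical parametric sketch: enumerate apartment-mix variants for one
# floor plate and keep the best-scoring feasible one. All numbers are
# illustrative assumptions, not real project data.
FLOOR_AREA = 600.0  # usable m^2 per floor (assumed)
UNIT_AREA = {"studio": 35.0, "1-bed": 55.0, "2-bed": 80.0}  # assumed m^2

def score(mix):
    """Weighted objective: area efficiency plus unit-type variety."""
    used = sum(UNIT_AREA[u] * n for u, n in mix.items())
    if used > FLOOR_AREA:
        return None  # infeasible variant, discarded
    efficiency = used / FLOOR_AREA                      # maximize used area
    variety = sum(1 for n in mix.values() if n > 0) / 3  # reward mixed offer
    return 0.8 * efficiency + 0.2 * variety

candidates = []
for s, b1, b2 in product(range(9), range(8), range(6)):
    mix = {"studio": s, "1-bed": b1, "2-bed": b2}
    val = score(mix)
    if val is not None:
        candidates.append((val, mix))

best_score, best_mix = max(candidates, key=lambda c: c[0])
print(best_mix, round(best_score, 3))
```

A production workflow would replace the brute-force enumeration with a generative-design solver and the toy objective with real criteria (circulation, daylight, cost), but the define-parameters / generate-variants / rank-by-criteria structure is the same.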

Keywords: architectural design, residential buildings, generative design, parametric models, workflow optimization

Procedia PDF Downloads 2
8 Braille Lab: A New Design Approach for Social Entrepreneurship and Innovation in Assistive Tools for the Visually Impaired

Authors: Claudio Loconsole, Daniele Leonardis, Antonio Brunetti, Gianpaolo Francesco Trotta, Nicholas Caporusso, Vitoantonio Bevilacqua

Abstract:

Unfortunately, many people still do not have access to communication, with specific regard to reading and writing. Among them, people who are blind or visually impaired have several difficulties in getting access to the world, compared to the sighted. Indeed, despite technological advancement and cost reduction, assistive devices such as Braille-based input/output systems, which enable reading and writing texts (e.g., personal notes, documents), are still expensive nowadays. As a consequence, assistive technology affordability is fundamental in supporting the visually impaired in communication, learning, and social inclusion. This, in turn, has serious consequences in terms of equal access to opportunities, freedom of expression, and actual and independent participation in a society designed for the sighted. Moreover, the visually impaired experience difficulties in recognizing objects and interacting with devices in many activities of daily living. It is no accident that Braille indications are commonly reported only on medicine boxes and elevator keypads. Several software applications for the automatic translation of written text into speech (e.g., Text-To-Speech - TTS) enable reading pieces of documents. However, apart from simple tasks, in many circumstances TTS software is not suitable for understanding very complicated pieces of text that require dwelling on specific portions (e.g., mathematical formulas or Greek text). In addition, the experience of reading/writing text is completely different, both in terms of engagement and from an educational perspective. Statistics on the employment rate of blind people show that learning to read and write provides the visually impaired with up to 80% more opportunities of finding a job.
Especially at higher educational levels, where the ability to digest very complex text is key, accessibility and availability of Braille play a fundamental role in reducing the drop-out rate of the visually impaired, thus affecting the effectiveness of the constitutional right to access education. In this context, the Braille Lab project aims at meeting these social needs by including affordability in the design and development of assistive tools for visually impaired people. In detail, our awarded project focuses on technological innovation of the operating principle of existing assistive tools for the visually impaired while leaving the human-machine interface unchanged. This can result in a significant reduction of production costs and consequently of tool selling prices, thus representing an important opportunity for social entrepreneurship. The first two assistive tools designed within the Braille Lab project following the proposed approach aim to let users personally print documents and handouts and read texts written in Braille on a refreshable Braille display, respectively. The former, named ‘Braille Cartridge’, represents an alternative solution for printing in Braille and consists of an electronically controlled embossing dispenser (cartridge) that can be integrated within traditional ink-jet printers, in order to leverage the efficiency and cost of device mechanical structures already in use. The latter, named ‘Braille Cursor’, is an innovative Braille display featuring a substantial technological innovation by means of a unique cursor virtualizing Braille cells, thus limiting the number of active pins needed for Braille characters.
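
As background to how such devices address Braille cells, a cell can be represented as a bitmask of raised dots; the Unicode Braille Patterns block (U+2800 onward) assigns bit (dot − 1) to each dot, which is convenient for driving a display or embosser. The sketch below is illustrative only and is not part of the Braille Lab design; for brevity it maps only the letters a-j of standard grade-1 Braille.

```python
# Illustrative six-dot Braille encoding (not Braille Lab's implementation).
# Each letter maps to its raised dot numbers; the Unicode code point is
# U+2800 plus a bitmask where dot n sets bit (n - 1).
DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5),
    "i": (2, 4), "j": (2, 4, 5),
}

def to_braille(text):
    """Render lowercase a-j text as Unicode Braille pattern characters."""
    out = []
    for ch in text:
        mask = 0
        for dot in DOTS[ch]:
            mask |= 1 << (dot - 1)  # dot n -> bit (n - 1), per Unicode
        out.append(chr(0x2800 + mask))
    return "".join(out)

print(to_braille("bad"))  # three Braille cells
```

The same bitmask could equally drive a pin array, which is why a display like the 'Braille Cursor' only ever needs to actuate the pins of the cells currently under the virtual cursor.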

Keywords: Human rights, social challenges and technology innovations, visually impaired, affordability, assistive tools

Procedia PDF Downloads 275
7 Effect of Black Cumin (Nigella sativa) Extract on Damaged Brain Cells

Authors: Batul Kagalwala

Abstract:

The nervous system is made up of complex, delicate structures such as the spinal cord, peripheral nerves and the brain. These are prone to various types of injury, ranging from neurodegenerative diseases to trauma, leading to diseases like Parkinson's, Alzheimer's, multiple sclerosis, amyotrophic lateral sclerosis (ALS), multiple system atrophy etc. Unfortunately, because of the complicated structure of the nervous system, spontaneous regeneration, repair and healing are seldom seen, due to which brain damage, peripheral nerve damage and paralysis from spinal cord injury are often permanent and incapacitating. Hence, an innovative and standardized approach is required for advanced treatment of neurological injury. Nigella sativa (N. sativa), an annual flowering plant native to regions of southern Europe and Asia, has been suggested to have neuroprotective and anti-seizure properties. Neuroregeneration is found to occur in damaged cells when they are treated using extract of N. sativa. Due to its proven health benefits, many experiments are being conducted to extract all the benefits from the plant. The flowers are delicate and are usually pale blue and white in color, with small black seeds. These seeds are the source of active components such as 30–40% fixed oils, 0.5–1.5% essential oils, and pharmacologically active components including thymoquinone (TQ), dithymoquinone (DTQ) and nigellin. In traditional medicine, this herb was identified as having healing properties and was extensively used in the Middle East and Far East for treating diseases such as headache, back pain, asthma, infections, dysentery, hypertension, obesity and gastrointestinal problems. Literature studies have confirmed that the extract of N. sativa seeds and TQ have inhibitory effects on inducible nitric oxide synthase and the production of nitric oxide, as well as anti-inflammatory and anticancer activities. Experimental investigation will be conducted to understand which ingredient of N.
sativa causes neuroregeneration and underlies its healing properties. An aqueous/alcoholic extract of N. sativa will be made. Researchers have also used seed oil to prepare such extracts. For the alcoholic extracts, the seeds need to be powdered and soaked in alcohol for a period of time, and the alcohol must then be evaporated using a rotary evaporator. For aqueous extracts, the powder must be dissolved in distilled water to obtain a pure extract. The extract will serve as the mobile phase, while a suitable stationary phase (a substance that is a good adsorbent, e.g. silica gel, alumina, cellulose etc.) will be selected. The different ingredients of N. sativa will be separated using High Performance Liquid Chromatography (HPLC) for treating damaged cells. Damaged brain cells will be treated with the compounds individually and in different combinations of 2 or 3 compounds for different intervals of time. The most suitable compound or combination of compounds for the regeneration of cells will be determined using DOE methodology. Later, the responsible gene will also be determined and, using the Polymerase Chain Reaction (PCR), it will be replicated in a plasmid vector. This plasmid vector shall be inserted into the brain of the organism used and replicated within it. The gene insertion can also be done by the gene gun method: the gene in question can be coated on a micro-bullet of tungsten and bombarded into the area of interest, and gene replication and coding shall be studied. Whether or not the gene replicates in the organism will also be examined.

Keywords: black cumin, brain cells, damage, extract, neuroregeneration, PCR, plasmids, vectors

Procedia PDF Downloads 661
6 Application of Large Eddy Simulation-Immersed Boundary Volume Penalization Method for Heat and Mass Transfer in Granular Layers

Authors: Artur Tyliszczak, Ewa Szymanek, Maciej Marek

Abstract:

Flow through granular materials is important to a vast array of industries, for instance in the construction industry, where granular layers are used for bulkheads and isolators; in chemical engineering and catalytic reactors, where large surfaces of packed granular beds intensify chemical reactions; or in energy production systems, where granulates are promising materials for heat storage and as heat transfer media. Despite the common usage of granulates and extensive research performed in this field, phenomena occurring between granular solid elements or between solids and fluid are still not fully understood. In the present work, we analyze the heat exchange process between the flowing medium (gas, liquid) and the solid material inside granular layers. We consider them as a composite of isolated solid elements and inter-granular spaces in which a gas or liquid can flow. The structure of the layer is controlled by the shapes of the particular granular elements (e.g., spheres, cylinders, cubes, Raschig rings), their spatial distribution, and the effective characteristic dimension (total volume or surface area). We analyze to what extent alteration of these parameters influences flow characteristics (turbulent intensity, mixing efficiency, heat transfer) inside the layer and behind it. Analysis of flow inside granular layers is very complicated because the use of classical experimental techniques (LDA, PIV, fiber probes) inside the layers is practically impossible, whereas the use of probes (e.g. thermocouples, Pitot tubes) requires drilling holes in the solid material. Hence, measurements of the flow inside granular layers are usually performed using, for instance, advanced X-ray tomography. In this respect, theoretical and numerical analyses of flow inside granulates seem crucial.
Application of discrete element methods in combination with classical finite volume/finite difference approaches is problematic, as the mesh generation process for complex granular material can be very arduous. A good alternative for simulation of flow in complex domains is the immersed boundary-volume penalization (IB-VP) approach, in which the computational meshes have a simple Cartesian structure and the impact of solid objects on the fluid is mimicked by source terms added to the Navier-Stokes and energy equations. The present paper focuses on the application of the IB-VP method combined with large eddy simulation (LES). The flow solver used in this work is a high-order code (SAILOR), which was used previously in various studies, including laminar/turbulent transition in free flows and also flows in wavy channels, wavy pipes and over obstacles of various shapes. In these cases, the formal order of approximation turned out to be between 1 and 2, depending on the test case. The current research concentrates on analyses of flows in dense granular layers with elements distributed in a deterministic, regular manner, and on validation of the results obtained using the LES-IB method against a body-fitted approach. The comparisons are very promising and show very good agreement. It is found that the size, the number of elements and their distribution have a huge impact on the obtained results. Ordering of the granular elements (or lack of it) affects both the pressure drop and the efficiency of the heat transfer, as it significantly changes the mixing process.
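
The penalization idea can be shown in one dimension. The sketch below is a generic Brinkman-type volume penalization on a plain Cartesian grid, not the SAILOR solver: a mask χ marks the solid, and a source term −(χ/η)u added to the momentum equation drives the velocity toward zero inside the solid while the fluid elsewhere responds to a body force. All parameters are illustrative assumptions.

```python
import numpy as np

# Minimal 1D volume-penalization sketch: du/dt = nu*u_xx + f - (chi/eta)*u.
# chi = 1 inside the solid "grain", 0 in the fluid; eta is the (small)
# penalization parameter. Explicit Euler, periodic boundaries via np.roll.
nx, L = 200, 1.0
dx = L / nx
x = np.linspace(0.0, L, nx, endpoint=False)
nu, eta, dt = 1e-3, 1e-4, 1e-5
u = np.zeros(nx)
chi = ((x > 0.4) & (x < 0.6)).astype(float)  # solid grain in mid-channel
f = 1.0                                      # constant driving body force

for _ in range(20000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    u += dt * (nu * lap + f - (chi / eta) * u)

# Velocity stays near zero in the solid and builds up in the fluid gap.
print(u[(x > 0.45) & (x < 0.55)].max(), u[(x > 0.05) & (x < 0.35)].max())
```

Inside the solid the steady velocity scales like f·η, so the smaller η is, the more faithfully the mask mimics a no-slip body without any body-fitted mesh, which is exactly the appeal of IB-VP for packed granular geometries.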

Keywords: granular layers, heat transfer, immersed boundary method, numerical simulations

Procedia PDF Downloads 138
5 Experimental Proof of Concept for Piezoelectric Flow Harvesting for In-Pipe Metering Systems

Authors: Sherif Keddis, Rafik Mitry, Norbert Schwesinger

Abstract:

Intelligent networking of devices has rapidly been gaining importance over the past years, and with recent advances in the fields of microcontrollers, integrated circuits and wireless communication, low-power applications have emerged, enabling this trend even more. Connected devices provide a much larger database, thus enabling highly intelligent and accurate systems. Ensuring safe drinking water is one of the fields that require constant monitoring and can benefit from increased accuracy. Monitoring is mainly achieved either through complex measures, such as collecting samples from the points of use, or through metering systems typically distant from the points of use, which deliver less accurate assessments of the quality of water. Constant metering near the points of use is complicated by their inaccessibility (e.g. buried water pipes, locked spaces), which makes system maintenance extremely difficult and often unviable. The research presented here attempts to overcome this challenge by providing these systems with enough energy through a flow harvester inside the pipe, thus eliminating maintenance requirements in terms of battery replacements or containment of leakage resulting from wiring such systems. The proposed flow harvester exploits the piezoelectric properties of polyvinylidene difluoride (PVDF) films to convert turbulence-induced oscillations into electrical energy. It is intended to be used in standard water pipes with diameters between 0.5 and 1 inch. The harvester uses a ring-shaped bluff body inside the pipe to induce pressure fluctuations. Additionally, the bluff body houses electronic components such as storage, circuitry and an RF unit. Placing the piezoelectric films downstream of the bluff body causes their oscillation, which generates electrical charge. The PVDF film is placed as a multilayered wrap fixed to the pipe wall, leaving the top part to oscillate freely inside the flow.
The wrap, which allows for a larger active area, consists of two layers of 30 µm thick and 12 mm wide PVDF layered alternately with two centered 6 µm thick and 8 mm wide aluminum foil electrodes. The length of the layers depends on the number of windings and is part of the investigation. Sealing the harvester against liquid penetration is achieved by wrapping it in a ring-shaped LDPE film and welding the open ends. The fabrication of the PVDF wraps is done by hand. After validating the working principle in a wind tunnel, experiments were conducted in water, placing the harvester inside a 1 inch pipe at a water velocity of 0.74 m/s. To find a suitable placement of the wrap inside the pipe, two forms of fixation were compared regarding their power output. Further investigations regarding the number of windings required for efficient transduction were made. Best results were achieved using a wrap with 3 windings of the active layers, which delivers a constant power output of 0.53 µW at a 2.3 MΩ load and an effective voltage of 1.1 V. Considering the extremely low power requirements of sensor applications, these initial results are promising. For further investigation and optimization, machine designs are currently being developed to automate the fabrication and decrease the tolerances of the prototypes.
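
The reported figures are mutually consistent: for a resistive load, the dissipated power follows from the effective (RMS) voltage as P = V²/R, which the following check confirms.

```python
# Consistency check of the reported harvester output: P = V_eff^2 / R_load.
V_eff = 1.1      # effective (RMS) voltage across the load, in volts
R_load = 2.3e6   # load resistance, in ohms

P = V_eff**2 / R_load  # dissipated power, in watts
print(f"{P * 1e6:.2f} uW")  # ~0.53 uW, matching the reported power output
```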

Keywords: maintenance-free sensors, measurements at point of use, piezoelectric flow harvesting, universal micro generator, wireless metering systems

Procedia PDF Downloads 193
4 Computational, Human, and Material Modalities: An Augmented Reality Workflow for Building form Found Textile Structures

Authors: James Forren

Abstract:

This research paper details a recent demonstrator project in which digitally form-found textile structures were built by human craftspersons wearing augmented reality (AR) head-worn displays (HWDs). The project utilized a wet-state natural fiber / cementitious matrix composite to generate minimal-bending shapes in tension which, when cured and rotated, performed as minimal-bending compression members. The significance of the project is that it synthesizes computational structural simulations with visually guided handcraft production. Computational and physical form-finding methods with textiles are well characterized in the development of architectural form. One difficulty, however, is physically building computer simulations, which often requires complicated digital fabrication workflows. AR HWDs, by contrast, have been used to build complex digital forms from bricks, wood, plastic, and steel without digital fabrication devices. These projects utilize, instead, the tacit knowledge and motor schemas of the human craftsperson. Computational simulations offer unprecedented speed and performance in solving complex structural problems. Human craftspersons possess highly efficient complex spatial reasoning motor schemas. And textiles offer efficient form-generating possibilities for individual structural members and overall structural forms. This project proposes that the synthesis of these three modalities of structural problem-solving (computational, human, and material) may not only develop efficient structural form but offer further creative potentialities when the respective intelligence of each modality is productively leveraged. The project methodology pertains to its three modalities of production: 1) computational, 2) human, and 3) material. A proprietary three-dimensional graphic statics simulator generated a three-legged arch as a wireframe model. This wireframe was discretized into nine modules, three modules per leg.
Each module was modeled as a woven matrix of one-inch diameter chords, and each woven matrix was transmitted to a holographic engine running on the HWDs. Craftspersons wearing the HWDs then wove wet cementitious chords within a simple falsework frame to match the minimal-bending form displayed in front of them. Once the woven components cured, they were demounted from the frame. The components were then assembled into a full structure using the holographically displayed computational model as a guide. The assembled structure was approximately eighteen feet in diameter and ten feet in height and matched the holographic model to under an inch of tolerance. The construction validated the computational simulation of the minimal-bending form, as it was dimensionally stable for a ten-day period, after which it was disassembled. The demonstrator illustrated the facility with which a computationally derived, structurally stable form could be achieved by the holographically guided, complex three-dimensional motor schema of the human craftsperson. However, the workflow traveled unidirectionally from computer to human to material, failing to fully leverage the intelligence of each modality. Subsequent research (a workshop testing human interaction with a physics-engine simulation of string networks, and research on the use of HWDs to capture hand gestures in weaving) seeks to develop further interactivity with rope and chord towards a bi-directional workflow within full-scale building environments.

Keywords: augmented reality, cementitious composites, computational form finding, textile structures

Procedia PDF Downloads 176
3 Settings of Conditions Leading to Reproducible and Robust Biofilm Formation in vitro in Evaluation of Drug Activity against Staphylococcal Biofilms

Authors: Adela Diepoltova, Klara Konecna, Ondrej Jandourek, Petr Nachtigal

Abstract:

A loss of control over antibiotic-resistant pathogens has become a global issue due to severe and often untreatable infections. This state is reflected in complicated treatment, health costs, and higher mortality. All these factors emphasize the urgent need for the discovery and development of new anti-infectives. Among the most common pathogens mentioned in the phenomenon of antibiotic resistance are bacteria of the genus Staphylococcus. These bacterial agents have developed several mechanisms against the effect of antibiotics. One of them is biofilm formation. In staphylococci, biofilms are associated with infections such as endocarditis, osteomyelitis, catheter-related bloodstream infections, etc. To the authors' best knowledge, no validated and standardized methodology for evaluating candidate compound activity against staphylococcal biofilms exists. However, a variety of protocols for in vitro drug activity testing has been suggested, yet there are often fundamental differences among them. Based on our experience, a key methodological step that leads to credible results is to form a robust biofilm with appropriate attributes, such as firm adherence to the substrate, a complex arrangement in layers, and the presence of an extracellular polysaccharide matrix. At first, for the purpose of drug anti-biofilm activity evaluation, the focus was put on the various conditions (supplementation of cultivation media with human plasma/fetal bovine serum, shaking mode, density of the initial inoculum) that should lead to reproducible and robust in vitro staphylococcal biofilm formation in a microtiter plate model. Three model staphylococcal reference strains were included in the study: Staphylococcus aureus (ATCC 29213), methicillin-resistant Staphylococcus aureus (ATCC 43300), and Staphylococcus epidermidis (ATCC 35983). The total biofilm biomass was quantified using the Christensen method with crystal violet, and results obtained from at least three independent experiments were statistically processed.
Attention was also paid to the viability of the biofilm-forming staphylococcal cells and the presence of the extracellular polysaccharide matrix. The conditions that led to robust biofilm biomass formation with the attributes mentioned above were then applied in introducing an alternative method analogous to the commercially available test system, the Calgary Biofilm Device. In this test system, biofilms are formed on pegs that are incorporated into the lid of the microtiter plate. This system provides several advantages (in situ detection and quantification of biofilm microbial cells that have retained their viability after drug exposure). Based on our preliminary studies, it was found that attention should also be paid to the peg surface and the substrate on which the bacterial biofilms are formed. Therefore, further optimization steps were introduced. The surface of the pegs was coated with human plasma, fetal bovine serum, or poly-L-lysine, and the willingness of the bacteria to adhere and form biofilm was subsequently monitored. In conclusion, suitable conditions were revealed that lead to the formation of reproducible, robust staphylococcal biofilms in vitro, both for the microtiter plate model and for the system analogous to the Calgary Biofilm Device. The robustness and typical slime texture could be detected visually. Likewise, analysis by confocal laser scanning microscopy revealed a complex three-dimensional arrangement of biofilm-forming organisms surrounded by an extracellular polysaccharide matrix.

Keywords: anti-biofilm drug activity screening, in vitro biofilm formation, microtiter plate model, the Calgary biofilm device, staphylococcal infections, substrate modification, surface coating

Procedia PDF Downloads 156
2 Translating the Australian National Health and Medical Research Council Obesity Guidelines into Practice into a Rural/Regional Setting in Tasmania, Australia

Authors: Giuliana Murfet, Heidi Behrens

Abstract:

Chronic disease is Australia’s biggest health concern, and obesity is the leading risk factor for many of them. Obesity and chronic disease have a higher representation in rural Tasmania, where levels of socio-economic disadvantage are also higher. People living outside major cities have less access to health services and poorer health outcomes. To help primary healthcare professionals manage obesity, the Australian NHMRC evidence-based clinical practice guidelines for the management of overweight and obesity in adults were developed. They include recommendations for practice and models for obesity management. To our knowledge, no research has been conducted that investigates the translation of these guidelines into practice in rural-regional areas, where implementation can be complicated by limited financial and staffing resources. Also, the systematic review that informed the guidelines revealed a lack of evidence for chronic disease models of obesity care. The aim was to establish and evaluate a multidisciplinary model for obesity management in a group of adults with type 2 diabetes in a dispersed rural population in Australia. Extensive stakeholder engagement was undertaken both to garner support for an obesity clinic and to develop a sustainable model of care. A comprehensive nurse practitioner-led outpatient model for obesity care was designed. Multidisciplinary obesity clinics for adults with type 2 diabetes, including a dietitian, psychologist, physiotherapist and nurse practitioner, were set up in the north-west of Tasmania at two geographically rural towns. Implementation was underpinned by the NHMRC guidelines, and recommendations focused on: assessment approaches; promotion of the health benefits of weight loss; identification of relevant programs for individualising care; medication and bariatric surgery options for obesity management; and the importance of long-term weight management.
A clinical pathway for adult weight management is delivered by the multidisciplinary team, with recognition of the impact of, and adjustments needed for, other comorbidities. The model allowed for intensification of intervention, such as bariatric surgery, according to the recommendations, patient wishes and suitability. A randomised controlled trial is ongoing, with the aim of comparing standard care (diabetes-focused management) with an obesity-focused approach that adds dietetic, physiotherapy, psychology and lifestyle advice. Key barriers and enablers to guideline implementation were identified under the following themes: 1) health care delivery changes and development of the project framework; 2) capacity- and team-building; 3) stakeholder engagement; and 4) the research project and partnerships. Engagement of not only the local hospital but also state-wide health executives and the surgical services committee was paramount to the success of the project. Staff training and collective development of the framework allowed for a shared understanding. Staff capacity was increased, with most taking on additional activities (e.g., surgery coordination). Barriers were often related to differences of opinion about the focus of the project, such as a desire to remain strictly evidence-based (e.g., in exercise prescription) without adjusting the model to accommodate comorbidities. While barriers did exist and challenges had to be overcome, the development of critical partnerships enabled a potential model of obesity care for rural-regional areas. Importantly, the findings contribute to the evidence base for models of diabetes and obesity care that coordinate limited resources.

Keywords: diabetes, interdisciplinary, model of care, obesity, rural regional

Procedia PDF Downloads 229
1 Evaluation of Academic Research Projects Using the AHP and TOPSIS Methods

Authors: Murat Arıbaş, Uğur Özcan

Abstract:

Due to the increasing number of universities and academics, university research funds and government grants have increased the number and quality of academic research projects. Although every academic research project has a specific purpose and importance, limited resources (money, time, manpower, etc.) require choosing the best projects among all candidates (Amiri, 2010). Comparing projects and determining which is better is a difficult process, since the projects serve different purposes. In addition, the evaluation process becomes complicated when there is more than one evaluator and multiple evaluation criteria (Dodangeh, Mojahed and Yusuff, 2009). Mehrez and Sinuany-Stern (1983) formulated project selection as a Multi-Criteria Decision Making (MCDM) problem. If a decision problem involves multiple criteria and objectives, it is called a Multi-Attribute Decision Making problem (Ömürbek & Kınay, 2013). There are many MCDM methods in the literature for the solution of such problems, including AHP (Analytic Hierarchy Process), ANP (Analytic Network Process), TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluation), UTADIS (UTilités Additives DIScriminantes), ELECTRE (Élimination et Choix Traduisant la Réalité), MAUT (Multi-Attribute Utility Theory) and GRA (Grey Relational Analysis). Each method has advantages compared with the others (Ömürbek, Demirci & Akalın, 2013). Hence, to decide which MCDM method will be used for the solution of a problem, factors like the nature of the problem, types of choices, measurement scales, type of uncertainty, dependency among the attributes, expectations of the decision maker, and quantity and quality of the data should be considered (Tavana & Hatami-Marbini, 2011).
This study aims to develop a systematic decision process for grant support applications that are to be evaluated for scientific adequacy by multiple evaluators under defined criteria. In this context, the project evaluation process applied by The Scientific and Technological Research Council of Turkey (TÜBİTAK), one of the leading institutions in Turkey, was investigated. First, the criteria to be used in the project evaluation were decided. The main criteria were selected from among the TÜBİTAK evaluation criteria: originality of the project; methodology; project management/team and research opportunities; and extensive impact of the project. Moreover, 2-4 sub-criteria were defined for each main criterion, so it was decided to evaluate projects over 13 sub-criteria in total. Because the AHP method is well suited to determining criteria weights and the TOPSIS method makes it possible to rank a large number of alternatives, the two methods were used together. The AHP method, developed by Saaty (1977), is based on selection by pairwise comparisons. Because of its simple structure and ease of understanding, AHP is a very popular method in the literature for determining criteria weights in MCDM problems. The TOPSIS method, developed by Hwang and Yoon (1981) as an MCDM technique, is an alternative to the ELECTRE method and is used in many areas. In this method, the distance from each decision point to the ideal and to the negative-ideal solution point is calculated using the Euclidean distance. In the study, the main criteria and sub-criteria were compared on their own merits using questionnaires developed on an importance scale and administered to four groups of respondents (TÜBİTAK specialists, TÜBİTAK managers, academics, and individuals from the business world). After these pairwise comparisons, the weight of each main criterion and sub-criterion was calculated using the AHP method.
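As a minimal sketch of the AHP weighting step described above, criterion weights can be taken from the principal eigenvector of a pairwise-comparison matrix on Saaty's 1-9 importance scale, with a consistency check. The comparison values below are illustrative assumptions, not the questionnaire results from the study:

```python
import numpy as np

# Hypothetical pairwise comparisons of the four main criteria
# (originality, methodology, project team, extensive impact).
# A[i, j] is how much more important criterion i is than j.
A = np.array([
    [1.0, 3.0, 5.0, 3.0],
    [1/3, 1.0, 3.0, 1.0],
    [1/5, 1/3, 1.0, 1/3],
    [1/3, 1.0, 3.0, 1.0],
])

# Principal right eigenvector of A, normalised to sum to 1,
# gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency ratio CR = (lambda_max - n) / (n - 1) / RI,
# with Saaty's random index RI = 0.90 for n = 4; CR < 0.1 is
# conventionally acceptable.
n = A.shape[0]
lam_max = eigvals.real[k]
CR = (lam_max - n) / (n - 1) / 0.90

print("weights:", np.round(w, 3), "CR:", round(CR, 3))
```

In the study's setting, the same procedure would be repeated for the 2-4 sub-criteria under each main criterion, with the global sub-criterion weight being the product of its local weight and its parent's weight.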
These calculated criteria weights were then used as input to the TOPSIS method, and a sample of 200 projects was ranked on its merits. The new system made it possible to incorporate the views of the people who take part in the project process, including preparation, evaluation and implementation, into the evaluation of academic research projects. Moreover, instead of evaluating projects using four equally weighted main criteria, a systematic decision-making process was developed using the 13 weighted sub-criteria and each decision point's distance from the ideal solution. Through this evaluation process, a new approach was created to determine the importance of academic research projects.
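The TOPSIS ranking step can be sketched as follows: normalise the decision matrix, weight it, locate the ideal and negative-ideal solutions, and score each alternative by its relative closeness. The project scores and weights below are invented for illustration (a 3-project toy case rather than the study's 200-project sample):

```python
import numpy as np

def topsis(X, w, benefit=None):
    """Score alternatives (rows of X) against criteria (columns)
    with weights w; `benefit` marks criteria where larger is better."""
    X = np.asarray(X, dtype=float)
    if benefit is None:
        benefit = np.ones(X.shape[1], dtype=bool)
    # Vector normalisation of each column, then weighting.
    V = w * X / np.linalg.norm(X, axis=0)
    # Ideal and negative-ideal solutions per criterion.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nadir = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # Euclidean distances to both, then relative closeness in [0, 1].
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - nadir, axis=1)
    return d_neg / (d_pos + d_neg)

# Three hypothetical projects scored on the four main criteria
# (all treated as benefit criteria); weights sum to 1 as AHP
# would produce them.
scores = topsis([[7, 8, 6, 9],
                 [9, 6, 7, 7],
                 [5, 9, 8, 6]],
                w=np.array([0.4, 0.25, 0.15, 0.2]))
print(np.argsort(-scores))  # indices in best-to-worst order
```

With 13 sub-criteria, as in the study, the matrix simply gains columns; the closeness scores then give a complete ranking of all submitted projects.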

Keywords: academic projects, AHP method, research project evaluation, TOPSIS method

Procedia PDF Downloads 591