Sensory Interventions for Dementia: A Review
Authors: Leigh G. Hayden, Susan E. Shepley, Cristina Passarelli, William Tingo
Abstract:
Introduction: Sensory interventions are popular therapeutic and recreational approaches for people living with all stages of dementia. However, it is unknown which sensory interventions are used to achieve which outcomes across the subtypes of dementia. Methods: To address this gap, we conducted a scoping review of sensory interventions for people living with dementia. We searched the literature for any article published in English from 1 January 1990 to 1 June 2019 on any sensory or multisensory intervention targeted at people living with any kind of dementia that reported on patient health outcomes. We did not include complex interventions in which only a small aspect was related to sensory stimulation. We searched the databases Medline, CINAHL, and PsycARTICLES using our institutional discovery layer. We conducted all screening in duplicate to reduce Type 1 and Type 2 errors. The data from all included papers were extracted by one team member and audited by another to ensure consistency of extraction and completeness of data. Results: Our initial search captured 7654 articles; the removal of duplicates (n=5329), articles that did not pass title and abstract screening (n=1840) and those that did not pass full-text screening (n=281) resulted in 174 included articles. The countries with the highest publication output in this area were the United States (n=59), the United Kingdom (n=26) and Australia (n=15). The most common types of intervention were music therapy (n=36), multisensory rooms (n=27) and multisensory therapies (n=25). Seven articles were published in the 1990s, 55 in the 2000s, and the remainder (n=112) since 2010. Discussion: Multisensory rooms have been present in the literature since the early 1990s. More recently, nature/garden therapy, art therapy, and light therapy have emerged in the literature since 2008, an indication of the increasingly diverse scholarship in the area.
The least popular type of intervention is a traditional food intervention. Taste as a sensory intervention is generally avoided for safety reasons; however, it shows potential for increasing quality of life. Agitation, behavior, and mood are common outcomes for all sensory interventions, whereas light therapy commonly targets sleep. The majority of studies (n=110) have very small sample sizes (n=20 or fewer), an indicator of the lack of robust data in the field. Additional small-scale studies of the known sensory interventions will likely do little to advance the field. Rather, there is a need for multi-armed studies that directly compare sensory interventions, and for more studies that investigate layering sensory interventions (for example, adding an aromatherapy component to a lighting intervention). In addition, large-scale studies that enroll people at early stages of dementia will help us better understand the potential of sensory and multisensory interventions to slow the progression of the disease.
Keywords: sensory interventions, dementia, scoping review
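As a quick consistency check on the publication counts reported in the review, the per-decade totals can be tallied; this is a minimal sketch of the arithmetic, not part of the original analysis.

```python
# Included articles by publication decade, as reported in the review.
counts_by_decade = {"1990s": 7, "2000s": 55, "2010s and later": 112}

total_included = sum(counts_by_decade.values())
print(total_included)  # 174, matching the number of included articles
```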
Procedia PDF Downloads 134
Genetic Diversity of Termite (Isoptera) Fauna of Western Ghats of India
Authors: A. S. Vidyashree, C. M. Kalleshwaraswamy, R. Asokan, H. M. Mahadevaswamy
Abstract:
Termites are vital ecological players in tropical ecosystems, having been designated "ecosystem engineers" for their significant role in providing soil ecosystem services. Despite their importance, our understanding of many of their basic biological processes is extremely limited. Developing a better understanding of termite biology depends on consistent species identification. At present, identification of termites relies on the soldier caste, but for many species the soldier caste has not been reported, which creates confusion in identification. The use of molecular markers may be helpful in estimating phylogenetic relatedness between termite species and genetic differentiation among local populations within each species. To investigate this, termite samples were collected from various places in the Western Ghats, covering the four states of Karnataka, Kerala, Tamil Nadu and Maharashtra, during 2013-15. Termite samples were identified based on their morphological characteristics, molecular characteristics, or both. The survey of the termite fauna of Karnataka, Kerala, Maharashtra and Tamil Nadu indicated the presence of 16 species belonging to four subfamilies in two families, viz. Rhinotermitidae and Termitidae. Termitidae was the dominant family, comprising four subfamilies, viz. Macrotermitinae, Amitermitinae, Nasutitermitinae and Termitinae. Amitermitinae had three species, namely Microcerotermes fletcheri, M. pakistanicus and Speculitermes sinhalensis. Macrotermitinae had the highest number of species, belonging to two genera, Microtermes and Odontotermes. The genus Microtermes was represented by a single species, Microtermes obesi. The genus Odontotermes was represented by the highest number of species (7): O. obesus was the dominant (41 per cent) and most widely distributed species in Karnataka, Kerala, Maharashtra and Tamil Nadu, followed by O. feae (19 per cent), O. assmuthi (11 per cent) and others such as O. bellahunisensis, O. horni, O. redemanni and O. yadevi. Nasutitermitinae was represented by two species in two genera, Nasutitermes anamalaiensis and Trinervitermes biformis. The subfamily Termitinae was represented by Labiocapritermes distortus. Rhinotermitidae was represented by a single subfamily, Heterotermitinae, in which two species, Heterotermes balwanthi and H. malabaricus, were recorded. The genetic relationships among termites collected from various locations in the Western Ghats of India were characterized based on mitochondrial DNA sequences (12S, 16S, and COII). Sequence analysis and divergence among the species were assessed. These results suggest that the use of both molecular and morphological approaches is crucial in ensuring accurate species identification. Efforts were made to understand their evolution and to address ambiguities in morphological taxonomy. The implications of the study for revising the taxonomy of Indian termites, their characterization and molecular comparisons between the sequences are discussed.
Keywords: isoptera, mitochondrial DNA sequences, rhinotermitidae, termitidae, Western Ghats
Interacting with Multi-Scale Structures of Online Political Debates by Visualizing Phylomemies
Authors: Quentin Lobbe, David Chavalarias, Alexandre Delanoe
Abstract:
The ICT revolution has given birth to an unprecedented world of digital traces and has impacted a wide range of knowledge-driven domains such as science, education and policy making. Nowadays, we are fed daily with unlimited flows of articles, blogs, messages, tweets, etc. The internet itself can thus be considered an unsteady hypertextual environment where websites emerge and expand every day. But there are structures inside knowledge. A given text can always be studied in relation to others or in light of a specific socio-cultural context. By way of their textual traces, human beings are calling out to each other: hypertext citations, retweets, vocabulary similarity, etc. We are in fact the architects of a giant web of elements of knowledge whose structures and shapes convey their own information. The global shapes of these digital traces represent a source of collective knowledge, and the question of their visualization remains an open challenge. How can we explore, browse and interact with such shapes? In order to navigate across these growing constellations of words and texts, interdisciplinary innovations are emerging at the crossroads between the social and computational sciences. In particular, complex-systems approaches now make it possible to reconstruct the hidden structures of textual knowledge by means of multi-scale objects of research such as semantic maps and phylomemies. Phylomemy reconstruction is a generic method related to the co-word analysis framework. Phylomemies aim to reveal the temporal dynamics of large corpora of textual contents by performing inter-temporal matching on extracted knowledge domains in order to identify their conceptual lineages. This study addresses the question of visualizing the global shapes of online political discussions related to the French presidential and legislative elections of 2017.
We aim to build phylomemies on top of a dedicated collection of thousands of French political tweets enriched with archived contemporary news web articles. Our goal is to reconstruct the temporal evolution of the online debates fueled by each political community during the elections. To that end, we introduce an iterative data exploration methodology implemented and tested within the free software Gargantext. There we combine synchronic and diachronic axes of visualization to reveal the dynamics of our corpora of tweets and web pages as well as their inner syntagmatic and paradigmatic relationships. In doing so, we aim to provide researchers with innovative methodological means to explore online semantic landscapes in a collaborative and reflective way.
Keywords: online political debate, French election, hyper-text, phylomemy
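The inter-temporal matching step described in the abstract links knowledge domains of one period to their conceptual descendants in the next. A minimal sketch of the idea, assuming domains are represented as term sets compared by Jaccard similarity (the similarity measure and threshold here are illustrative, not the Gargantext implementation):

```python
def jaccard(a: set, b: set) -> float:
    """Similarity between two knowledge domains represented as term sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def match_domains(period_t, period_t1, threshold=0.3):
    """Link each domain of period t to its closest successor in period t+1,
    forming the conceptual lineages of a phylomemy."""
    lineages = []
    for name_t, terms_t in period_t.items():
        best = max(period_t1.items(),
                   key=lambda kv: jaccard(terms_t, kv[1]),
                   default=None)
        if best and jaccard(terms_t, best[1]) >= threshold:
            lineages.append((name_t, best[0]))
    return lineages

# Toy example: two time periods of a political-debate corpus
t0 = {"labour": {"work", "pension", "reform"}, "security": {"police", "terror"}}
t1 = {"labour-reform": {"work", "reform", "strike"},
      "security": {"police", "terror", "border"}}
print(match_domains(t0, t1))
```

Each matched pair is one edge of a lineage; chaining these matches across consecutive periods yields the branching structure that a phylomemy visualizes.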
Molecular Dynamics Simulation of Realistic Biochar Models with Controlled Microporosity
Authors: Audrey Ngambia, Ondrej Masek, Valentina Erastova
Abstract:
Biochar is an amorphous carbon-rich material generated from the pyrolysis of biomass, with multifarious properties and functionality. Biochar has proven applications in the treatment of flue gas and of organic and inorganic pollutants in soil and water/wastewater, as a result of its multiple surface functional groups and porous structure. These properties have also shown potential in energy storage and carbon capture. The availability of diverse sources of biomass to produce biochar has increased interest in it as a sustainable and environmentally friendly material. The properties and porous structures of biochar vary depending on the type of biomass and the high heat treatment temperature (HHT). Biochars produced at HHT between 400 °C and 800 °C generally have lower H/C and O/C ratios, and higher porosities, larger pore sizes and higher surface areas with increasing temperature. While much of this is known experimentally, there is little knowledge of the role porous structure and functional groups play in processes occurring at the atomistic scale, which are extremely important for the optimization of biochar for applications, especially the adsorption of gases. Atomistic simulation methods have shown the potential to generate such amorphous materials; however, most of the models available are composed of only carbon atoms or graphitic sheets, which are very dense or contain only simple slit pores, and which ignore the important role of heteroatoms such as O, N and S, and of pore morphology. Hence, developing realistic models that integrate these parameters is important to understand their role in governing adsorption mechanisms, which will aid in guiding the design and optimization of biochar materials for target applications. In this work, molecular dynamics simulations in the isobaric ensemble are used to generate realistic biochar models, taking into account experimentally determined H/C, O/C and N/C ratios, aromaticity, micropore size range, micropore volume and true density of biochars.
A pore-generation approach was developed using virtual atoms: Lennard-Jones spheres of varying van der Waals radius and softness. A virtual atom's interaction with the biochar matrix via a soft-core potential allows the creation of pores with rough surfaces, while varying the van der Waals radius parameter gives control over the pore-size distribution. We focused on microporosity, creating average pore sizes of 0.5-2 nm in diameter and pore volumes in the range of 0.05-1 cm3/g, which corresponds to the experimental gas-adsorption micropore sizes of amorphous porous biochars. Realistic biochar models with surface functionalities, micropore size distributions and pore morphologies were developed; they could aid the study of adsorption processes in confined micropores.
Keywords: biochar, heteroatoms, micropore size, molecular dynamics simulations, surface functional groups, virtual atoms
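The key property of the soft-core interaction used for the virtual atoms is that it stays finite at zero separation, so a pore-forming sphere can be "grown" inside a dense matrix without numerical blow-up. A sketch of one common soft-core Lennard-Jones form (the exact functional form and parameters used in the study are not specified in the abstract; this is an illustrative variant):

```python
def soft_core_lj(r, epsilon=1.0, sigma=1.0, alpha=0.5):
    """One common soft-core Lennard-Jones form:
    V(r) = 4*eps * [ (sigma^6 / (alpha*sigma^6 + r^6))^2
                     - sigma^6 / (alpha*sigma^6 + r^6) ].
    Finite at r = 0 (unlike standard 12-6 LJ); alpha = 0 recovers
    the ordinary Lennard-Jones potential."""
    s6 = sigma ** 6 / (alpha * sigma ** 6 + r ** 6)
    return 4.0 * epsilon * (s6 ** 2 - s6)

# Finite, purely repulsive value at the origin (standard LJ diverges here):
print(soft_core_lj(0.0))  # 8.0 with the defaults above
```

Tuning `alpha` softens the core (how gently the sphere pushes the matrix aside), while `sigma` plays the role of the van der Waals radius that sets the pore size.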
Next-Generation Disability Management: Diverse and Inclusive Strategies for All
Authors: Nidhi Malshe
Abstract:
Background: Currently, there are approximately 1.3 billion individuals worldwide living with significant disabilities, accounting for 16% of the global population, or about 1 in 6 people. As the global population continues to grow, so does the number of people experiencing disabilities. Traffic accidents alone contribute to millions of injuries and disabilities each year, particularly among young people. Additionally, as life expectancy rises, more individuals are likely to experience disabilities in their later years. In 2022, 27.0% of Canadians aged 15 and over, or 8 million people, had at least one disability, an increase of 4.7 percentage points from 2017. A person with a disability earns 21.4% less on average than a person without a disability. Using innovative and inclusive methods for accommodations, disability management, and employment, we can progress towards inclusive workplaces and potential income parity for this equity-seeking population. Objective: This study embraces innovative and inclusive approaches to disability management, thereby unlocking the advantages associated with a) fostering equal opportunities for all individuals, b) facilitating streamlined accommodations and making it easier for companies to accommodate people with disabilities, and c) harnessing diverse perspectives to drive innovation and enhance overall productivity. Methodology: Literature review and assessment of specific needs and requirements in the workplace: a) encourage out-of-the-box thinking about potential workplace accommodations based on the specific needs of individuals, e.g., propose prolonged integration post-disability; b) perform a cost-benefit analysis of early return-to-work interventions vs. duration on disability; c) expand the scope of vocational assessment/retraining, e.g., retraining a person with a permanent physical impairment to become a video game coder.
d) Leverage the use of technology when planning return to work, e.g., speech-to-text software for persons with voice impairments. Hypothesized Results: A prolonged, gradual return-to-work progression increases the potential for sustainable and productive employment. Co-develop a person-centric accommodation plan based on reported functional abilities, and apply pioneering methods for extending accommodations to prevent secondary disabilities. Facilitate a sense of belonging by providing employees with benefits and initiatives that honor their unique contributions. Engage individuals with disabilities as active members of the planning committee to ensure the development of innovative and inclusive accommodations that address the needs of all. Conclusion: The global pandemic underscored the need for creativity in our daily routines. It is imperative to integrate the lessons learned from the pandemic and apply them within employment and return-to-work processes. These learnings can also be used to develop creative, distinct methods to ensure equal opportunities for everyone.
Keywords: disability management, diversity, inclusion, innovation
Islam and Democracy: A Paradoxical Study of Syed Maududi and Javed Ghamidi
Authors: Waseem Makai
Abstract:
The term ‘political Islam’ now seems to have gained centre stage in every discourse pertaining to Islamic legitimacy and compatibility in modern civilisations. A never-ceasing tradition of the philosophy of the caliphate, which has kept overriding the options of any alternative political institution in the Muslim world, still permeates a huge faction of believers. Fully accustomed to the proliferation of changes and developments in the individual, social and natural dispositions of the world, Islamic theologians responded to this flux through both conventional and modernist approaches. The so-called conventional approach was quintessentially that of the interpretations put forth by Syed Maududi, with a comprehensive, academic and powerful vigour never seen before. He generated avant-garde scholarship that would bear testimony to his statements, made to uphold the political institution of Islam as supreme and noble. However, it was not his trait to challenge the established views but to codify them in such a bracket that a man of the 20th century would find captivating to his heart and satisfactory to his rationale. Delicate questions such as the selection of a caliph, the implementation of Islamic commandments (Sharia), interest-free banking, imposing a tax (Jizya) on non-believers, waging holy war (Jihad) for the expansion of Islamic boundaries, stoning for adultery and capital punishment for apostates were all there in his scholarship, which he spent the whole of his life defending in the best possible manner. What and where he went wrong with all this was, supposedly, to be pointed out later by his one-time disciple, Javed Ahmad Ghamidi. Ghamidi is accused of struggling between Scylla and Charybdis as he tries to remain steadfast to his basic Islamic tenets while modernising their interpretations to bring them into harmony with the Western ideals of democracy and liberty.
His blatant acknowledgement of putting democracy on a high pedestal, his calling the implementation of Sharia a non-mandatory task, and his refusal to bracket people into the categories of Zimmi and Kaafir fully vindicate his stance against conventional narratives like that of Syed Maududi. Ghamidi goes to the extent of attributing current forms of radicalism and extremism, as exemplified in the operations of organisations like ISIS in Iraq and Syria and Tehreek-e-Taliban in Pakistan, to the version of political Islam upheld not only by Syed Maududi but also by other prominent theologians such as Ibn Taymiyyah, Syed Qutb and Dr. Israr Ahmad. Ghamidi's position has cost him dearly: his allegedly insubstantial claims earned him enough hostility that he left his homeland after two of his close allies were brutally murdered. Syed Maududi and Javed Ghamidi stand poles apart in their understanding of Islam and its political domain. Which of them has the more appropriate methodology, scholarship and execution in his mode of comprehension is an intriguing question, worth pursuing in detail.
Keywords: caliphate, democracy, Ghamidi, Maududi
Valorisation of Food Waste Residue into Sustainable Bioproducts
Authors: Krishmali N. Ekanayake, Brendan J. Holland, Colin J. Barrow, Rick Wood
Abstract:
Globally, more than one-third of all food produced is lost or wasted, equating to 1.3 billion tonnes per year. Around 31.2 million tonnes of food waste are generated across the production, supply, and consumption chain in Australia. Generally, food waste management adopts environmentally friendly and more sustainable approaches such as composting, anaerobic digestion and energy-recovery technologies. However, unavoidable and non-recyclable food waste ends up in landfill or incineration, which involve many undesirable environmental impacts and challenges. A biorefinery approach contributes to a waste-minimising circular economy by converting food and other organic biomass waste into valuable outputs, including feeds, nutrition, fertilisers, and biomaterials. As a solution, Green Eco Technologies has developed a food waste treatment process using the WasteMaster system. The system uses charged oxygen and moderate temperatures to convert food waste, without bacteria, additives, or water, into a virtually odour-free, much-reduced quantity of reusable residual material. In the context of a biorefinery, the WasteMaster dries and mills food waste into a form suitable for storage or downstream extraction/separation/concentration to create products. The focus of the study is to determine the nutritional composition of WasteMaster-processed residue in order to potentially develop aquafeed ingredients. The global aquafeed industry is projected to reach a high market value in future, reflecting high demand for aquafeed products. Food waste can therefore be utilised for aquaculture feed development while reducing landfill. This framework will lessen the requirement for raw crop cultivation for aquafeed development and reduce the aquaculture footprint. In the present study, the nutritional elements of the processed residue are consistent with the input food waste type, which shows that the WasteMaster does not affect the expected nutritional distribution.
The macronutrient retention values of protein, lipid, and nitrogen-free extract (NFE) were >85%, >80%, and >95%, respectively. Sensitive food components, including omega-3 and omega-6 fatty acids, amino acids, and phenolic compounds, were found intact in each residue material. Preliminary analysis suggests price comparability with current aquafeed ingredient costs, supporting economic feasibility. The results suggest high potential for aquafeed development, with the residue supplying 5 to 10% of the ingredients to replace or partially substitute other less sustainable ingredients in a biorefinery setting. Our aim is to improve the sustainability of aquaculture and reduce the environmental impacts of food waste.
Keywords: biorefinery, food waste residue, input, WasteMaster
Urban Open Source: Synthesis of a Citizen-Centric Framework to Design Densifying Cities
Authors: Shaurya Chauhan, Sagar Gupta
Abstract:
Prominent urbanizing centres across the globe, such as Delhi, Dhaka, and Manila, have exhibited that development often struggles to bridge the gap between the top-down collective requirements of the city and the bottom-up individual aspirations of an ever-diversifying population. When this exclusion is intertwined with rapid urbanization and a diversifying urban demography, unplanned sprawl, poor planning, and low-density development emerge as the default responses. In parallel, new ideas and methods of densification and public participation are being widely adopted as sustainable alternatives for the future of urban development. This research advocates a collaborative design method for future development: one that allows rapid application through its prototypical nature and an inclusive approach that mediates between the 'user' and the 'urban', purely through empirical tools. Building upon the concepts and principles of 'open-sourcing' in design, the research establishes a design framework that serves current user requirements while allowing for future citizen-driven modifications. This is synthesized as a 3-tiered model: user needs – design ideology – adaptive details. The research culminates in a context-responsive 'open source project development framework' (hereinafter referred to as OSPDF) that can be used for on-ground field applications. To bring forward specifics, the research examines a 300-acre redevelopment in the core of a rapidly urbanizing city as a case encompassing extreme physical, demographic, and economic diversity. The suggested measures also integrate the region's cultural identity and social character with the diverse citizen aspirations, using architecture and urban design tools and references from recognized literature.
This framework, based on a vision – feedback – execution loop, is used for hypothetical development at the five prevalent scales of design: master planning, urban design, architecture, tectonics, and modularity, in chronological order. At each of these scales, the possible approaches and avenues for open-sourcing are identified, validated through trial and error, and subsequently recorded. The research attempts to re-calibrate the architectural design process and make it more responsive and people-centric. Analytical tools such as Space, Event, and Movement by Bernard Tschumi and the Five-Point Mental Map by Kevin Lynch, among others, are deeply rooted in the research process. In addition to the five-part OSPDF, a two-part subsidiary process is suggested after each cycle of application, for continued appraisal and refinement of the framework and the urban fabric over time. The research is an exploration of the possibilities for an architect to adopt the new role of a 'mediator' in the development of contemporary urbanity.
Keywords: open source, public participation, urbanization, urban development
Multi-Label Approach to Facilitate Test Automation Based on Historical Data
Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally
Abstract:
The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option for tackling the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, at odds with the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation for predicting the implementation of unknown test case specifications. With this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems.
The most prominent EC is 'subset accuracy'. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which is better than current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
Keywords: machine learning, multi-class, multi-label, supervised learning, test automation
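Subset accuracy, the headline evaluation criterion above, counts a prediction as correct only when the full predicted label set matches the true label set exactly; a minimal sketch (the label names are illustrative, not from the evaluated systems):

```python
def subset_accuracy(y_true, y_pred):
    """Fraction of samples whose predicted label set equals the true set exactly.
    For multi-label data this is the strictest criterion: one wrong or missing
    label makes the whole sample count as incorrect."""
    assert len(y_true) == len(y_pred)
    exact = sum(set(t) == set(p) for t, p in zip(y_true, y_pred))
    return exact / len(y_true)

# Toy example: each test step maps to a set of automation components
truth = [{"init"}, {"init", "check"}, {"teardown"}]
preds = [{"init"}, {"init"}, {"teardown"}]
print(subset_accuracy(truth, preds))  # 2 of 3 exact matches
```

This strictness explains why the multi-label figure (60%) sits well below the multi-class figure (86%): a partially correct component set earns no credit.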
Valorization of Banana Peels for Mercury Removal in Environmentally Realistic Conditions
Authors: E. Fabre, C. Vale, E. Pereira, C. M. Silva
Abstract:
Introduction: Mercury is one of the most troublesome toxic metals responsible for the contamination of aquatic systems, due to its accumulation and biomagnification along the food chain. The 2030 Agenda for Sustainable Development of the United Nations promotes improving water quality by reducing water pollution and calls for enhanced wastewater treatment, encouraging recycling and safe water reuse globally. Sorption processes are widely used in wastewater treatment due to their many advantages, such as high efficiency and low operational costs. In these processes, the target contaminant is removed from solution by a solid sorbent; the more selective and low-cost the biosorbent, the more attractive the process becomes. Agricultural wastes are especially attractive candidates for sorption: they are largely available, have no commercial value and require little or no processing. In this work, banana peels were tested for mercury removal from low-concentration solutions. In order to investigate the applicability of this solid, six water matrices were used, increasing in complexity from natural waters to a real wastewater. Studies of kinetics and equilibrium were also performed using the best-known models to evaluate the viability of the process. In line with the concept of the circular economy, this study adds value to this by-product and contributes to liquid waste management. Experimental: The solutions were prepared with a Hg(II) initial concentration of 50 µg L-1 in natural waters, at 22 ± 1 ºC, pH 6, magnetically stirred at 650 rpm, with a biosorbent mass of 0.5 g L-1. NaCl was added to obtain the salt solutions, seawater was collected from the Portuguese coast, and the real wastewater was kindly provided by ISQ - Instituto de Soldadura e Qualidade (Welding and Quality Institute) and diluted to the same concentration of 50 µg L-1. Banana peels were freeze-dried, milled and sieved, and the particles < 1 mm were used.
Results: Banana peels removed more than 90% of Hg(II) from all the synthetic solutions studied. In these cases, increasing the complexity of the water matrix promoted higher mercury removal. In salt waters, the biosorbent showed removals of 96%, 95% and 98% for 3, 15 and 30 g L-1 of NaCl, respectively. The residual concentration of Hg(II) in solution reached the level of the drinking water regulation (1 µg L-1). For the real matrices, the lower Hg(II) elimination (93% for seawater and 81% for the real wastewater) can be explained by competition between the Hg(II) ions and the other elements present in these solutions for the sorption sites. Regarding the equilibrium study, the experimental data are better described by the Freundlich isotherm (R^2 = 0.991). The Elovich equation provided the best fit to the kinetic points. Conclusions: The results exhibited the great ability of banana peels to remove mercury. The environmentally realistic conditions studied in this work highlight their potential use as biosorbents in water remediation processes.
Keywords: banana peels, mercury removal, sorption, water treatment
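The Freundlich isotherm that best described the equilibrium data has the form q_e = K_F * C_e^(1/n); a small sketch of it together with the removal-percentage calculation used to report the results (the K_F and n values below are illustrative placeholders, since the abstract reports only the fit quality, R^2 = 0.991):

```python
def freundlich(ce, kf, n):
    """Freundlich isotherm: sorbed amount q_e = K_F * C_e**(1/n)
    as a function of the equilibrium concentration C_e."""
    return kf * ce ** (1.0 / n)

def removal_percent(c0, ce):
    """Percentage of Hg(II) removed from solution,
    given initial (c0) and residual (ce) concentrations."""
    return 100.0 * (c0 - ce) / c0

# Reported conditions: initial 50 ug/L, residual down to 1 ug/L
print(removal_percent(50.0, 1.0))  # 98.0, matching the best salt-water removal
```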
Multilevel Regression Model - Evaluate Relationship Between Early Years' Activities of Daily Living and Alzheimer's Disease Onset Accounting for Influence of Key Sociodemographic Factors Using a Longitudinal Household Survey Data
Authors: Linyi Fan, C.J. Schumaker
Abstract:
Background: Biomedical efforts to treat Alzheimer's disease (AD) have typically produced mixed to poor results, while more lifestyle-focused treatments such as exercise may fare better than existing biomedical treatments. A few promising studies have indicated that activities of daily living (ADL) may be a useful way of predicting AD. However, the existing cross-sectional studies fail to show how functional issues such as early years' ADL predict AD and how social factors influence health either in addition to or in interaction with individual risk factors. This study would help better screening and early treatment for the elderly population and healthcare practice. The findings have academic and practical significance in terms of creating positive social change. Methodology: The purpose of this quantitative, historical, correlational study was to examine the relationship between early years' ADL and the development of AD in later years. The study included 4,526 participants derived from the RAND HRS dataset. The Health and Retirement Study (HRS) is a longitudinal household survey data set that is available for research on retirement and health among the elderly in the United States. The sample was selected by completion of the survey questionnaire about AD and dementia. The variable indicating whether the participant had been diagnosed with AD was the dependent variable. The ADL indices and changes in ADL were the independent variables. A four-step multilevel regression model approach was utilized to address the research questions. Results: Amongst the 4,526 patients who completed the AD and dementia questionnaire, 144 (3.1%) were diagnosed with AD. Of the 4,526 participants, 3,465 (76.6%) had high school or higher education degrees, and 4,074 (90.0%) were above the poverty threshold. The model evaluated the effect of ADL and change in ADL on the onset of AD in late years while allowing the intercept of the model to vary by level of education.
The results suggested that the only significant predictor of the onset of AD was change in early years’ ADL (b = 20.253, z = 2.761, p < .05). However, the sensitivity analysis (b = 7.562, z = 1.900, p = .058), which included more control variables and a longer ADL observation period, did not support this finding. The model also estimated whether the variances of the random effects varied by Level-2 variables. The results suggested that the variances associated with the random slopes were approximately zero, indicating that the relationship between early years’ ADL and AD onset was not influenced by sociodemographic factors. Conclusion: The findings indicated that an increase in change in ADL leads to an increase in the probability of AD onset in the future. However, this finding was not supported in the broader observation period model. The study also failed to reject the hypothesis that the sociodemographic factors explained significant amounts of variance in the random effects. Recommendations were then made for future research and practice based on these limitations and the significance of the findings. Keywords: Alzheimer’s disease, epidemiology, moderation, multilevel modeling
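As a purely illustrative sketch of the modelling setup described above: the snippet below simulates hypothetical data (not the RAND HRS data) in which the probability of an AD diagnosis rises with the change in an early-years ADL index, and fits a plain single-level logistic regression by gradient descent as a stand-in for the authors' four-step multilevel model. All variable names and numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: AD prevalence near 3%, risk increasing with ADL change.
n = 4526
adl_change = rng.normal(0.0, 0.1, n)           # change in early-years ADL index
logit_p = -3.4 + 8.0 * adl_change              # intercept chosen for ~3% prevalence
ad = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

# Single-level logistic regression fitted by gradient descent on the
# mean log-loss (a simplification of the paper's multilevel approach).
X = np.column_stack([np.ones(n), adl_change])
y = ad.astype(float)
beta = np.zeros(2)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta -= 0.5 * (X.T @ (p - y)) / n          # gradient step on mean log-loss

print(beta)  # beta[1] should recover a positive ADL-change effect
```

In the paper's actual analysis the intercept is additionally allowed to vary by education level; the sketch only shows the direction of the individual-level association.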
Procedia PDF Downloads 135
548 Kansei Engineering Applied to the Design of Rural Primary Education Classrooms: Design-Based Learning Case
Authors: Jimena Alarcon, Andrea Llorens, Gabriel Hernandez, Maritza Palma, Lucia Navarrete
Abstract:
The research is funded by the Government of Chile and is focused on defining a rural primary classroom design that stimulates creativity. The relevance of the study lies in its capacity to define educational spaces adequate for the implementation of the design-based learning (DBL) methodology. This methodology promotes creativity and teamwork, generating a meaningful learning experience for students based on the appreciation of their environment and the generation of projects that contribute positively to their communities; it is also an inquiry-based form of learning built on the integration of design thinking and the design process into the classroom. The main goal of the study is to define the design characteristics of rural primary school classrooms associated with the implementation of the DBL methodology. Along with the change in learning strategies, it is necessary to change the educational spaces in which they develop. The hypothesis indicates that a change in the space and equipment of the classrooms, based on the emotions of the students, will motivate better learning results under the new methodology. In this case, the pedagogical dynamics require substantial interaction among the participants, as well as an environment favorable to creativity. Methodologies from Kansei engineering are used to identify the emotional variables associated with its definition. The study was conducted with 50 students between 6 and 10 years old (average age of seven years), 48% male and 52% female. Virtual three-dimensional scale models and semantic differential tables were used. To define the semantic differential, self-applied surveys were carried out. Each survey consists of eight separate questions in two groups: questions of type A, aimed at identifying desirable emotions, and questions of type B, related to emotions. Both question types offer a maximum of three answer alternatives. Data were tabulated with IBM SPSS Statistics version 19.
Terms referring to emotions were grouped into the twenty concepts with the highest presence in the surveys. To select among the values obtained from the semantic differential, the expected frequency from a chi-square test (χ²) calculated for the classroom space was taken as the lower limit: all terms above the expected N cut-off point were included in the tables used to relate emotion and space. The chi-square contrast was statistically significant, indicating that the observed frequencies did not appear at random. The most representative terms then depend on the variable under study: a) the definition of textures and colors of vertical surfaces is associated with emotions such as tranquility, attention, concentration, and creativity; and b) the distribution of the rooms’ equipment with emotions such as happiness, distraction, creativity, and freedom. The main findings are linked to the generation of classrooms according to diverse DBL team dynamics. Kansei engineering is an appropriate methodology for identifying the emotions that students want to feel in the classroom space. Keywords: creativity, design-based learning, education spaces, emotions
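The term-selection rule described above can be sketched as follows. This is a hedged illustration with invented term counts (not the study's survey data): the expected frequency under a uniform null serves both as the chi-square reference and as the cut-off for retaining terms.

```python
import numpy as np

# Hypothetical counts of emotion-related concepts across survey responses.
terms = {"tranquility": 31, "creativity": 27, "attention": 22,
         "freedom": 14, "boredom": 6, "fear": 2}
observed = np.array(list(terms.values()), dtype=float)

# Under the null, every term is equally likely, so the expected frequency
# is the mean count; it also serves as the selection lower limit.
expected = observed.mean()
chi2 = np.sum((observed - expected) ** 2 / expected)

# Terms whose observed frequency exceeds the expected N are retained
# for the emotion-versus-space tables.
selected = [t for t, n in terms.items() if n > expected]
print(round(chi2, 2), selected)  # → 39.76 ['tranquility', 'creativity', 'attention']
```

A large chi-square statistic relative to its critical value is what supports the study's claim that the observed term frequencies are not random.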
Procedia PDF Downloads 142
547 Building Community through Discussion Forums in an Online Accelerated MLIS Program: Perspectives of Instructors and Students
Authors: Mary H Moen, Lauren H. Mandel
Abstract:
Creating a sense of community in online learning is important for student engagement and success. The integration of discussion forums within online learning environments presents an opportunity to explore how this computer-mediated communication format can cultivate a sense of community among students in accelerated master’s degree programs. This research has two aims: to delve into the ways instructors utilize this communication technology to create community, and to understand the feelings and experiences of graduate students participating in these forums with regard to its effectiveness in community building. The study takes a two-phase approach encompassing qualitative and quantitative methodologies. The data will be collected at an online accelerated Master of Library and Information Studies program at a public university in the northeastern United States. Phase 1 is a content analysis of the syllabi from all courses taught in the 2023 calendar year, which explores the format and rules governing discussion forum assignments. Four to six individual interviews of department faculty and part-time faculty will also be conducted to illuminate their perceptions of the successes and challenges of their discussion forum activities. Phase 2 will be an online survey administered to students in the program during the 2023 calendar year. Quantitative data will be collected for statistical analysis, and short-answer responses will be analyzed for themes. The survey is adapted from the Classroom Community Scale Short-Form (CCS-SF), which measures students’ self-reported feelings of connectedness and learning. The prompts will contextualize the items within students’ experience of discussion forums during the program. Short-answer responses on the challenges and successes of using discussion forums will be analyzed to gauge student perceptions and experiences of using this type of communication technology in education. This research study is in progress.
The authors anticipate that the findings will provide a comprehensive understanding of the varied approaches instructors use in discussion forums for community-building purposes in an accelerated MLIS program. They predict that the more varied, flexible, and consistent students’ uses of discussion forums are, the greater the sense of community students will report. Additionally, students’ and instructors’ perceptions and experiences within these forums will shed light on the successes and challenges faced, thereby offering valuable recommendations for enhancing online learning environments. The findings are significant because they can contribute actionable insights for instructors, educational institutions, and curriculum designers aiming to optimize the use of discussion forums in online accelerated graduate programs, ultimately fostering a richer and more engaging learning experience for students. Keywords: accelerated online learning, discussion forums, LIS programs, sense of community, g
Procedia PDF Downloads 84
546 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study
Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming
Abstract:
Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity is primarily due to its stability and robustness to model misspecification. The situation is different, however, for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach for estimating an adjusted relative risk or risk difference in clinical trials, partly due to the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing both conditional and marginal estimation approaches. We consider the log-binomial generalised linear model (GLM) with iteratively weighted least squares (IWLS) and model-based standard errors (SEs); the log-binomial GLM with convex optimisation and model-based SEs; the log-binomial GLM with convex optimisation and permutation tests; the modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all combinations of sample sizes (200, 1,000, and 5,000), outcome event rates (10%, 50%, and 80%), and covariate effects (ranging from -0.05 to 0.7) representing weak, moderate, or strong relationships.
Treatment effects (0, -0.5, and 1 on the log scale) will cover the null (H0) and alternative (H1) hypotheses to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which method(s) is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions do not arise from marginal binary data. Also, it appears that marginal standardisation and convex optimisation may perform better than the GLM IWLS log-binomial approach. Keywords: binary outcomes, statistical methods, clinical trials, simulation study
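One of the candidate methods above, marginal standardisation, can be sketched in a few lines. The snippet below is an illustration under invented parameters (not the study's simulation protocol): it simulates one hypothetical trial, fits a logistic working model by gradient descent, then predicts every participant's risk under treatment and under control and takes the ratio of the average risks as the adjusted relative risk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trial: a prognostic covariate and a treatment effect of
# 0.5 on the log-odds scale (all parameter values are assumptions).
n = 5000
x = rng.normal(size=n)                       # prognostic covariate
treat = rng.integers(0, 2, n)                # 1:1 randomisation
p_true = 1 / (1 + np.exp(-(-1.0 + 0.7 * x + 0.5 * treat)))
y = (rng.random(n) < p_true).astype(float)

# Logistic working model fitted by gradient descent on the mean log-loss.
X = np.column_stack([np.ones(n), treat, x])
beta = np.zeros(3)
for _ in range(20000):
    pr = 1 / (1 + np.exp(-X @ beta))
    beta -= 1.0 * (X.T @ (pr - y)) / n

# Marginal standardisation: average predicted risk with everyone treated
# versus everyone untreated, then take the ratio.
X1, X0 = X.copy(), X.copy()
X1[:, 1], X0[:, 1] = 1, 0
risk1 = (1 / (1 + np.exp(-X1 @ beta))).mean()
risk0 = (1 / (1 + np.exp(-X0 @ beta))).mean()
print(risk1 / risk0)  # adjusted relative risk
```

In the study itself this point estimate would be paired with delta-method or permutation-test standard errors, which the sketch omits.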
Procedia PDF Downloads 114
545 Survey of the Literacy by Radio Project as an Innovation in Literacy Promotion in Nigeria
Authors: Stella Chioma Nwizu
Abstract:
The National Commission for Mass Literacy, Adult and Non-Formal Education (NMEC) in Nigeria is charged with reducing the illiteracy rate through the development, monitoring, and supervision of literacy programmes in Nigeria. In spite of various efforts by NMEC, the literature still shows that the illiteracy rate remains high. According to NMEC/UNICEF, about 60 million Nigerians are non-literate, and nearly two-thirds of them are women. This situation forced the government to search for innovative and better approaches to literacy promotion and delivery. The literacy-by-radio project was adopted as an innovative intervention in literacy delivery in Nigeria because radio is the cheapest and most easily affordable medium for non-literates. The project aimed at widening access to literacy programmes for non-literate, marginalized, and disadvantaged groups in Nigeria by taking literacy programmes to their doorsteps. Literacy by radio has worked well for illiteracy reduction in Cuba. This innovative intervention is anchored in Rogers’ diffusion of innovations theory. The literacy-by-radio project has been running for fifteen years, and the efficacy and contributions of this innovation need to be investigated. Thus, the purpose of this research is to review the contributions of literacy by radio in Nigeria. The researcher adopted the survey research design for the study. The population for the study consisted of 2,706 participants and 47 facilitators of the literacy-by-radio programme in the 10 pilot states in Nigeria. A sample of four states, comprising 302 participants and eight facilitators, was used for the study. Information was collected through focus group discussions (FGDs), interviews, and content analysis of official documents. The data were analysed qualitatively to review the contributions of the literacy-by-radio project and determine the efficacy of this innovative approach in facilitating literacy in Nigeria.
Results from the field experience showed, among others, that more non-literates have better access to literacy programmes through this innovative approach. The pilot project was 88% successful; no fewer than 2,110 adults were made literate through the literacy-by-radio project in 2017. However, lack of enthusiasm and commitment on the part of the technical committee and facilitators due to non-payment of honoraria, poor signals from radio stations, interruption of lectures by adverts, and low community involvement in project decision-making are challenges to the success rate of the project. The researcher acknowledges the need to customize all materials and broadcasts in all the dialects of the participants and to include more civil rights, environmental protection, and agricultural skills in the project. The study recommends, among others, improved and timely funding of the project by the Federal Government to enable NMEC to fulfil its obligations towards the greater success of the programme, the setting up of independent radio stations for airing the programmes, and proper monitoring and evaluation of the project by NMEC and state agencies for greater effectiveness. In an era of the knowledge-driven economy, no one should be left saddled with the weight of illiteracy. Keywords: innovative approach, literacy, project, radio, survey
Procedia PDF Downloads 65
544 Method for Requirements Analysis and Decision Making for Restructuring Projects in Factories
Authors: Rene Hellmuth
Abstract:
The requirements for factory planning and the buildings concerned have changed in recent years. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is gaining importance as a means of maintaining the competitiveness of a factory. Restrictions regarding new areas, shorter life cycles of products and production technology, as well as a VUCA world (volatility, uncertainty, complexity, and ambiguity), lead to more frequent rebuilding measures within a factory. Restructuring is the most common planning case for factories today, more common than new construction, revitalization, or dismantling. The increasing importance of restructuring processes shows that the ability to change was, and remains, a promising concept for companies reacting to permanently changing conditions. The factory building is the basis for most changes within a factory. If an adaptation of a construction project (factory) is necessary, the inventory documents must be checked, and often time-consuming planning of the adaptation must take place to define and finally evaluate the relevant components to be adapted. The different requirements of the planning participants from the disciplines of factory planning (production planners, logistics planners, automation planners) and industrial construction planning (architects, civil engineers) come together during reconstruction and must be structured. This raises the research question: which requirements do the disciplines involved in reconstruction planning place on a digital factory model? A subordinate research question is: how can model-based decision support be provided for a more efficient design of the conversion within a factory?
Because of the high adaptation rate of factories and their buildings described above, a methodology for restructuring factories, based on the requirements engineering method from software development, is conceived and designed for practical application in factory restructuring projects. The explorative research procedure according to Kubicek is applied; explorative research is suitable when the practical usability of the research results has priority. Furthermore, it will be shown how best to use a digital factory model in practice. The focus will be on mobile applications to meet the needs of factory planners on site. An augmented reality (AR) application will be designed and created to provide decision support for planning variants. The aim is to contribute to a shortening of the planning process and to model-based decision support for more efficient change management. This requires the application of a methodology that reduces the deficits of the existing approaches. Time and cost expenditure are represented in the AR tablet solution on the basis of a building information model (BIM). Overall, the requirements of those involved in the planning process for a digital factory model in the case of restructuring within a factory are thus first determined in a structured manner. The results are then applied and transferred to a construction-site solution based on augmented reality. Keywords: augmented reality, digital factory model, factory planning, restructuring
Procedia PDF Downloads 134
543 Precursor Synthesis of Carbon Materials with Different Aggregates Morphologies
Authors: Nikolai A. Khlebnikov, Vladimir N. Krasilnikov, Evgenii V. Polyakov, Anastasia A. Maltceva
Abstract:
Carbon materials with advanced surfaces are widely used both in modern industry and in environmental protection. The physical-chemical nature of these materials is determined by the morphology of the primary atomic and molecular carbon structures, which are the basis for synthesizing zero-dimensional (fullerenes), one-dimensional (fibers, tubes), two-dimensional (graphene), and three-dimensional (multi-layer graphene, graphite, foams) carbon nanostructures with unique physical-chemical and functional properties. Experience shows that the microscopic morphological level is the basis for the creation of the next, mesoscopic morphological level, whose peculiarity is the dependence of the morphology on the chemical route and process prehistory (crystallization, colloid formation, liquid-crystal state, and others). These factors determine the consumer properties of carbon materials, such as specific surface area, porosity, chemical resistance in corrosive environments, and catalytic and adsorption activities. Based on the developed ideology of precursor synthesis, the authors discuss an approach to controlling the porosity of carbon-containing materials with a given aggregate morphology. The low-temperature thermolysis of precursors in a gas environment of a given composition is the basis of this idea. The carbothermic precursor synthesis of two different compounds, tungsten carbide WC:nC and zinc oxide ZnO:nC, each containing an impurity phase in the form of free carbon, was selected as the subject of the research. In the first case, the object of synthesis was a carbide-forming transition metal (tungsten); in the second case, zinc, which does not form carbides, was selected. The synthesis of both kinds of transition metal compounds was conducted by the method of precursor carbothermic synthesis from an organic solution.
ZnO:nC composites were obtained by thermolysis of zinc succinate Zn(OOC(CH2)2COO), formate glycolate Zn(HCOO)(OCH2CH2O)1/2, glycerolate Zn(OCH2CHOCH2OH), and tartrate Zn(OOCCH(OH)CH(OH)COO). The WC:nC composite was synthesized from ammonium paratungstate and glycerol. In all cases, carbon structures specific to diamond-like carbon forms appeared on the surface of the WC and ZnO particles after heat treatment. Tungsten carbide and zinc oxide were removed from the composites by selective chemical dissolution, preserving the amorphous carbon phase. This work presents the results of investigating the WC:nC and ZnO:nC composites and the carbon nanopowders with tubular, tape, plate, and onion aggregate morphologies, separated by chemical dissolution of WC and ZnO from the composites, using the following methods: SEM, TEM, XPA, Raman spectroscopy, and BET. The connection between the carbon morphology, the conditions of synthesis, and the chemical nature of the precursor, as well as the possibility of controlling the morphology of carbon-structured materials with specific surface areas up to 1700-2000 m2/g, are discussed. Keywords: carbon morphology, composite materials, precursor synthesis, tungsten carbide, zinc oxide
Procedia PDF Downloads 335
542 The Structural Alteration of DNA Native Structure of Staphylococcus aureus Bacteria by Designed Quinoxaline Small Molecules Result in Their Antibacterial Properties
Authors: Jeet Chakraborty, Sanjay Dutta
Abstract:
Antibiotic resistance in bacteria has proved to be a severe threat to mankind in recent times, and this underlines the urgency of designing and developing potent antibacterial small molecules with nonconventional mechanisms of action. DNA carries the genetic signature of any organism, and bacteria maintain their genomic DNA inside the cell in a well-regulated compact form with the help of various nucleoid-associated proteins such as HU and H-NS. These proteins control fundamental processes inside the cell, such as gene expression and replication. Alteration of the native DNA structure of bacteria can lead to severe consequences for cellular processes that ultimately result in the death of the organism. The change in global DNA structure caused by small molecules initiates a plethora of cellular responses that have not been well investigated. Echinomycin and Triostin A are biologically active quinoxaline small molecules that typically consist of a quinoxaline chromophore attached to an octadepsipeptide ring. They bind to double-stranded DNA in a sequence-specific way and have high activity against a wide variety of bacteria, mainly Gram-positive ones. To date, a few synthetic quinoxaline scaffolds have been synthesized that display antibacterial potential against a broad range of pathogenic bacteria. Quinoxaline N-oxides (QNOs) are known to target DNA and to instigate reactive oxygen species (ROS) production in bacteria, thereby exhibiting antibacterial properties. The divergent role of quinoxaline small molecules in medicinal research qualifies them as potential candidates for the evaluation of antimicrobial properties.
A previous study from our lab gave new insights into a 6-nitroquinoxaline derivative, 1d, as a DNA intercalator that induces conformational changes in DNA upon binding. The binding event observed was dependent on the presence of a crucial benzyl substituent on the quinoxaline moiety and was associated with a large induced CD (ICD) signal appearing in a sigmoidal pattern upon the interaction of 1d with dsDNA. The induction of DNA superstructures by 1d at high drug:DNA ratios was observed, ultimately leading to DNA condensation. Eviction of in vitro-assembled nucleosomes upon treatment with a high dose of 1d was also observed. In this work, monoquinoxaline derivatives of 1d were synthesized by various modifications of the 1d scaffold. The set of synthesized 6-nitroquinoxaline derivatives, along with 1d, were all subjected to antibacterial evaluation against five different bacterial species. Among the compound set, 3a displayed potent antibacterial activity against Staphylococcus aureus. 3a was further subjected to various biophysical studies to check whether its DNA structure-altering potential was still intact. The biological response of S. aureus cells upon treatment with 3a was studied using various cell biology assays, which led to the conclusion that 3a can initiate DNA damage in S. aureus cells. Finally, the potential of 3a to disrupt preformed S. aureus and S. epidermidis biofilms was also studied. Keywords: DNA structural change, antibacterial, intercalator, DNA superstructures, biofilms
Procedia PDF Downloads 169
541 An Investigation of Tetraspanin Proteins’ Role in UPEC Infection
Authors: Fawzyah Albaldi
Abstract:
Urinary tract infections (UTIs) are among the most prevalent infectious diseases, and > 80% are caused by uropathogenic E. coli (UPEC). Infection occurs following adhesion to urothelial plaques on bladder epithelial cells, whose major protein constituents are the uroplakins (UPs). Two of the four uroplakins (UPIa and UPIb) are members of the tetraspanin superfamily, and the UPEC adhesin FimH is known to interact directly with UPIa. Tetraspanins are a diverse family of transmembrane proteins that generally act as “molecular organizers” by binding different proteins and lipids to form tetraspanin-enriched microdomains (TEMs). Previous work by our group has shown that TEMs are involved in the adhesion of many pathogenic bacteria to human cells, and that adhesion can be blocked by tetraspanin-derived synthetic peptides, suggesting that tetraspanins may be valuable drug targets. In this study, we investigate the role of tetraspanins in UPEC adherence to bladder epithelial cells. Human bladder cancer cell lines (T24, 5637, RT4), commonly used as in vitro models to investigate UPEC infection, along with primary human bladder cells, were used in this project. The aim was to establish a model of UPEC adhesion/infection with the objective of evaluating the impact of tetraspanin-derived reagents on this process; such reagents could reduce the progression of UTI, particularly in patients with indwelling catheters. Tetraspanin expression on the bladder cells was investigated by qPCR and flow cytometry, with CD9 and CD81 generally highly expressed. Interestingly, despite these cell lines being used by other groups to investigate FimH antagonists, the uroplakin proteins (UPIa, UPIb, and UPIII) were poorly expressed at the cell surface, although some were present intracellularly. Attempts were made to differentiate the cell lines to induce cell surface expression of these UPs, but these were largely unsuccessful.
Pre-treatment of bladder epithelial cells with an anti-CD9 monoclonal antibody significantly decreased UPEC infection, whilst anti-CD81 had no effect. A short (15 aa) synthetic peptide corresponding to the large extracellular region (EC2) of CD9 also significantly reduced UPEC adherence, and we demonstrated specific binding of the fluorescently tagged peptide to the cells. CD9 is known to associate with a number of heparan sulphate proteoglycans (HSPGs) that have also been implicated in bacterial adhesion. Here, we demonstrated that unfractionated heparin (UFH) and heparin analogues significantly inhibited UPEC adhesion to RT4 cells, as did pre-treatment of the cells with heparinases. Pre-treatment with chondroitin sulphate (CS) and chondroitinase also significantly decreased UPEC adherence to RT4 cells. This study may shed light on a common pathogenicity mechanism involving the organisation of HSPGs by tetraspanins. In summary, although we determined that the bladder cell lines were not suitable for investigating the role of uroplakins in UPEC adhesion, we demonstrated roles for CD9 and cell surface proteoglycans in this interaction. Agents that target these may be useful in treating or preventing UTIs. Keywords: UTIs, tspan, uroplakins, CD9
Procedia PDF Downloads 103
540 Elastoplastic Modified Stillinger Weber-Potential Based Discretized Virtual Internal Bond and Its Application to the Dynamic Fracture Propagation
Authors: Dina Kon Mushid, Kabutakapua Kakanda, Dibu Dave Mbako
Abstract:
The failure of materials usually involves elastoplastic deformation and fracturing. Continuum mechanics can deal effectively with plastic deformation by using a yield function and a flow rule, but it has limitations in dealing with fracture, since it is a theory based on the continuous-field hypothesis. The lattice model can simulate fracture very well but is inadequate for plastic deformation. Based on the discretized virtual internal bond model (DVIB), this paper proposes a lattice model that can account for plasticity. DVIB is a lattice method that considers material to comprise bond cells. Each bond cell may have any geometry with a finite number of bonds, and a two-body or multi-body potential can characterize the strain energy of a bond cell. The two-body potential leads to a fixed Poisson ratio, while the multi-body potential can overcome this limitation. In the present paper, the modified Stillinger-Weber (SW) potential, a multi-body potential, is employed to characterize the bond cell energy. The SW potential is composed of two parts: a two-body part that describes the interatomic interactions between particles, and a three-body part that represents the bond angle interactions between particles. Because the SW interaction can represent both the bond stretch and the bond angle contributions, the SW-potential-based DVIB (SW-DVIB) can represent various Poisson ratios. To embed plasticity in the SW-DVIB, plasticity is considered in the two-body part of the SW potential. This is done by reducing the bond stiffness to a lower level once the bond reaches the yielding point. Before the bond reaches the yielding point, it is elastic; when the bond deformation exceeds the yielding point, the bond stiffness is softened to a lower value, and when unloaded, irreversible deformation remains.
When the bond length increases to a critical value, termed the failure bond length, the bond fails. The critical failure bond length is related to the cell size and the macro fracture energy. By this means, the fracture energy is conserved, so that the cell-size sensitivity problem is relieved to a great extent; in addition, plasticity and fracture are unified at the bond level. To enable the DVIB to simulate different Poisson ratios, the three-body part of the SW potential is kept elasto-brittle: the bond angle can bear a moment as long as the bond angle increment remains smaller than a critical value. By this method, the SW-DVIB can simulate the plastic deformation and the fracturing process of materials with various Poisson ratios. The elastoplastic SW-DVIB is used to simulate the plastic deformation of a material, the plastic fracturing process, and tunnel plastic deformation. It has been shown that the current SW-DVIB method is straightforward in simulating both elastoplastic deformation and plastic fracture. Keywords: lattice model, discretized virtual internal bond, elastoplastic deformation, fracture, modified Stillinger-Weber potential
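The bilinear bond law described above can be sketched in one dimension. This is our own minimal illustration, not the authors' code: a bond responds elastically up to a yield stretch, continues with a softened plastic stiffness beyond it, and fails (carries no force) past a critical failure stretch. All parameter values are hypothetical.

```python
def bond_force(stretch, k_elastic=100.0, k_plastic=20.0,
               yield_stretch=0.01, failure_stretch=0.05):
    """Force in a single two-body bond as a function of bond stretch."""
    if stretch >= failure_stretch:
        return 0.0                        # bond has failed
    if stretch <= yield_stretch:
        return k_elastic * stretch        # elastic branch
    # Plastic branch: continues from the yield force with reduced stiffness,
    # mirroring the softened bond stiffness beyond the yielding point.
    return k_elastic * yield_stretch + k_plastic * (stretch - yield_stretch)

print(bond_force(0.005))   # elastic branch
print(bond_force(0.02))    # plastic branch (softened slope)
print(bond_force(0.06))    # failed bond
```

In the full SW-DVIB, an unloading rule from the plastic branch leaves the irreversible deformation mentioned above, and the failure stretch is calibrated from the cell size and macro fracture energy so that fracture energy is conserved.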
Procedia PDF Downloads 98
539 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions
Authors: Gaurangi Saxena, Ravindra Saxena
Abstract:
Lately, cloud computing has been used to enhance the ability to attain corporate goals more effectively and efficiently at lower cost. This new computing paradigm has emerged as a powerful tool for optimum utilization of resources and for gaining competitiveness through cost reduction, achieving business goals with greater flexibility. Realizing the importance of this technique, most of the well-known companies in the computer industry, such as Microsoft, IBM, Google, and Apple, are spending millions of dollars researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that, with the right middleware, a cloud computing system can execute all the programs a normal computer could run; potentially, everything from the simplest generic word-processing software to highly specialized programs customized for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments but also support interactive user-facing applications such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm; it draws on existing technologies and approaches, such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they do not have it available on site; cloud computing gives these companies the option of storing data on someone else’s hardware, removing the need for physical space on the front end. Prominent service providers such as Amazon, Google, Sun, IBM, Oracle, and Salesforce are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases, and applications. Application services may include email, office applications, finance, video, audio, and data processing.
By using a cloud computing system, a company can improve its customer relationship management (CRM). A CRM cloud computing system can deliver a sales team a blend of unique functionalities that improve agent/customer interactions. This paper first defines cloud computing as a tool for running business activities more effectively and efficiently at lower cost, and then distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss applications of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to its use in CRM. The study concludes that a CRM cloud computing platform helps a company track data such as orders, discounts, references, competitors, and more. By using cloud-based CRM, companies can improve their customer interactions and, by serving customers more efficiently at lower cost, gain competitive advantage.
Keywords: cloud computing, competitive advantage, customer relationship management, grid computing
Procedia PDF Downloads 312
538 Preliminary Study of Water-Oil Separation Process in Three-Phase Separators Using Factorial Experimental Designs and Simulation
Authors: Caroline M. B. De Araujo, Helenise A. Do Nascimento, Claudia J. Da S. Cavalcanti, Mauricio A. Da Motta Sobrinho, Maria F. Pimentel
Abstract:
Oil production is often accompanied by the joint production of water and gas. During the journey up to the surface, under severe conditions of temperature and pressure, mixing of these three components normally occurs. The three-phase separation process must therefore be one of the first steps performed after crude oil extraction, and the water-oil separation is its most complex and important step, since the presence of water in the process line can increase corrosion and hydrate formation. A wide range of methods can be applied for oil-water separation, the most common being flotation, hydrocyclones, and three-phase separator vessels. The aim of this paper is to study a system consisting of a three-phase separator, evaluating the influence of three variables (temperature, working pressure, and separator type) for two types of oil (light and heavy), by performing two 2³ factorial design plans in order to find the best operating condition. The goal is to obtain the greatest oil flow rate in the product stream (m³/h) as well as the lowest percentage of water in the oil stream. The simulation of the three-phase separator was performed in stationary mode using the Aspen Hysys® 2006 simulation software, and the factorial experimental designs were evaluated using the Statistica® software. From the analysis of the four normal probability plots of effects obtained, it was observed that two- and three-factor interaction effects showed no statistical significance at 95% confidence, since all values were very close to zero. Similarly, the main effect "separator type" showed no significant statistical influence in any situation. Since the volumetric flows of water, oil, and gas were assumed equal in the inlet stream, the separator-type effect may indeed not be significant for the proposed system.
Nevertheless, the main effect "temperature" was significant for both responses (oil flow rate and mass fraction of water in the oil stream) for both light and heavy oil. The best operating condition occurs with the temperature at its lowest level (30 °C), since at higher temperatures the lighter liquid oil components pass into the vapor phase and leave with the gas stream. Furthermore, the higher the temperature, the greater the formation of water vapor, which ends up in the lighter (oil) stream, making the separation process more difficult. The effect of "working pressure" was significant only for the oil flow rate; the best operating condition occurs with the pressure at its highest level (9 bar), since a higher operating pressure in this case indicated a lower pressure drop inside the vessel, generating a lower level of turbulence inside the separator. In conclusion, the best operating condition for the proposed system, within the studied range, occurs when the temperature is at its lowest level and the working pressure is at its highest level.
Keywords: factorial experimental design, oil production, simulation, three-phase separator
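The effect estimates discussed in this abstract follow the standard 2³ factorial calculation: each effect is the mean response at the factor's high (+1) level minus the mean at its low (-1) level, and interaction columns are elementwise products of the coded main-effect columns. A minimal sketch, using hypothetical oil-flow-rate responses and illustrative factor labels (not the study's data):

```python
from itertools import product

# Coded levels for a 2^3 full factorial design.
# A = temperature, B = working pressure, C = separator type (illustrative labels)
runs = list(product([-1, 1], repeat=3))

# Hypothetical oil flow-rate responses (m3/h), one per run -- NOT the study's data
y = [52.0, 50.1, 58.3, 56.9, 51.8, 49.7, 58.0, 56.5]

def effect(col, y):
    """Average response at the +1 level minus average at the -1 level."""
    hi = [yi for c, yi in zip(col, y) if c == 1]
    lo = [yi for c, yi in zip(col, y) if c == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Coded columns for the three main effects
A = [r[0] for r in runs]
B = [r[1] for r in runs]
C = [r[2] for r in runs]

# Two-factor interaction AB: elementwise product of the coded columns
AB = [a * b for a, b in zip(A, B)]

print(round(effect(A, y), 3))   # -0.325
print(round(effect(B, y), 3))   # 6.525
print(round(effect(C, y), 3))   # -1.725
print(round(effect(AB, y), 3))  # -0.025
```

With these illustrative numbers, factor B dominates and the AB interaction is near zero, mirroring the paper's finding that interaction effects were negligible relative to the main effects.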
Procedia PDF Downloads 288
537 The Seller’s Sense: Buying-Selling Perspective Affects the Sensitivity to Expected-Value Differences
Authors: Taher Abofol, Eldad Yechiam, Thorsten Pachur
Abstract:
In four studies, we examined whether sellers and buyers differ not only in subjective price levels for objects (i.e., the endowment effect) but also in their relative accuracy for objects varying in expected value. If, as has been proposed, sellers stand to accrue a more substantial loss than buyers do, then their pricing decisions should be more sensitive to expected-value differences between objects. This is implied by loss aversion, due to the steeper slope of prospect theory's value function for losses than for gains, as well as by the loss attention account, which posits that losses increase the attention invested in a task. Both accounts suggest that losses increase sensitivity to the relative values of different objects, which should result in better alignment of sellers' pricing decisions with the objective value of the objects. Under loss attention, this characteristic should emerge only under certain boundary conditions. In Study 1, a published dataset was reanalyzed in which 152 participants indicated buying or selling prices for monetary lotteries with different expected values. Relative EV sensitivity was calculated for each participant as the Spearman rank correlation between their pricing decisions for the lotteries and the lotteries' expected values. An ANOVA revealed a main effect of perspective (sellers versus buyers), F(1,150) = 85.3, p < .0001, with greater EV sensitivity for sellers. Study 2 examined the prediction (implied by loss attention) that the positive effect of losses on performance emerges particularly under time constraints. A published dataset was reanalyzed in which 84 participants provided selling and buying prices for monetary lotteries under three deliberation-time conditions (5, 10, and 15 seconds). As in Study 1, an ANOVA revealed greater EV sensitivity for sellers than for buyers, F(1,82) = 9.34, p = .003. Importantly, there was also an interaction of perspective by deliberation time.
Post-hoc tests revealed main effects of perspective in the 5 s and 10 s deliberation-time conditions, but not in the 15 s condition. Thus, sellers' EV-sensitivity advantage disappeared with extended deliberation. Study 3 replicated the design of Study 1 but administered the task three times, to test whether the effect decays with repeated presentation. The difference between buyers' and sellers' EV sensitivity was replicated across repeated task presentations. Study 4 examined the loss attention prediction that EV-sensitivity differences can be eliminated by manipulations that reduce the differential attention investment of sellers and buyers. This was carried out by randomly mixing selling and buying trials for each participant. The results revealed no differences in EV sensitivity between selling and buying trials. The pattern of results is consistent with an attentional resource-based account of the differences between sellers and buyers. Thus, asking people to price an object from a seller's perspective rather than a buyer's improves the relative accuracy of pricing decisions; subtle changes in the framing of one's perspective in a trading negotiation may improve price accuracy.
Keywords: decision making, endowment effect, pricing, loss aversion, loss attention
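The EV-sensitivity measure used in Studies 1 and 2 is a Spearman rank correlation between one participant's prices and the lotteries' expected values. A minimal stdlib sketch with hypothetical lotteries and prices (in practice one would use a library routine such as scipy.stats.spearmanr):

```python
def ranks(values):
    """1-based ranks, with ties receiving the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical lotteries and one participant's selling prices (not the study's data)
expected_values = [2.5, 4.0, 5.5, 7.0, 9.0]
selling_prices = [3.0, 3.5, 6.0, 6.5, 8.0]
print(spearman(expected_values, selling_prices))  # 1.0 for a perfectly monotone pricing pattern
```

A participant whose prices track the lotteries' EV ordering perfectly scores 1.0; noisier pricing lowers the correlation, which is the sense in which sellers were more "EV sensitive" than buyers.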
Procedia PDF Downloads 345
536 Antimicrobial and Anti-Biofilm Activity of Non-Thermal Plasma
Authors: Jan Masak, Eva Kvasnickova, Vladimir Scholtz, Olga Matatkova, Marketa Valkova, Alena Cejkova
Abstract:
Microbial colonization of medical instruments, catheters, implants, etc., is a serious problem in the spread of nosocomial infections. Biofilms exhibit enormous resistance to environmental stresses: the resistance of biofilm populations to antibiotics or biocides is often two to three orders of magnitude higher than that of suspension populations. Of particular interest are substances or physical processes that primarily cause the destruction of the biofilm, so that the released cells can be killed by existing antibiotics. In addition, agents that do not have a strong lethal effect do not exert the significant selection pressure that would further enhance resistance. Non-thermal plasma (NTP) is defined as a neutral, ionized gas composed of particles (photons, electrons, positive and negative ions, free radicals, and excited or non-excited molecules) in permanent interaction. In this work, the effect of NTP generated by a cometary corona with a metallic grid on the formation and stability of biofilm, and on the metabolic activity of cells within it, was studied. NTP was applied to biofilm populations of Staphylococcus epidermidis DBM 3179; Pseudomonas aeruginosa DBM 3081, DBM 3777, ATCC 15442, and ATCC 10145; Escherichia coli DBM 3125; and Candida albicans DBM 2164, grown on solid media in Petri dishes and on the surface of the titanium alloy (Ti6Al4V) used for the production of joint replacements. Erythromycin (for S. epidermidis), polymyxin B (for E. coli and P. aeruginosa), amphotericin B (for C. albicans), and ceftazidime (for P. aeruginosa) were used to study the combined effect of NTP and antibiotics. Biofilms were quantified by the crystal violet assay. The metabolic activity of cells in the biofilm was measured using the MTT (3-[4,5-dimethylthiazol-2-yl]-2,5-diphenyl tetrazolium bromide) colorimetric test, based on the reduction of MTT into formazan by the dehydrogenase system of living cells.
Fluorescence microscopy was applied to visualize the biofilm on the surface of the titanium alloy; SYTO 13 was used as a fluorescent probe to stain cells in the biofilm. Biofilm populations of all studied microorganisms were shown to be very sensitive to the type of NTP used. The inhibition zone of biofilm recorded after 60 minutes of exposure to NTP exceeded 20 cm², except for P. aeruginosa DBM 3777 and ATCC 10145, where it was about 9 cm². The metabolic activity of cells in biofilm also differed between individual microbial strains. High sensitivity to NTP was observed in S. epidermidis, in which the metabolic activity of the biofilm decreased to 15% after 30 minutes of NTP exposure and to 1% after 60 minutes. By comparison, the metabolic activity of C. albicans cells decreased only to 53% after 30 minutes of NTP exposure; nevertheless, this result can still be considered very good. Suitable combinations of NTP exposure time and antibiotic concentration achieved, in most cases, a remarkable synergistic reduction of the metabolic activity of the biofilm cells. For example, in the case of P. aeruginosa DBM 3777, combining 30 minutes of NTP with 1 mg/l of ceftazidime decreased metabolic activity to below 4%.
Keywords: anti-biofilm activity, antibiotic, non-thermal plasma, opportunistic pathogens
Procedia PDF Downloads 184
535 Determination of Physical Properties of Crude Oil Distillates by Near-Infrared Spectroscopy and Multivariate Calibration
Authors: Ayten Ekin Meşe, Selahattin Şentürk, Melike Duvanoğlu
Abstract:
Petroleum refineries are highly complex process industries with continuous production and high operating costs. Physical separation of crude oil starts with the crude oil distillation unit, continues through various conversion and purification units, and passes through many stages before the final product is obtained. To meet the desired product specifications, process parameters are strictly followed. To ensure the quality of distillates, routine analyses are performed in quality control laboratories based on appropriate international standards, such as American Society for Testing and Materials (ASTM) standard methods and European Standard (EN) methods. The cut point of distillates in the crude distillation unit is crucial for the efficiency of downstream processes. To maximize process efficiency, the determination of distillate quality should be as fast, reliable, and cost-effective as possible. In this sense, an alternative study was carried out on the crude oil distillation unit that serves the entire refinery process. Studies were conducted with three different crude oil distillates: Light Straight Run Naphtha (LSRN), Heavy Straight Run Naphtha (HSRN), and kerosene. These fractions are distinguished by the number of carbon atoms their hydrocarbons contain: LSRN consists of hydrocarbons with five to six carbons, HSRN with six to ten, and kerosene with sixteen to twenty-two. Physical properties of these three crude distillation unit products were determined using near-infrared spectroscopy with multivariate calibration. The absorbance spectra of the petroleum samples were obtained in the range from 10000 cm⁻¹ to 4000 cm⁻¹, employing a quartz transmittance flow-through cell with a 2 mm light path and a resolution of 2 cm⁻¹. A total of 400 samples were collected for each product over almost four years.
Several different crude oil grades were processed during the sample collection period. Extended Multiplicative Signal Correction (EMSC) and Savitzky-Golay (SG) preprocessing techniques were applied to the FT-NIR spectra to eliminate baseline shifts and suppress unwanted variation. Two multivariate calibration approaches (Partial Least Squares regression, PLS, and Genetic Inverse Least Squares, GILS) and an ensemble model were applied to the preprocessed FT-NIR spectra. The predictive performance of each multivariate calibration and preprocessing technique was compared, and the best models were chosen according to the reproducibility of the ASTM reference methods. This work demonstrates that the developed models can be used for routine analysis instead of conventional analytical methods, with over 90% accuracy.
Keywords: crude distillation unit, multivariate calibration, near infrared spectroscopy, data preprocessing, refinery
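Of the preprocessing steps named above, Savitzky-Golay filtering replaces each spectral point with the value of a local least-squares polynomial fit. A minimal stdlib sketch using the classic 5-point quadratic convolution coefficients (-3, 12, 17, 12, -3)/35; a real pipeline would use scipy.signal.savgol_filter and a library PLS implementation:

```python
# Classic Savitzky-Golay coefficients: 5-point window, quadratic fit, central point
SG5_QUADRATIC = [-3, 12, 17, 12, -3]
NORM = 35.0

def savgol_smooth(spectrum):
    """Smooth an absorbance trace; the 2 points at each edge are left untouched."""
    out = list(spectrum)
    for i in range(2, len(spectrum) - 2):
        window = spectrum[i - 2 : i + 3]
        out[i] = sum(c * v for c, v in zip(SG5_QUADRATIC, window)) / NORM
    return out

# Sanity check: a quadratic signal is reproduced exactly by a quadratic SG filter,
# which is the sense in which SG smooths noise without flattening band shapes.
signal = [x * x for x in range(8)]
smoothed = savgol_smooth(signal)
print(smoothed)
```

The point of SG over a plain moving average is exactly this property: polynomial features of the signal up to the fit order pass through unchanged, so absorbance peak heights and positions are largely preserved while high-frequency noise is attenuated.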
Procedia PDF Downloads 129
534 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety
Authors: Hengameh Hosseini
Abstract:
The design of large healthcare facilities, such as hospitals, multi-service-line clinics, and nursing facilities, that can accommodate patients with wide-ranging disabilities is a challenging endeavor and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less understood extension of this problem is the implications of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs, who may also have a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers with low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or simply less comfortable for people with various disabilities. We conducted a series of surveys and interviews with employees and administrators of 7 facilities of a range of sizes and ownership models in the Northeastern United States, and combined that corpus with in-facility observations and data collection to identify five major points of failure common to all the facilities that, we concluded, could pose safety threats, ranging from very minor to severe, to employees with vision impairments. We determined that lack of design empathy is a major commonality among facility management and ownership.
We subsequently propose three methods for remedying this lack of empathy-informed design and the dangers it poses to employees: the use of an existing open-sourced augmented reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model, fine-tuned on images of each facility, to infer and predict facility features, locations, and workflows that could likewise pose meaningful dangers to visually impaired employees. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each is a viable solution under particular sets of conditions, and we characterize the range of facility types, workforce composition profiles, and work conditions under which each method would be most apt and successful.
Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety
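The second proposed method, inferring facility shortcomings from free-text reviews, can be illustrated in miniature. The sketch below is a hypothetical keyword-lexicon stand-in for the paper's trained model, with invented hazard categories and example reviews; the actual study's model and data are not described in enough detail here to reproduce:

```python
import re
from collections import Counter

# Hypothetical hazard lexicon -- illustrative only, not the paper's model
HAZARD_TERMS = {
    "lighting": ["dim", "dark", "glare"],
    "signage": ["sign", "label", "unreadable"],
    "obstruction": ["cluttered", "blocked", "tripped"],
}

def flag_shortcomings(reviews):
    """Count, per hazard category, how many reviews mention one of its terms."""
    counts = Counter()
    for text in reviews:
        words = set(re.findall(r"[a-z]+", text.lower()))
        for category, terms in HAZARD_TERMS.items():
            if words & set(terms):
                counts[category] += 1
    return counts

reviews = [
    "The supply corridor is very dim and I nearly tripped over a cart.",
    "Storage room signage is unreadable from a distance.",
]
print(flag_shortcomings(reviews))
```

A learned model would replace the hand-built lexicon with classifiers trained on labeled feedback, but the output shape is the same: a ranking of facility shortcomings by how often they surface in worker and patient text.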
Procedia PDF Downloads 116
533 Reuse of Historic Buildings for Tourism: Policy Gaps
Authors: Joseph Falzon, Margaret Nelson
Abstract:
Background: The regeneration and re-use of abandoned historic buildings present a continuous challenge for policy makers and stakeholders in the tourism and leisure industry. Obsolete historic buildings offer great potential for tourism and leisure accommodation, presenting unique heritage experiences to travellers and host communities. Contemporary demands in the hospitality industry continuously require higher standards, some of which conflict with heritage conservation principles. Objective: The aim of this research paper is to critically discuss regeneration policies with stakeholders of the tourism and leisure industry and to examine current practices in policy development and the resultant impact of policies on the Maltese tourism and leisure industry. Research Design: Six stakeholders involved in the tourism and leisure industry participated in semi-structured interviews. A number of measures were taken to reduce bias and thus improve trustworthiness. Clear statements of the purpose of the research study were provided at the start of each interview to reduce expectancy bias. The interviews were semi-structured to minimise interviewer bias. Interviewees were allowed to expand and elaborate as necessary, with only necessary probing questions, to allow free expression of opinions and practices. The interview guide was submitted to participants at least two weeks before the interview, to allow them to prepare and to prevent recall bias during the interview as much as possible. Interview questions and probes contained both positive and negative aspects to prevent interviewer bias. Policy documents were available during the interview to prevent recall bias. Interview recordings were transcribed 'intelligent' verbatim. Analysis was carried out using thematic analysis, with the coding frame developed independently by two researchers. All phases of the study were governed by research ethics.
Findings: Findings were grouped into main themes: financing of regeneration, governance, legislation, and policies. Other key issues included the value of historic buildings and approaches to regeneration. While regeneration of historic buildings was noted, participants discussed a number of barriers that hindered it. Stakeholders identified gaps in policies and gaps at the policy implementation stage. European Union funding policies facilitated regeneration initiatives, but funding criteria based on economic deliverables left a gap regarding intangible heritage. Stakeholders identified niche markets for heritage tourism accommodation. A lack of research-based policies was also identified. Conclusion: The potential of regeneration is hindered by an inadequate legal framework that does not support the contemporary needs of the tourism industry. Policies should be developed through active stakeholder participation, and adequate funding schemes have to support both the tangible and intangible components of the built heritage.
Keywords: governance, historic buildings, policy, tourism
Procedia PDF Downloads 234
532 A Sustainable Training and Feedback Model for Developing the Teaching Capabilities of Sessional Academic Staff
Authors: Nirmani Wijenayake, Louise Lutze-Mann, Lucy Jo, John Wilson, Vivian Yeung, Dean Lovett, Kim Snepvangers
Abstract:
Sessional academic staff at universities have the most influence and impact on student learning, engagement, and experience as they have the most direct contact with undergraduate students. A blended technology-enhanced program was created for the development and support of sessional staff to ensure adequate training is provided to deliver quality educational outcomes for the students. This program combines innovative mixed media educational modules, a peer-driven support forum, and face-to-face workshops to provide a comprehensive training and support package for staff. Additionally, the program encourages the development of learning communities and peer mentoring among the sessional staff to enhance their support system. In 2018, the program was piloted on 100 sessional staff in the School of Biotechnology and Biomolecular Sciences to evaluate the effectiveness of this model. As part of the program, rotoscope animations were developed to showcase ‘typical’ interactions between staff and students. These were designed around communication, confidence building, consistency in grading, feedback, diversity awareness, and mental health and wellbeing. When surveyed, 86% of sessional staff found these animations to be helpful in their teaching. An online platform (Moodle) was set up to disseminate educational resources and teaching tips, to host a discussion forum for peer-to-peer communication and to increase critical thinking and problem-solving skills through scenario-based lessons. The learning analytics from these lessons were essential in identifying difficulties faced by sessional staff to further develop supporting workshops to improve outcomes related to teaching. The face-to-face professional development workshops were run by expert guest speakers on topics such as cultural diversity, stress and anxiety, LGBTIQ and student engagement. 
All workshop attendees found them useful, and 88% said the workshops increased interaction with their peers and built a sense of community. The final component of the program was the use of an adaptive e-learning platform to gather student feedback on sessional staff teaching twice during the semester. The initial round of feedback gives sessional staff enough time to reflect on their teaching and, if necessary, adjust their practice to improve the student experience. Feedback from students and sessional staff on this model has been extremely positive. The training equips sessional staff with knowledge and insights that can provide students with an exceptional learning environment. The program is designed in a flexible and scalable manner so that other faculties or institutions could adapt its components for their own training. It is anticipated that the training and support will help build the next generation of educators, who will directly impact the educational experience of students.
Keywords: designing effective instruction, enhancing student learning, implementing effective strategies, professional development
Procedia PDF Downloads 128
531 Thermosensitive Hydrogel Development for Its Possible Application in Cardiac Cell Therapy
Authors: Lina Paola Orozco Marin, Yuliet Montoya Osorio, John Bustamante Osorno
Abstract:
Ischemic events can culminate in acute myocardial infarction, with irreversible cardiac lesions that cannot be restored due to the limited regenerative capacity of the heart. Cell therapy seeks to replace these injured or necrotic cells by transplanting healthy, functional cells. The therapeutic alternatives proposed by tissue engineering and cardiovascular regenerative medicine involve the use of biomaterials that mimic the native extracellular medium, which is rich in proteins, proteoglycans, and glycoproteins. The selected biomaterials must provide structural support to the encapsulated cells to prevent their migration and death in the host tissue. In this context, the present research work focused on developing a natural thermosensitive hydrogel, its physical and chemical characterization, and the determination of its biocompatibility in vitro. The hydrogel was prepared by mixing hydrolyzed bovine or porcine collagen at 2% w/v, chitosan at 2.5% w/v, and beta-glycerolphosphate at 8.5% w/w or 10.5% w/w under magnetic stirring at 4 °C. Once obtained, the thermosensitivity and gelation time were determined by incubating the samples at 37 °C and evaluating them through the inverted tube method. The morphological characterization of the hydrogels was carried out by scanning electron microscopy, and the chemical characterization by infrared spectroscopy. Biocompatibility was determined using the MTT cytotoxicity test, according to the ISO 10993-5 standard, for the hydrogel's precursors using the fetal human ventricular cardiomyocyte cell line RL-14. RL-14 cells were also seeded on top of the hydrogels, and the supernatants were subcultured at different time points for observation under a bright-field microscope.
Four types of thermosensitive hydrogels were obtained, differing in composition and concentration: A1 (chitosan/bovine collagen/beta-glycerolphosphate 8.5% w/w), A2 (chitosan/porcine collagen/beta-glycerolphosphate 8.5% w/w), B1 (chitosan/bovine collagen/beta-glycerolphosphate 10.5% w/w), and B2 (chitosan/porcine collagen/beta-glycerolphosphate 10.5% w/w). A1 and A2 had a gelation time of 40 minutes, and B1 and B2 a gelation time of 30 minutes, at 37 °C. Electron micrographs revealed, for all four types of hydrogel, a three-dimensional internal structure with interconnected pores, which facilitates the exchange of nutrients and oxygen and the removal of metabolites, preserving a microenvironment suitable for cell proliferation. In the infrared spectra, it was possible to observe the interaction between the amides of the polymeric compounds and the phosphate groups of beta-glycerolphosphate. Finally, the biocompatibility tests indicated that cells in contact with the hydrogel, or with each of its precursors, were not affected in their proliferation capacity over a period of 16 days. These results show the potential of the hydrogel to increase the cell survival rate in the cardiac cell therapies under investigation, and they lay the foundations for its characterization and biological evaluation in both in vitro and in vivo models.
Keywords: cardiac cell therapy, cardiac ischemia, natural polymers, thermosensitive hydrogel
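The precursor concentrations in this abstract are given in w/v and w/w percent, which translate directly into batch masses. A small sketch of the standard conversions, with purely illustrative batch sizes (the study does not report its batch volumes):

```python
def grams_for_wv(percent_wv, volume_ml):
    """w/v %: grams of solute per 100 mL of solution."""
    return percent_wv * volume_ml / 100.0

def grams_for_ww(percent_ww, total_mass_g):
    """w/w %: grams of component per 100 g of total mixture."""
    return percent_ww * total_mass_g / 100.0

# e.g., chitosan at 2.5% w/v in a hypothetical 40 mL batch,
# and beta-glycerolphosphate at 8.5% w/w in a hypothetical 20 g mixture
print(grams_for_wv(2.5, 40.0))  # 1.0
print(grams_for_ww(8.5, 20.0))  # 1.7
```

Keeping the two conventions straight matters here because the collagen and chitosan are dosed per volume of solution while the beta-glycerolphosphate is dosed per mass of mixture.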
Procedia PDF Downloads 191
530 Comparative Analysis of Mechanical Properties of Paddy Rice for Different Variety-Moisture Content Interactions
Authors: Johnson Opoku-Asante, Emmanuel Bobobee, Joseph Akowuah, Eric Amoah Asante
Abstract:
In recent years, the issue of postharvest losses has become a serious concern in Sub-Saharan Africa. Postharvest technology development and adaptation need urgent attention, particularly for small- and medium-scale rice farmers in Africa. To better develop any postharvest technology, knowledge of the mechanical properties of different varieties of paddy rice is vital, and the ongoing development of new rice cultivars must also be considered. The objectives of this research are to (1) determine the mechanical properties of the selected paddy rice varieties at varying moisture contents; (2) conduct a comparative analysis of the mechanical properties of the selected paddy rice for the different variety-moisture content interactions; and (3) determine the significant statistical differences between the mean values of the various variety-moisture content interactions. The mechanical properties of AGRA rice, CRI-Amankwatia, CRI-Enapa, and CRI-Dartey, four local varieties developed by the Crop Research Institute (CRI) of Ghana, are compared at 11.5%, 13.0%, and 16.5% dry-basis moisture content. The properties measured are sphericity, aspect ratio, grain mass, 1000-grain mass, bulk density, true density, porosity, and angle of repose. Samples were collected from the Kwadaso Agric College of the CRI in Kumasi, threshed manually, and winnowed before conducting the experiment. The moisture content was determined on a dry basis using a Moistex screw-type digital grain moisture meter. Other equipment used for data collection included vernier calipers and a Citizen electronic scale. A 4×3 factorial arrangement was used in a completely randomized design with three replications. Tukey's HSD comparison test was conducted during data analysis to compare all possible pairwise combinations of the variety-moisture content interactions.
From the results, sphericity (dimensionless) ranged from 0.391 for CRI-Dartey at 16.5% to 0.377 for CRI-Enapa at 13.0%, whereas the aspect ratio (dimensionless) ranged from 0.298 for CRI-Dartey at 16.5% to 0.269 for CRI-Enapa at 13.0%. For grain mass, AGRA rice at 13.0% recorded the highest value of 0.0312 g, and CRI-Enapa at 13.0% the lowest value of 0.0237 g. The 1000-grain mass ranged from 29.33 g for CRI-Amankwatia at 16.5% moisture content to 22.54 g for CRI-Enapa at 16.5%. Bulk density ranged from 654.0 kg/m³ for CRI-Amankwatia at 16.5% (highest) to 422.9 kg/m³ for CRI-Enapa at 11.5% (lowest). True density ranged from 1685.8 kg/m³ for AGRA rice at 13.0% moisture content to 1352.5 kg/m³ for CRI-Enapa at 16.5%. For porosity, CRI-Enapa at 11.5% recorded the highest value of 70.83%, and CRI-Amankwatia at 16.5% the lowest value of 55.88%. Finally, for the angle of repose, CRI-Amankwatia at 16.5% recorded the highest value of 47.3° and CRI-Enapa at 11.5% the lowest value of 34.27°. In all cases, the difference in mean values was less than the HSD, indicating no significant statistical differences between the means; technologies developed and adapted for one variety can therefore equally be used for the other varieties.
Keywords: angle of repose, aspect ratio, bulk density, porosity, sphericity, mechanical properties
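The porosity, sphericity, and aspect-ratio figures reported above follow standard grain-engineering definitions: porosity is (1 - bulk density / true density) x 100, sphericity is the geometric mean of the three axial dimensions divided by the major axis, and aspect ratio is width over length. A short sketch with hypothetical grain measurements (not the study's raw data):

```python
def porosity_percent(bulk_density, true_density):
    """Porosity (%) from bulk and true density: (1 - rho_bulk/rho_true) * 100."""
    return (1.0 - bulk_density / true_density) * 100.0

def sphericity(length, width, thickness):
    """Dimensionless sphericity: (L*W*T)^(1/3) / L."""
    return (length * width * thickness) ** (1.0 / 3.0) / length

def aspect_ratio(length, width):
    """Dimensionless aspect ratio: W / L."""
    return width / length

# Hypothetical paddy grain dimensions (mm) and densities (kg/m3)
print(round(sphericity(9.0, 2.5, 1.9), 3))          # about 0.389, in the reported range
print(round(aspect_ratio(9.0, 2.5), 3))             # 0.278
print(round(porosity_percent(500.0, 1500.0), 2))    # 66.67
```

Note that both sphericity and aspect ratio are ratios of lengths and are therefore dimensionless, which is why the reported values carry no units.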
Procedia PDF Downloads 99