Search results for: design knowledge
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18234

1944 A Meta-Analysis towards an Integrated Framework for Sustainable Urban Transportation within the Concept of Sustainable Cities

Authors: Hande Aladağ, Gökçe Aydın

Abstract:

The world’s population is increasing continuously and rapidly, while natural energy resources are declining and global warming and environmental pollution are worsening. These facts have made sustainability a primary concern in future planning. From this perspective, constituting sustainable cities and communities can be considered one of the key issues among the sustainable development goals. The concept of sustainable cities can be evaluated under three headings: green/sustainable buildings, self-contained cities, and sustainable transportation. This study concentrates on how to form and support a sustainable urban transportation system that contributes to sustainable urbanization. An urban transportation system inevitably requires many engineering projects of various sizes. Engineering projects generally have four phases, in the following order: planning, design, construction, and operation. Although the order holds, each phase feeds back into the phases upstream of it, so engineering projects are iterative processes. Sustainability is an integrated and comprehensive concept, so it should be among the primary concerns in every phase of transportation projects. In this study, a meta-analysis will be performed on the related studies in the literature. Based on its findings, a framework listing principles and actions for sustainable transport will be formed. The meta-analysis will clarify sustainability approaches in every phase of the related engineering projects, paying attention to the iterative nature of the process and the relative contribution of each action to the outcomes of the sustainable transportation system. The analysis will not be limited to engineering projects; non-engineering solutions will also be included. The most important contribution of this study is the determination of the outcomes of a sustainable urban transportation system in terms of energy efficiency, resource preservation, and related social, environmental, and economic factors. The study is also important because it will shed light on the engineering and management approaches needed to achieve these outcomes.

Keywords: meta-analysis, sustainability, sustainable cities, sustainable urban transportation, urban transportation

Procedia PDF Downloads 315
1943 “Laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology

Authors: Amarendar Reddy Addula

Abstract:

Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is a novel medium for digital business, according to a new report by Gartner. The last 10 years represent a period of significant advances in AI’s development, spurred by the confluence of several factors, including the rise of big data, advancements in compute infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. Extending AI to a broader set of use cases and users is gaining popularity because it improves AI’s versatility, effectiveness, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is a marquee term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and things, but it can also be used for code generation, creating synthetic data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. Frequently, the two are mixed and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behavior, but they are different in scope and nature. The juridical analysis is grounded on a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step toward the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or diverse nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.

Keywords: artificial intelligence, ethics & human rights issues, laws, international laws

Procedia PDF Downloads 80
1942 The Social Aspects of Mental Illness among Orthodox Christians of the Tigrinya Ethnic Group in Eritrea

Authors: Erimias Firre

Abstract:

This study is situated within the religio-cultural milieu of Coptic Orthodox Christians of the Tigrinya ethnic group in Eritrea. With this ethnic group being conservative and traditionally bound, extended family structures dissected along various clans and expansive community networks are the distinguishing mark of its members. Notably, Coptic Tigrinya constitute the largest percentage of all Christian denominations in Eritrea. As religious and cultural beliefs, rituals, and teachings permeate all aspects of social life, a distinct worldview and traditionalized conceptualization of health and illness are common. Accordingly, this study argues that religio-culturally bound illness ideologies immensely determine the perception, help-seeking behavior, and healing preferences of Coptic Tigrinya in Eritrea. The study is significant in that it bridges an important knowledge gap, given that it is ethno-linguistically (within the Tigrinya ethnic group), spatially (central region of Eritrea), and religiously (Coptic Christianity) specific. The conceptual framework guiding this research centered on the social determinants of mental health and explores, through the lens of critical theory, how existing systems generate social vulnerability and structural inequality, providing a platform to reveal how the psychosocial model has the capacity to emancipate and empower those with mental disorders to live productive and meaningful lives. A case study approach was employed to explore the interrelationship between religio-cultural beliefs and practices and the perception of the common mental disorders of depression, anxiety, bipolar affective disorder, schizophrenia, and post-traumatic stress disorder, and the impact of these perceptions on people with those mental disorders. Purposive sampling was used to recruit 41 participants representing seven diverse cohorts: people with common mental disorders, family caregivers, general community members, ex-fighters, priests, and staff at St. Mary’s and Biet-Mekae Community Health Center, resulting in rich data for thematic analysis. Findings highlighted that current religio-cultural perceptions of the causes and treatment of mental disorders among Coptic Tigrinya result in widespread labelling, stigma, and discrimination, both of those with mental disorders and of their families. Traditional healing sources are almost exclusively tried, sometimes for many years, before families and sufferers seek formal medical assessment and treatment, resulting in difficult-to-treat illness chronicity. Service gaps in the formal medical system result in the inability to meet the principles enshrined in the WHO Mental Health Action Plan 2013-2020, to which the Eritrean Government is a signatory. However, the study found that across all participant cohorts there was a desire for change that would create a culture in which those with mental disorders have restored hope, connectedness, healing, and self-determination.

Keywords: Coptic Tigrinya, mental disorders, psychosocial model, social integration and recovery, traditional healing

Procedia PDF Downloads 168
1941 Electron Beam Melting Process Parameter Optimization Using Multi Objective Reinforcement Learning

Authors: Michael A. Sprayberry, Vincent C. Paquit

Abstract:

Process parameter optimization in metal powder bed electron beam melting (MPBEBM) is crucial to ensure the technology's repeatability, control, and continued industry adoption. Despite continued efforts to address the challenges via traditional design of experiments and process mapping techniques, there has been little success in building an on-the-fly optimization framework that can be adapted to MPBEBM systems. Additionally, data-intensive physics-based modeling and simulation methods are difficult to sustain for every metal AM alloy or system due to cost restrictions. To mitigate the challenge of resource-intensive experiments and models, this paper introduces a Multi-Objective Reinforcement Learning (MORL) methodology that frames MPBEBM as an optimization problem. An off-policy MORL framework based on policy gradients is proposed to discover optimal sets of beam power (P) and beam velocity (v) combinations that maintain a steady-state melt pool depth and phase transformation. For this, an experimentally validated Eagar-Tsai melt pool model is used to simulate the MPBEBM environment, in which the beam acts as the agent across the P-v space, maximizing returns for the uncertain powder bed environment by producing a melt pool and phase transformation closer to the optimum. The training process yields a set of process parameters {power, speed, hatch spacing, layer depth, and preheat} in which the state (P, v) with the highest returns corresponds to a refined process parameter mapping. The resulting objects and the mapping of returns onto the P-v space show convergence with experimental observations. The framework therefore provides a model-free multi-objective approach to discovery without the need for trial-and-error experiments.
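The policy-gradient idea can be illustrated on a toy version of the problem. The sketch below replaces the Eagar-Tsai model with an invented melt-depth surrogate and optimizes a scalarized reward over a small discrete (P, v) grid; all constants and the reward weights are assumptions for illustration, not the paper's values.

```python
import math, random

# Toy surrogate for melt-pool depth (NOT the Eagar-Tsai model): depth grows
# with beam power P and shrinks with beam velocity v. Constants are invented.
def melt_depth(P, v):
    return 0.05 * P / v  # mm, hypothetical

actions = [(P, v) for P in (100, 200, 300) for v in (0.5, 1.0, 2.0)]  # W, m/s
target_depth = 10.0  # mm, assumed steady-state target

def reward(P, v, w_depth=1.0, w_speed=0.1):
    # Scalarized multi-objective reward: track the target depth, and give a
    # small bonus to higher build speed (the second, competing objective).
    return -w_depth * abs(melt_depth(P, v) - target_depth) + w_speed * v

# Softmax policy over the discrete (P, v) grid, trained with REINFORCE.
prefs = [0.0] * len(actions)

def sample(rng):
    m = max(prefs)
    ps = [math.exp(p - m) for p in prefs]
    s = sum(ps)
    ps = [p / s for p in ps]
    r, acc = rng.random(), 0.0
    for i, p in enumerate(ps):
        acc += p
        if r <= acc:
            return i, ps
    return len(ps) - 1, ps

rng = random.Random(0)
baseline, lr = 0.0, 0.2
for step in range(3000):
    i, ps = sample(rng)
    R = reward(*actions[i])
    baseline += 0.05 * (R - baseline)   # moving-average baseline
    adv = R - baseline
    for j in range(len(prefs)):         # softmax policy-gradient update
        grad = (1.0 if j == i else 0.0) - ps[j]
        prefs[j] += lr * adv * grad

best = actions[max(range(len(actions)), key=lambda j: prefs[j])]
print("best (P, v):", best, "depth:", melt_depth(*best))
```

With this surrogate, the learned policy settles on a (P, v) pair whose depth matches the target, illustrating how returns over the P-v space identify a refined parameter mapping.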

Keywords: additive manufacturing, metal powder bed fusion, reinforcement learning, process parameter optimization

Procedia PDF Downloads 74
1940 A Critique of the Neo-Liberal Model of Economic Governance and Its Application to the Electricity Market Industry: Some Lessons and Learning Points from Nigeria

Authors: Kabiru Adamu

Abstract:

The Nigerian electricity industry was deregulated and privatized in 2005 and 2014, in line with global trends and practice. International and multilateral lending institutions advised developing countries, Nigeria included, to adopt deregulation and privatization as part of reforms in their electricity sectors. The ideological basis of these reforms is traceable to neoliberalism, an ideology that believes in the supremacy of the free market and strong non-interventionist competition law, as opposed to government ownership of the electricity market. This ideology became state practice and a blueprint for the deregulation and privatization of electricity markets in many parts of the world, and that blueprint was used as the template for the privatization of the Nigerian electricity industry. To this end, this paper, using documentary analysis and a review of the academic literature, examines neoliberalism as an ideology and model of economic governance for the electricity supply industry in Nigeria. The paper examines the origin of the ideology, its features and principles, and how it was used as the blueprint in designing policies for electricity reforms in both developed and developing countries. The paper found that there is a gap between the ideology in theory and in practice: although rational in theory, it is difficult to implement in practice. The paper argues that the ideology has a mismatch effect, which has made its application in the electricity industry of many developing countries problematic and unsuccessful. In the case of Nigeria, the article argues that the template is likewise not working. The article concludes that the electricity sector in Nigeria has failed to develop into a competitive market for the benefit of consumers, contrary to the assumptions and promises of the ideology. The paper therefore recommends the democratization of the electricity sector in Nigeria through a new system of public ownership as the solution to the failure of the neoliberal policies; this requires the design of a more democratic and participatory system of ownership, with communities and state governments in charge of the administration, running, and operation of the sector.

Keywords: electricity, energy governance, neo-liberalism, regulation

Procedia PDF Downloads 148
1939 Evaluation of Coagulation Efficiency of Protein Extracts from Lupinus albus L., Moringa stenopetala Cufod., Trigonella foenum-graecum L. and Vicia faba L. for Water Purification

Authors: Neway Adele, Adey Feleke

Abstract:

Access to clean drinking water is a basic human right. However, an estimated 1.2 billion people across the world consume unclean water daily. Interest in natural coagulants has been growing as health and environmental concerns about conventional chemical coagulants rise; natural coagulants have the potential to serve as alternative water treatment agents. In this study, Lupinus albus, Moringa stenopetala, Trigonella foenum-graecum and Vicia faba protein extracts were evaluated as natural coagulants for water treatment. The protein extracts were purified from crude extracts using a protein purifier, and protein concentrations were determined spectrophotometrically. Small-volume coagulation efficiency tests were conducted on raw water taken from the Legedadi water treatment plant, using a completely randomized design (CRD) experiment with settling times of 0 min (initial time), 90 min, 180 min and 270 min and protein extract doses of 5 mg/L, 10 mg/L, 15 mg/L and 20 mg/L. Raw water as a negative control and polyelectrolyte as a positive control were also included, and optical density (OD) values were measured for all samples. At 270 min and 20 mg/L, the coagulation efficiencies of the Lupinus albus, Moringa stenopetala, Trigonella foenum-graecum and Vicia faba protein extracts were 71%, 89%, 12% and 67%, respectively, in the water sample collected in April 2019. Similarly, Lupinus albus, Moringa stenopetala and Vicia faba achieved 17%, 92% and 12% at a 270 min settling time and concentrations of 5 mg/L, 20 mg/L and 10 mg/L, respectively, in the water sample collected in August 2019. The negative control (raw water) and the polyelectrolyte (positive control) showed efficiencies of 6-10% and 89-94% at a 270 min settling time in April and August 2019, respectively. Among the four protein extracts, Moringa stenopetala showed the highest coagulation efficiency, similar to the polyelectrolyte. This study concluded that Moringa stenopetala protein extract could be used as a natural coagulant for water purification at both sampling times.
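The percentage efficiencies reported above can be computed from optical density readings. A minimal sketch, assuming the common turbidity-removal definition (initial OD minus settled OD, divided by initial OD); the readings below are invented for illustration, not the study's data.

```python
def coagulation_efficiency(od_initial, od_settled):
    """Percent of turbidity (optical density) removed after settling.

    Assumes the common definition: (OD0 - ODt) / OD0 * 100.
    """
    if od_initial <= 0:
        raise ValueError("initial OD must be positive")
    return (od_initial - od_settled) / od_initial * 100.0

# Hypothetical OD readings for one dose/settling-time combination:
print(round(coagulation_efficiency(0.90, 0.10), 1))
```

A reading that falls from 0.90 to 0.10 corresponds to roughly 89% efficiency, on the order of the Moringa stenopetala and polyelectrolyte results quoted above.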

Keywords: coagulation efficiency, extraction, natural coagulant, protein extract

Procedia PDF Downloads 47
1938 Translanguaging as a Decolonial Move in South African Bilingual Classrooms

Authors: Malephole Philomena Sefotho

Abstract:

Nowadays, the majority of people worldwide are bilingual rather than monolingual due to the surge of globalisation and mobility. Consequently, bilingual education is a topical issue of discussion among researchers. Several studies have highlighted the importance of, and need for, incorporating learners’ linguistic repertoires in multilingual classrooms and moving away from the colonial approach, which carries a monolingual bias of one language at a time. Researchers have pointed out that a systematic approach involving the concurrent use of languages, rather than a separation of languages, must be implemented in bilingual classroom settings. Translanguaging emerged as such a systematic approach: it assists learners in making meaning of their world and involves allowing learners to utilise all their linguistic resources in the classroom. The South African language policy also makes room for the use of diverse languages in bi/multilingual classrooms. This study therefore sought to explore how teachers apply translanguaging in bilingual classrooms to incorporate learners’ linguistic repertoires. It further establishes teachers’ perspectives on the use of more than one language in teaching and learning. The participants in this study were language teachers who teach at bilingual primary schools in Johannesburg, South Africa. Semi-structured interviews were conducted to establish their perceptions of the concurrent use of languages, and a qualitative research design was followed in analysing the data. The findings showed that teachers were reluctant to allow translanguaging in their classrooms even though they realise its importance. Not allowing bilingual learners to use their linguistic repertoires has resulted in learners’ negative attitudes towards their languages and has contributed to the loss of their identity. This article thus recommends a drastic change towards decolonised approaches to teaching and learning in multilingual settings, with translanguaging as a decolonial move in which learners are allowed to translanguage freely in their classroom settings for better comprehension and meaning-making of concepts and related ideas. It further proposes that continuous conversations be encouraged to bring imminent cultural and linguistic genocide to a halt.

Keywords: bilingualism, decolonisation, linguistic repertoires, translanguaging

Procedia PDF Downloads 159
1937 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measurement methods. Although it has a low spatial resolution (it can only detect when a group of neurons fires at the same time), it is non-invasive, making it easy to use without posing any risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded and then used to accomplish the intended task. EEG recordings include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, because EEG is a non-invasive method, some sources of noise may affect the reliability of its signals. For instance, noise from the EEG equipment and the leads, as well as signals originating from the subject, such as heart activity or muscle movements, affect the signals detected by the electrodes. New techniques have, however, been developed to differentiate between such noise and the intended signals. Furthermore, an EEG device alone is not enough to analyze the data from the brain for use in a BCI application. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though invasive BCIs are needed for neurological conditions that require highly precise data, non-invasive BCIs such as EEG-based ones are used in many cases to help disabled people or to ease everyday life by assisting with basic tasks. For example, EEG is used to detect an epileptic seizure before it occurs, so that a BCI device can then help prevent it. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed.
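As a toy illustration of separating an intended signal from one noise source, the sketch below synthesizes an "EEG" trace from an assumed 10 Hz alpha-band component plus 50 Hz mains interference at a 1000 Hz sampling rate (all values invented), then applies a moving-average filter whose window spans exactly one mains cycle, so the interference is cancelled while the slower component passes nearly unchanged. This is a didactic sketch, not a production EEG pipeline.

```python
import math

fs = 1000                        # sampling rate in Hz (assumed)
t = [i / fs for i in range(fs)]  # 1 second of samples

# Synthetic "EEG": 10 Hz alpha-band sine plus 50 Hz mains interference.
eeg = [math.sin(2 * math.pi * 10 * ti) + 0.5 * math.sin(2 * math.pi * 50 * ti)
       for ti in t]

def moving_average(x, n):
    # Averaging over exactly one period of a sinusoid cancels it: a 20-sample
    # window is one full 50 Hz cycle at fs = 1000 Hz, so mains hum is nulled.
    return [sum(x[i - n + 1:i + 1]) / n for i in range(n - 1, len(x))]

clean = moving_average(eeg, 20)

def component_amp(x, f):
    # Amplitude of frequency f via correlation with sin/cos (single-bin DFT).
    c = sum(xi * math.cos(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
    s = sum(xi * math.sin(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
    return 2 * math.hypot(c, s) / len(x)

print("50 Hz amplitude:", component_amp(clean, 50))
print("10 Hz amplitude:", component_amp(clean, 10))
```

Real pipelines use proper notch and band-pass filters plus artifact-rejection methods such as independent component analysis, but the principle of attenuating known interference while preserving the band of interest is the same.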

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 57
1936 Coffee Consumption and Glucose Metabolism: A Systematic Review of Clinical Trials

Authors: Caio E. G. Reis, Jose G. Dórea, Teresa H. M. da Costa

Abstract:

Objective: Epidemiological data show an inverse association between coffee consumption and the risk of type 2 diabetes mellitus. However, the clinical effects of coffee consumption on glucose metabolism biomarkers remain controversial. Thus, this paper reviews clinical trials that evaluated the effects of coffee consumption on glucose metabolism. Research Design and Methods: We identified studies published until December 2014 by searching electronic databases and reference lists. We included randomized clinical trials in which the intervention group received caffeinated and/or decaffeinated coffee, the control group received water or placebo treatments, and biomarkers of glucose metabolism were measured. The Jadad score was applied to evaluate the quality of the studies, and studies that scored ≥ 3 points were considered for the analyses. Results: Seven clinical trials (237 subjects in total) were analyzed, involving healthy, overweight, and diabetic adults. The studies were divided into short-term (1 to 3 h) and long-term (2 to 16 weeks) durations. The results of the short-term studies showed that caffeinated coffee consumption may increase the area under the curve of the glucose response, while the long-term studies indicated that caffeinated coffee may improve glycemic metabolism by reducing the glucose curve and increasing the insulin response. These results suggest that the benefits of coffee consumption occur in the long term, consistent with the reduction of type 2 diabetes mellitus risk seen in epidemiological studies. Nevertheless, until the relationship between long-term coffee consumption and type 2 diabetes mellitus is better understood and the mechanisms involved are identified, it is premature to make claims about coffee preventing type 2 diabetes mellitus. Conclusion: The findings suggest that caffeinated coffee may impair glucose metabolism in the short term, but in the long term the studies indicate a reduction of type 2 diabetes mellitus risk. More clinical trials with comparable methodology are needed to unravel this paradox.
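When trials are methodologically comparable, the standard quantitative step beyond a narrative comparison is inverse-variance pooling of per-study effects. A minimal fixed-effect sketch; the effect sizes and standard errors below are invented for illustration, not the review's data.

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance (fixed-effect) pooled estimate and its standard error.

    effects: per-study effect sizes (e.g. mean differences in glucose AUC).
    std_errors: per-study standard errors of those effects.
    """
    weights = [1.0 / se ** 2 for se in std_errors]   # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical study-level results:
effects = [0.30, 0.10, 0.25]
ses = [0.10, 0.20, 0.15]
est, se = fixed_effect_pool(effects, ses)
print(round(est, 3), round(se, 3))
```

More precise studies (smaller standard errors) dominate the pooled estimate, which is why comparable methodology across trials matters for such a synthesis.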

Keywords: coffee, diabetes mellitus type 2, glucose, insulin

Procedia PDF Downloads 447
1935 Numerical Methodology to Support the Development of a Double Chamber Syringe

Authors: Lourenço Bastos, Filipa Carneiro, Bruno Vale, Rita Marques, Joana Silva, Ricardo Freitas, Ângelo Marques, Sara Cortez, Alberta Coelho, Pedro Parreira, Liliana Sousa, Anabela Salgueiro, Bruno Silva

Abstract:

Flushing is considered an adequate technique to reduce the risk of infection during the clinical practice of venous catheterization. Nonetheless, there is still a lack of adherence to this method, in part due to the complexity of the procedure. The project SeringaDuo aimed to develop an innovative double-chamber syringe for intravenous sequential administration of drugs and serums. This device serves to improve adherence to the practice by reducing the number of manipulations needed, which also improves patient safety, and to promote the flushing practice among health professionals by simplifying the task. To assist in the development of this innovative syringe, a numerical methodology was developed and validated to predict the syringe's mechanical and flow behavior during the fluid loading and administration phases, as well as to allow evaluation of the material behavior during production. For this, three commercial numerical simulation packages were used, namely ABAQUS, ANSYS/FLUENT, and MOLDFLOW. The methodology aimed to evaluate the concepts' feasibility and to optimize the geometries of the syringe's components, creating an iterative product development process based on numerical simulations and validated by the production of prototypes. Through this methodology, it was possible to achieve a final design that fulfils all the defined characteristics and specifications. This iterative process, consisting of consecutive construction and evaluation of new concepts, is a powerful product development tool: it yields fast and accurate results and an optimized solution that fulfils all the predefined specifications and requirements, without the strict need for prototypes.

Keywords: venous catheterization, flushing, syringe, numerical simulation

Procedia PDF Downloads 147
1934 Building a Parametric Link between Mapping and Planning: A Sunlight-Adaptive Urban Green System Plan Formation Process

Authors: Chenhao Zhu

Abstract:

Quantitative mapping is playing a growing role in guiding urban planning, for example when a heat map created by CFX, CFD2000, or Envi-met is used to adjust a master plan. However, there is no effective quantitative link between the mappings and plan formation, so in many cases decision-making is still based on the planner's subjective interpretation and understanding of these mappings, which limits the improvement in scientific rigor and accuracy that quantitative mapping could bring. Therefore, this paper makes an effort to provide a methodology for building a parametric link between mapping and plan formation. A parametric planning process based on radiant mapping is proposed for creating an urban green system. In the first step, a script is written in Grasshopper to build a road network and form the blocks, while the Ladybug plug-in is used to conduct a radiant analysis in the form of a mapping. The research then transforms the radiant mapping from a polygon into a data-point matrix, because a polygon is hard to engage in design formation. Next, another script selects the main green spaces from the road network based on the criterion of radiant intensity and connects the green spaces' central points to generate a green corridor. After that, a control parameter is introduced to adjust the corridor's form based on the radiant intensity. Finally, a green system containing green space and a green corridor is generated under the quantitative control of the data matrix. The designer only needs to modify the control parameter according to the relevant research results and actual conditions to optimize the green system. This method can also be applied to many other mapping-based analyses, such as wind environment analysis, thermal environment analysis, and even environmental sensitivity analysis. The parametric link between mapping and planning will bring about more accurate, objective, and scientific planning.
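The core steps (mapping to data-point matrix, threshold-based cell selection, corridor generation from centroids) can be sketched in plain Python outside Grasshopper. The matrix values, the threshold, and the choice of selecting low-intensity cells are all invented for illustration; the actual selection criterion depends on the design intent.

```python
# Toy radiant-intensity matrix (invented values), one cell per block,
# standing in for the data-point matrix derived from the radiant mapping.
radiant = [
    [5.0, 4.8, 2.1, 4.9],
    [4.7, 1.9, 2.0, 4.6],
    [5.1, 4.9, 1.8, 4.8],
]

THRESHOLD = 2.5  # assumed cutoff separating candidate green cells

# Step 1: select candidate green cells from the data-point matrix.
candidates = [(r, c) for r, row in enumerate(radiant)
              for c, v in enumerate(row) if v < THRESHOLD]

# Step 2: connect candidate cell centers into a corridor, here simply
# chained in row-major order; a real script would route along the network.
corridor = sorted(candidates)
segments = list(zip(corridor, corridor[1:]))

print("green cells:", candidates)
print("corridor segments:", segments)
```

A control parameter analogous to the paper's would then, for example, scale THRESHOLD or the corridor width, letting the designer regenerate the green system from the same matrix.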

Keywords: parametric link, mapping, urban green system, radiant intensity, planning strategy, grasshopper

Procedia PDF Downloads 126
1933 Liposome Loaded Polysaccharide Based Hydrogels: Promising Delayed Release Biomaterials

Authors: J. Desbrieres, M. Popa, C. Peptu, S. Bacaita

Abstract:

Because of their favorable properties (non-toxicity, biodegradability, mucoadhesivity, etc.), polysaccharides have been studied as biomaterials and as pharmaceutical excipients in drug formulations. These formulations may be produced in a wide variety of forms, including hydrogels, hydrogel-based particles (or capsules), films, etc. In these formulations, polysaccharide-based materials are able to provide local delivery of loaded therapeutic agents, but the delivery can be rapid and not easily time-controllable, due in particular to the burst effect, which leads to a loss in drug efficiency and lifetime. To overcome the consequences of the burst effect, systems involving liposomes incorporated into polysaccharide hydrogels appear as promising materials for tissue engineering, regenerative medicine, and drug delivery. Liposomes are spherical self-closed structures composed of curved lipid bilayers, which enclose part of the surrounding solvent within their structure. Their simplicity of production, biocompatibility, cell-like size and composition, adjustable size for specific applications, and ability to load hydrophilic and/or hydrophobic drugs make them a revolutionary tool in nanomedicine and the biomedical domain. Drug delivery systems were developed as hydrogels containing chitosan or carboxymethylcellulose (CMC) as the polysaccharide and gelatin (GEL) as the polypeptide, with phosphatidylcholine or phosphatidylcholine/cholesterol liposomes able to accurately control this delivery without any burst effect. Hydrogels based on CMC were covalently crosslinked using glutaraldehyde, whereas chitosan-based hydrogels were doubly crosslinked (ionically using sodium tripolyphosphate or sodium sulphate, and covalently using glutaraldehyde). It has been proven that liposome integrity is highly protected during the crosslinking procedure that forms the film network. Calcein was used as the model active substance for the delivery experiments. Multilamellar vesicles (MLVs) and small unilamellar vesicles (SUVs) were prepared and compared. The liposomes are well distributed throughout the whole area of the film, and the vesicle distribution is equivalent (for both types of liposomes evaluated) on the film surface as well as deeper (100 microns) in the film matrix. An obvious decrease of the burst effect was observed in the presence of liposomes, as well as a uniform increase of calcein release that continues even at long time scales; the liposomes act as an extra barrier to calcein release. Systems containing MLVs release higher amounts of calcein than systems containing SUVs, although MLVs are more stable in the matrix and diffuse with difficulty; this difference comes from the higher quantity of calcein present within the MLVs, owing to their larger size. Modeling of the release kinetics curves was performed, and the release of hydrophilic drugs may be described by a multi-scale mechanism characterized by four distinct phases, each described by a different kinetic model (Higuchi equation, Korsmeyer-Peppas model, etc.). Knowledge of such models will be a very useful tool for designing new formulations for tissue engineering, regenerative medicine, and drug delivery systems.
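The two release models named above are simple power laws in time; a short sketch (with an arbitrary rate constant k = 0.1, not a fitted value from the study) shows that the Korsmeyer-Peppas law with exponent n = 0.5 reduces to the Higuchi model, i.e. Fickian diffusion is a special case.

```python
def higuchi(t, k):
    """Higuchi model: cumulative fraction released Q = k * sqrt(t)."""
    return k * t ** 0.5

def korsmeyer_peppas(t, k, n):
    """Korsmeyer-Peppas model: Mt/Minf = k * t^n (valid for Mt/Minf < 0.6)."""
    return k * t ** n

# With n = 0.5 the Korsmeyer-Peppas law coincides with Higuchi at all times.
for t in (1.0, 4.0, 9.0):
    assert korsmeyer_peppas(t, 0.1, 0.5) == higuchi(t, 0.1)

print(korsmeyer_peppas(4.0, 0.1, 0.5))
```

Fitting n to measured release data is what distinguishes the four phases mentioned above: n near 0.5 indicates diffusion control, while larger n points to swelling- or erosion-dominated release.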

Keywords: controlled and delayed release, hydrogels, liposomes, polysaccharides

Procedia PDF Downloads 209
1932 Hybrid Reusable Launch Vehicle for Space Application: A Novel Approach

Authors: Rajasekar Elangopandian, Anand Shanmugam

Abstract:

In order to reduce the cost of launching satellites and payloads to orbit, this project envisages a combination of technologies. This innovation comprises four concepts. The first is the flight mission profile, which describes how the mission is conducted. The conventional technique of magnetic levitation is used to produce the initial thrust. As the name 'reusable launch vehicle' states, the vehicle is designed for reuse. The vehicle consists of a miniature rocket that produces the required thrust and two JATO (jet-assisted takeoff) boosters that provide the initial boost. The vehicle resembles an airplane in design and is placed on a superconducting rail track. When a high electric current is supplied to the rail track, the vehicle begins to float according to the principle of magnetic levitation. When the vehicle reaches the required takeoff distance, the two boosters ignite, each delivering 48 kN of thrust. The vehicle then follows a vertical path to the edge of the atmosphere and into space. As soon as it attains sufficient speed, the two boosters cut off. Once in space, the inbuilt spacecraft places the satellite in the desired orbit. When its work is complete, apogee motors give the vehicle an initial kick of 22 N of thrust to re-enter the Earth's atmosphere, after which it descends in free fall under gravitational force. After the re-entry phase it adopts a spiral flight mode and lands where the superconducting levitated rail track is located; the track catches the vehicle and holds it by switching the magnet poles and varying the current. The initial cost of building this vehicle may be high, but with frequent use it is expected to cut the launch cost to roughly half that of present-day technology.
The combination of these mechanisms gives 'hybrid', the reusability gives 'reusable launch vehicle', and together they give the hybrid reusable launch vehicle.

Keywords: JATO (jet-assisted takeoff) boosters, magnetic levitation, reusable launch vehicle, apogee motors

Procedia PDF Downloads 413
1931 The Functional Roles of Right Dorsolateral Prefrontal Cortex and Ventromedial Prefrontal Cortex in Risk-Taking Behavior

Authors: Aline M. Dantas, Alexander T. Sack, Elisabeth Bruggen, Peiran Jiao, Teresa Schuhmann

Abstract:

Risk-taking behavior has been associated with activity in specific prefrontal regions of the brain, namely the right dorsolateral prefrontal cortex (rDLPFC) and the ventromedial prefrontal cortex (VMPFC). While deactivation of the rDLPFC has been shown to lead to increased risk-taking behavior, the functional relationship between VMPFC activity and risk-taking behavior has yet to be clarified. Correlational evidence suggests that the VMPFC is involved in valuation processes underlying risky choices, but evidence on the functional relationship is lacking. Therefore, this study uses brain stimulation to investigate the role of the VMPFC in risk-taking behavior and to replicate current findings regarding the role of the rDLPFC in the same phenomenon. We used continuous theta-burst stimulation (cTBS) to inhibit either the VMPFC or the rDLPFC during execution of the computerized Maastricht Gambling Task (MGT) in a within-subject design with 30 participants. We analyzed the effects of stimulation on risk-taking behavior, participants' choices of probabilities and average values, and response time. We hypothesized that, compared to sham stimulation, VMPFC inhibition leads to a reduction in risk-taking behavior by reducing the appeal of higher-value options and, consequently, the attractiveness of riskier options. rDLPFC inhibition, on the other hand, should lead to an increase in risk-taking due to a reduction in cognitive control, confirming existing findings. Stimulation of either region led to an increase in risk-taking behavior and in the average value chosen compared to sham. No significant effect on chosen probabilities was found. A significant increase in response time was observed exclusively after rDLPFC stimulation.
Our results indicate that inhibiting DLPFC and VMPFC separately leads to similar effects, increasing both risk-taking behavior and average value choices, which is likely due to the strong anatomical and functional interconnection of the VMPFC and rDLPFC.

Keywords: decision-making, risk-taking behavior, brain stimulation, TMS

Procedia PDF Downloads 93
1930 Varying Frequency Application of Vermicast as Supplemented with 19-19-19+Me in the Agronomic Performance of Lettuce (Lactuca sativa)

Authors: Jesryl B. Paulite, Eixer Niel V. Enesco

Abstract:

Lettuce is not widely grown in lowland localities of tropical countries like the Philippines. Farmers have assumed that the crop is not adapted to the lowland climate, but some new varieties can tolerate warmer conditions. The massive use of pesticides in lettuce production may chronically affect human health and the environment, and the Philippine government is moving toward organic production. One organic material is vermicompost, an organic fertilizer that serves as a soil conditioner, enhances soil fertility, and promotes vigorous and healthy crop growth. Supplementation with 19-19-19+M.E. improves it further, since it contains N-P-K and selected microelements that meet the nutritive requirements of the crop. The experiment was conducted at Purok 3, Brgy. Tiburcia, Kapalong, Davao del Norte from February 6, 2014 to March 4, 2014. The study was conducted to determine the effect of varying frequency of application of vermicast supplemented with 19-19-19+M.E. on lettuce. Specifically, it aimed to: 1) identify the agronomic performance of lettuce as affected by varying frequency of application of vermicast supplemented with 19-19-19+M.E.; and 2) assess the economic profitability of lettuce applied with vermicast supplemented with 19-19-19+M.E. The study was laid out in a Randomized Complete Block Design (RCBD) with four treatments and three replications. The treatments were: T1, untreated; T2, weekly application; T3, bi-weekly application; and T4, monthly application. Data on percent (%) mortality were square-root transformed before Analysis of Variance (ANOVA). Results revealed no significant difference in percent mortality between weekly and monthly application, with means of 1.76% and 3.09%, respectively.
However, significant differences were observed in agronomic performance: plant height, with a mean of 10.63 cm for weekly application versus 6.40 cm for the untreated; leaf width, 10.80 cm for weekly application versus 6.03 cm for the untreated; fresh weight, 25.67 g for weekly application versus 6.83 g for the untreated; and yield, 1,208.33 kg/ha for weekly application versus 327.08 kg/ha for the untreated. Results further showed that the profitability of lettuce in terms of Return on Production Cost (RPC) was 91.01% for bi-weekly, 68.20% for monthly, 25.34% for weekly, and 16.69% for the untreated (control).
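The square-root transformation and ANOVA step described above can be sketched as follows. The replicate values are hypothetical, and the F-ratio computation is a plain one-way ANOVA rather than the study's actual RCBD analysis:

```python
import math

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)   # between-treatments mean square
    ms_within = ss_within / (n - k)     # error mean square
    return ms_between / ms_within

# Hypothetical percent-mortality replicates per treatment (not the study's data)
mortality = {
    "weekly":  [1.7, 1.8, 1.8],
    "monthly": [3.0, 3.1, 3.2],
}

# Square-root transform percentage data before ANOVA, as in the study
transformed = [[math.sqrt(x) for x in reps] for reps in mortality.values()]
f_stat = one_way_anova_f(transformed)
print(f"F = {f_stat:.2f}")
```

The square-root transform is the standard variance-stabilizing step for small percentage data before the F-test.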

Keywords: agronomic performance, economic profitability, vermicast, percent mortality, 19-19-19+ME

Procedia PDF Downloads 426
1929 Design and Development of an Innovative MR Damper Based on Intelligent Active Suspension Control of a Malaysia's Model Vehicle

Authors: L. Wei Sheng, M. T. Noor Syazwanee, C. J. Carolyna, M. Amiruddin, M. Pauziah

Abstract:

This paper presents an alternative to the classical passive suspension system, revised as an active suspension system to improve comfort and handling performance. An active magnetorheological (MR) suspension system is proposed to explore active suspension and enhance performance, given its freedom to independently specify the characteristics of load carrying, handling, and ride quality. A Malaysian quarter-car model with two degrees of freedom (2DOF) is designed and constructed to simulate the actions of an active vehicle suspension system. The structure of a conventional twin-tube shock absorber is modified both internally and externally to accommodate the active suspension system. The shock absorber's peripheral structure is altered to enable assembly and disassembly of the damper through a non-permanent joint, and the stress in the designed joint is simulated using Finite Element Analysis. Simulation of the internal part, where an electrified copper coil of 24 AWG is wound, is performed using Finite Element Method Magnetics to measure the magnetic flux density inside the MR damper. The primary purpose of this approach is to reduce the vibration transmitted from road surface irregularities while maintaining solid manoeuvrability. The aim of this research is to develop an intelligent control system for a continuously adjustable damping automotive suspension system. Ride quality is improved by reducing the vertical body acceleration experienced by the car body under disturbances from speed bumps and random road roughness. Findings from this research are expected to enhance ride quality, which in turn can prevent the deteriorating effect of vibration on vehicle condition as well as on the passengers' well-being.

Keywords: active suspension, FEA, magneto rheological damper, Malaysian quarter car model, vibration control

Procedia PDF Downloads 196
1928 Dietary Diversity Practice and Associated Factors Among Hypertension Patients at Tirunesh Beijing Hospital

Authors: Wudneh Asegedech Ayele

Abstract:

Background: Dietary diversity is strongly related to non-communicable diseases (NCDs), and diet plays a key role as a risk factor for hypertension. Diets rich in fruits, vegetables, and low-fat dairy products that include whole grains, poultry, fish, and nuts, that contain only small amounts of red meat, sweets, and sugar-containing beverages, and that contain decreased amounts of total and saturated fat and cholesterol have been found to have a protective effect against hypertension. Methods: A hospital-based cross-sectional study design was employed from June 1 to June 25, 2021. Participants were selected by systematic random sampling, and data were collected by interview. Data were entered into EpiData version 3.1 and exported to SPSS version 25 for processing and analysis. Descriptive statistics were used to summarize the data, and bivariate and multivariate logistic regression were employed to determine factors associated with dietary diversity among hypertension patients. Results: Ninety-six participants (24.68%) had an adequate dietary diversity score. Most reported higher intakes of cereals, white roots and tubers, dark green leafy vegetables, vitamin A-rich fruits, meat, eggs, and coffee or tea. Hypertensive patients who did not consume cereals were about four times less likely to have adequate dietary diversity than those who did [AOR = 4.083, 95% CI (2.096-7.352)]. Hypertensive patients who did not consume white roots and tubers were about fourteen times less likely to have adequate dietary diversity than those who did [AOR = 13.733, 95% CI (5.388-34.946)]. Conclusion and recommendation: Only about one-fourth of patients reported an adequate dietary diversity score. Cereals, fruits, vegetables, and milk and milk products were statistically associated with dietary diversity practice. Health education on dietary modification and behavioral change toward dietary diversity is recommended.

Keywords: dietary diversity practice, associated factors, hypertension, Tirunesh Beijing Hospital

Procedia PDF Downloads 14
1927 Quantifying the Impact of Intermittent Signal Priority Given to BRT on Ridership and Climate: A Case Study of Ahmadabad

Authors: Smita Chaudhary

Abstract:

Traffic in India is largely uncontrolled and characterized by chaotic conditions in which lane discipline is not followed. Bus Rapid Transit (BRT) has emerged as a viable option to enhance transportation capacity and provide increased levels of mobility and accessibility. At present, many intersections in Ahmadabad face congestion and delay at signals due to transit (BRT) lanes. Most intersections, despite being signalized, are operated manually because of the conflict between BRT buses and heterogeneous traffic. Although BRTS in Ahmadabad has an exclusive lane of its own, certain limitations come with it. At many intersections, due to these conflicts, interference, and congestion, both heterogeneous traffic and transit buses suffer delays of a remarkable 3-4 minutes at each intersection, which has become an issue of great concern. There is no provision for BRT bus priority, so existing signals play little role in managing the traffic, ultimately calling for manual operation. Daily BRTS ridership has fallen sharply because commuters no longer find this transit mode time-saving; the fall in ridership leads to an increased number of private vehicles, and vehicles idling at intersections cause air and noise pollution. To bring these commuters back, transit facilities need to be improved. A classified volume count survey and a travel time and delay survey were conducted, and a revised signal design was prepared for the whole study stretch, comprising three intersections and one roundabout. One intersection was then simulated in order to observe the effect of giving priority to BRT on side-street queue length and travel time for heterogeneous traffic.
This paper aims to recommend changes to the signal cycle and the introduction of intermittent priority for transit buses, and to simulate an intersection in the study stretch with the proposed signal cycle using VISSIM, in order to make this transit amenity feasible and attractive for commuters in Ahmadabad.

Keywords: BRT, priority, ridership, signal, VISSIM

Procedia PDF Downloads 430
1926 Implementation of Dozer Push Measurement under Payment Mechanism in Mining Operation

Authors: Anshar Ajatasatru

Abstract:

The decline of coal prices over the past years has significantly increased the need for effective mining operation. Viable steps must be undertaken to become more cost competitive while striving for best mining practice, especially at Melak Coal Mine in East Kalimantan, Indonesia. This paper aims to show how an effective dozer push measurement method can be implemented, controlled by a contract rate on a unit basis of USD ($) per bcm. The method emerges from the daily dozer push activity that continually shifts the overburden until the final target design set by mine planning is reached. Volume is then calculated, each time overburden is removed within a determined distance, using the cut-and-fill method with a high-precision GNSS system fitted to the dozer as guidance to ensure the optimum result of overburden removal. The accumulated daily-to-weekly dozer push volume is found to be 95 bcm, which, multiplied by an average sell rate of $0.95, gives a monthly revenue of $90.25. The payment mechanism is then based on push distance and push grade: the push distance interval determines rates that vary from $0.90 to $2.69 per bcm and are influenced by push slope grades from -25% to +25%. The payable rates for dozer push operation follow currency adjustment and are added to the monthly overburden volume claim; the sell rate per bcm may therefore fluctuate depending on the real-time Jakarta Interbank Spot Dollar Rate (JISDOR). The results indicate that dozer push measurement can be a viable surface mining alternative, since it refines the method of work, reduces operating cost, and improves productivity, while avoiding the risk of poor rented-equipment performance.
In addition, a payment mechanism based on contract rates for scheduled dozer push operation will ultimately deliver almost 45% cost reduction to clients in the form of low and consistent costs.
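The payment mechanism above can be sketched as a small rate calculation. The distance tiers and the linear slope-grade adjustment below are illustrative assumptions within the stated $0.90-$2.69 per bcm band, not the contract's actual schedule; only the 95 bcm at $0.95 example comes from the abstract.

```python
# Hypothetical sketch of the dozer push payment mechanism described above.
def push_rate(distance_m: float, grade_pct: float) -> float:
    """Return an assumed USD/bcm rate from push distance and slope grade."""
    # Assumed base rate tiers by push distance (within $0.90-$2.69/bcm)
    if distance_m <= 50:
        base = 0.90
    elif distance_m <= 100:
        base = 1.50
    else:
        base = 2.69
    # Assumed linear adjustment for slope grade between -25% and +25%:
    # pushing uphill (+) pays more, downhill (-) less.
    return base * (1.0 + 0.01 * grade_pct)

def monthly_claim(volumes_bcm, distances_m, grades_pct):
    """Sum the payable amount over each measured push volume."""
    return sum(v * push_rate(d, g)
               for v, d, g in zip(volumes_bcm, distances_m, grades_pct))

# The abstract's example: 95 bcm at an average sell rate of $0.95/bcm
revenue = round(95 * 0.95, 2)
print(revenue)  # 90.25
```

A real claim would further apply the JISDOR currency adjustment mentioned above before invoicing.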

Keywords: contract rate, cut-fill method, dozer push, overburden volume

Procedia PDF Downloads 298
1925 A Qualitative Study on Exploring How the Home Environment Influences Eating and Physical Activity Habits of Low-Income Latino Children of Predominantly Immigrant Families

Authors: Ana Cristina Lindsay, Sherrie Wallington, Faith Lees, Mary Greaney

Abstract:

Purpose: Latino children in low-income families are at elevated risk of becoming overweight or obese. The purpose of this study was to examine low-income Latino parents’ beliefs, parenting styles, and practices related to their children’s eating and physical activity behaviors while at home. Design and Methods: Qualitative study using focus group discussions with 33 low-income Latino parents of preschool children 2 to 5 years of age. Transcripts were analyzed using thematic analysis. Results: Data analyses revealed that most parents recognize the importance of healthy eating and physical activity for their children and themselves. However, daily life demands, including conflicting schedules, long working hours, financial constraints, and neighborhood safety concerns, impact parents’ ability to create a home environment supportive of these behaviors. Conclusions: This study provides information about how the home environment influences low-income Latino preschool children’s eating and physical activity habits. This information is useful for pediatric nurses in their health promotion and disease prevention efforts with low-income Latino families with young children, and for the development of home-based and parenting interventions to prevent and control childhood obesity among this population group. Practice Implications: Pediatric nurses can facilitate communication, provide education, and offer guidance to low-income Latino parents that support their children’s development of early healthy eating and physical activity habits, while taking into account daily life barriers faced by families. Moreover, nurses can play an important role in the integration and coordination of home visitation to complement office-based visits and provide a continuum of care to low-income Latino families.

Keywords: home environment, Latino, obesity, parents, healthy eating, physical activity

Procedia PDF Downloads 272
1924 Study of the Energy Efficiency of Buildings under Tropical Climate with a View to Sustainable Development: Choice of Material Adapted to the Protection of the Environment

Authors: Guarry Montrose, Ted Soubdhan

Abstract:

In the context of sustainable development and climate change, the adaptation of buildings to the climatic context in hot climates is a necessity if we want to improve living conditions in housing and reduce the risks to the health and productivity of occupants due to thermal discomfort in buildings. A wide variety of efficient solutions exists, but at high cost. In developing countries, especially tropical ones, a technology is needed that is affordable for everyone, energy efficient, and protective of the environment. Biosourced insulation is a product based on plant fibers, animal products, or products from recyclable paper or clothing. Its development meets the objectives of maintaining biodiversity, reducing waste, and protecting the environment. In tropical or hot countries, the aim is to protect the building from solar thermal radiation, a source of discomfort. This work follows the logic of energy control and environmental protection: the approach is to make the occupants of buildings comfortable, reduce their carbon dioxide (CO2) emissions, and decrease their energy consumption (energy efficiency). We chose to study the thermophysical properties of banana leaves and sawdust, especially their thermal conductivities; direct measurements were made using the flash method and the hot plate method. We also measured the heat flow on both sides of each sample by the hot box method. The results from these different experiments show that these materials are very efficient when used as insulation. We also conducted a building thermal simulation, using banana leaves as one of the materials, in Design Builder software, with air-conditioning load and CO2 release as performance indicators.
When the air-conditioned building cell is protected on the roof by banana leaves and integrated into the walls with solar protection of the glazing, it saves up to 64.3% of energy and avoids 57% of CO2 emissions.
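The flash-method measurement mentioned above can be illustrated with the classic Parker half-rise-time relation, from which conductivity follows once density and specific heat are known. All specimen values below are hypothetical, not the study's measurements:

```python
import math

def flash_diffusivity(thickness_m: float, t_half_s: float) -> float:
    """Parker relation: alpha = 1.38 * L^2 / (pi^2 * t_half)."""
    return 1.38 * thickness_m ** 2 / (math.pi ** 2 * t_half_s)

def conductivity(alpha: float, density: float, heat_capacity: float) -> float:
    """k = alpha * rho * c_p, in W/(m.K)."""
    return alpha * density * heat_capacity

# Hypothetical specimen: 3 mm thick, half-rise time 12 s,
# density 250 kg/m^3, specific heat 1700 J/(kg.K)
alpha = flash_diffusivity(0.003, 12.0)
k = conductivity(alpha, 250.0, 1700.0)
print(f"alpha = {alpha:.2e} m^2/s, k = {k:.4f} W/mK")
```

A conductivity of a few hundredths of a W/(m.K), as this toy specimen yields, is the range in which a material performs well as thermal insulation.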

Keywords: plant fibers, tropical climates, sustainable development, waste reduction

Procedia PDF Downloads 165
1923 Assessing P0.1 and Occlusion Pressures in Brain-Injured Patients on Pressure Support Ventilation: A Study Protocol

Authors: S. B. R. Slagmulder

Abstract:

Monitoring inspiratory effort and dynamic lung stress in patients on pressure support ventilation in the ICU is important for protecting against patient self-inflicted lung injury (P-SILI) and diaphragm dysfunction. Strategies to address the detrimental effects of respiratory drive and effort can lead to improved patient outcomes. Two non-invasive estimation methods, occlusion pressure (Pocc) and P0.1, have been proposed for achieving lung and diaphragm protective ventilation. However, their relationship and interpretation in neuro ICU patients is not well understood. P0.1 is the airway pressure measured during a 100-millisecond occlusion of the inspiratory port. It reflects the neural drive from the respiratory centers to the diaphragm and respiratory muscles, indicating the patient's respiratory drive during the initiation of each breath. Occlusion pressure, measured during a brief inspiratory pause against a closed airway, provides information about the inspiratory muscles' strength and the system's total resistance and compliance. Research Objective: Understanding the relationship between Pocc and P0.1 in brain-injured patients can provide insights into the interpretation of these values in pressure support ventilation. This knowledge can contribute to determining extubation readiness and optimizing ventilation strategies to improve patient outcomes. The central goal is to assess a study protocol for determining the relationship between Pocc and P0.1 in brain-injured patients on pressure support ventilation and their ability to predict successful extubation. Additionally, comparing these values between brain-damaged and non-brain-damaged patients may provide valuable insights. Key Areas of Inquiry: 1. How do Pocc and P0.1 values correlate within brain injury patients undergoing pressure support ventilation? 2. To what extent can Pocc and P0.1 values serve as predictive indicators for successful extubation in patients with brain injuries? 3.
What differentiates the Pocc and P0.1 values between patients with brain injuries and those without? Methodology: P0.1 and occlusion pressures are standard measurements for pressure support ventilation patients, taken by attending doctors as per protocol. We utilize electronic patient records for existing data. An unpaired t-test will be conducted to compare P0.1 and Pocc values between the two study groups. Associations between P0.1 and Pocc and other study variables, such as extubation, will be explored with simple regression and correlation analysis. Depending on how the data evolve, subgroup analysis will be performed for patients with and without extubation failure. Results: While it is anticipated that neuro patients may exhibit high respiratory drive, the linkage between such elevation, quantified by P0.1, and successful extubation remains unknown. The analysis will focus on determining the ability of these values to predict successful extubation and their potential impact on ventilation strategies. Conclusion: Further research is pending to fully understand the potential of these indices and their impact on mechanical ventilation in different patient populations and clinical scenarios. Understanding these relationships can aid in determining extubation readiness and tailoring ventilation strategies to improve patient outcomes in this specific patient population. Additionally, it is vital to account for the influence of sedatives, neurological scores, and BMI on respiratory drive and occlusion pressures to ensure a comprehensive analysis.
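The planned unpaired t-test on P0.1 can be sketched as follows; the patient values are illustrative placeholders, not study data, and equal variances are assumed for simplicity:

```python
# Minimal sketch of an unpaired (two-sample) t-test comparing P0.1 between
# a brain-injured group and a control group. Values are hypothetical.
from statistics import mean, stdev

def unpaired_t(a, b):
    """Student's two-sample t statistic, pooled variance (equal variances assumed)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical P0.1 values (cmH2O) per patient
brain_injured = [2.8, 3.1, 2.5, 3.4, 2.9]
controls      = [1.6, 1.9, 1.4, 2.0, 1.7]
t = unpaired_t(brain_injured, controls)
print(f"t = {t:.2f}")
```

In practice the statistic would be compared against the t distribution with n_a + n_b - 2 degrees of freedom to obtain a p-value.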

Keywords: brain damage, diaphragm dysfunction, occlusion pressure, p0.1, respiratory drive

Procedia PDF Downloads 52
1922 Assessing Livelihood Vulnerability to Climate Change and Adaptation Strategies in Rajanpur District, Pakistan

Authors: Muhammad Afzal, Shahbaz Mushtaq, Duc-Anh-An-Vo, Kathryn Reardon Smith, Thanh Ma

Abstract:

Climate change has become one of the most challenging environmental issues in the 21st century. Climate change-induced natural disasters, especially floods, are the major factors of livelihood vulnerability, impacting millions of individuals worldwide. Evaluating and mitigating the effects of floods requires an in-depth understanding of the relationship between vulnerability and livelihood capital assets. Using an integrated approach, sustainable livelihood framework, and system thinking approach, the study developed a conceptual model of a generalized livelihood system in District Rajanpur, Pakistan. The model visualizes the livelihood vulnerability system as a whole and identifies the key feedback loops likely to influence the livelihood vulnerability. The study suggests that such conceptual models provide effective communication and understanding tools to stakeholders and decision-makers to anticipate the problem and design appropriate policies. It can also serve as an evaluation technique for rural livelihood policy and identify key systematic interventions. The key finding of the study reveals that household income, health, and education are the major factors behind the livelihood vulnerability of the rural poor of District Rajanpur. The Pakistani government tried to reduce the livelihood vulnerability of the region through different income, health, and education programs, but still, many changes are required to make these programs more effective especially during the flood times. The government provided only cash to vulnerable and marginalized families through income support programs, but this study suggests that along with the cash, the government must provide seed storage facilities and access to crop insurance to the farmers. Similarly, the government should establish basic health units in villages and frequent visits of medical mobile vans should be arranged with advanced medical lab facilities during and after the flood.

Keywords: livelihood vulnerability, rural communities, flood, sustainable livelihood framework, system dynamics, Pakistan

Procedia PDF Downloads 37
1921 Becoming Vegan: The Theory of Planned Behavior and the Moderating Effect of Gender

Authors: Estela Díaz

Abstract:

This article aims to make three contributions. First, it builds on the ethical decision-making literature by exploring factors that influence the intention to adopt veganism. Second, it studies the superiority of extended models of the Theory of Planned Behavior (TPB) for understanding the process involved in forming the intention to adopt veganism. Third, it analyzes the moderating effect of gender on the TPB, given that attitudes and behavior towards animals are gender-sensitive. No study, to our knowledge, has examined these questions. Veganism is not a diet but a political and moral stand that excludes, for moral reasons, the use of animals. Although there is growing interest in studying veganism, it remains overlooked in empirical research, especially within social psychology. The TPB has been widely used to study a broad range of human behaviors, including moral issues. Nonetheless, the TPB has rarely been applied to ethical decisions about animals and, even less, to veganism. Hence, the validity of the TPB in predicting the intention to adopt veganism remains unanswered. A total of 476 non-vegan Spanish university students (55.6% female; mean age 23.26 years, SD = 6.1) responded to online and pencil-and-paper self-report questionnaires based on previous studies. The extended TPB models incorporated two background factors: ‘general attitudes towards humanlike attributes ascribed to animals’ (AHA) (capacity for reason/emotions/suffering, moral consideration, and affect towards animals); and ‘general attitudes towards 11 uses of animals’ (AUA). SPSS 22 and SmartPLS 3.0 were used for statistical analyses. This study constructed a second-order reflective-formative model and took the multi-group analysis (MGA) approach to study gender effects. Six models of the TPB (the standard model and five competing models) were tested. No a priori hypotheses were formulated. The results gave partial support to the TPB.
Attitudes (ATTV) (β = .207, p < .001), subjective norms (SNV) (β = .323, p < .001), and perceived behavioral control (PBC) (β = .149, p < .001) had a significant direct effect on intentions (INTV). This model accounted for 27.9% of the variance in intention (R2Adj = .275) and had a small predictive relevance (Q2 = .261). However, findings from this study reveal that, contrary to what the TPB generally proposes, the effect of the background factors on intentions was not fully mediated by the proximal constructs of intentions. For instance, in the final model (Model #6), both factors had significant multiple indirect effects on INTV (β = .074, 95% C = .030, .126 [AHA:INTV]; β = .101, 95% C = .055, .155 [AUA:INTV]) and significant direct effects on INTV (β = .175, p < .001 [AHA:INTV]; β = .100, p = .003 [AUA:INTV]). Furthermore, the addition of direct paths from background factors to intentions improved the explained variance in intention (R2 = .324; R2Adj = .317) and the predictive relevance (Q2 = .300) over the base model. This supports the existing literature on the superiority of enhanced TPB models for predicting ethical issues, which suggests that moral behavior may add additional complexity to decision-making. Regarding the gender effect, MGA showed that gender only moderated the influence of AHA on ATTV (e.g., βWomen − βMen = .296, p < .001 [Model #6]). However, other observed gender differences (e.g., the explained variance of the model for intentions was always higher for men than for women, for instance, R2Women = .298; R2Men = .394 [Model #6]) deserve further consideration, especially for developing more effective communication strategies.
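The direct-versus-indirect-path logic tested above can be illustrated with a toy percentile bootstrap of an indirect effect a·b. The study itself used SmartPLS on a second-order model; the simulated data and simple-regression paths below are a deliberate simplification, not the study's model:

```python
import random

random.seed(1)
n = 300
# Simulated data: background factor X -> mediator M -> intention Y
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]
y = [0.4 * mi + 0.2 * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def ols_slope(pred, resp):
    """Simple-regression slope of resp on pred."""
    mp = sum(pred) / len(pred)
    mr = sum(resp) / len(resp)
    num = sum((p - mp) * (r - mr) for p, r in zip(pred, resp))
    den = sum((p - mp) ** 2 for p in pred)
    return num / den

def indirect_effect(xs, ms, ys):
    a = ols_slope(xs, ms)   # path a: X -> M
    b = ols_slope(ms, ys)   # path b: M -> Y (simple regression; a simplification)
    return a * b

# Percentile bootstrap of the indirect effect
boot = []
for _ in range(500):
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(indirect_effect([x[i] for i in idx],
                                [m[i] for i in idx],
                                [y[i] for i in idx]))
boot.sort()
ci_low, ci_high = boot[12], boot[487]   # ~2.5th and 97.5th percentiles
print(f"95% bootstrap CI for a*b: [{ci_low:.3f}, {ci_high:.3f}]")
```

An interval excluding zero is the usual evidence for a significant indirect effect, mirroring the bootstrap confidence intervals reported above.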

Keywords: veganism, Theory of Planned Behavior, background factors, gender moderation

Procedia PDF Downloads 330
1920 Ultrasound Disintegration as a Potential Method for the Pre-Treatment of Virginia Fanpetals (Sida hermaphrodita) Biomass before Methane Fermentation Process

Authors: Marcin Dębowski, Marcin Zieliński, Mirosław Krzemieniewski

Abstract:

As methane fermentation is a complex series of successive biochemical transformations, its subsequent stages are determined, to a various extent, by physical and chemical factors. A specific state of equilibrium is being settled in the functioning fermentation system between environmental conditions and the rate of biochemical reactions and products of successive transformations. In the case of physical factors that influence the effectiveness of methane fermentation transformations, the key significance is ascribed to temperature and intensity of biomass agitation. Among the chemical factors, significant are pH value, type, and availability of the culture medium (to put it simply: the C/N ratio) as well as the presence of toxic substances. One of the important elements which influence the effectiveness of methane fermentation is the pre-treatment of organic substrates and the mode in which the organic matter is made available to anaerobes. Out of all known and described methods for organic substrate pre-treatment before methane fermentation process, the ultrasound disintegration is one of the most interesting technologies. Investigations undertaken on the ultrasound field and the use of installations operating on the existing systems result principally from very wide and universal technological possibilities offered by the sonication process. This physical factor may induce deep physicochemical changes in ultrasonicated substrates that are highly beneficial from the viewpoint of methane fermentation processes. In this case, special role is ascribed to disintegration of biomass that is further subjected to methane fermentation. Once cell walls are damaged, cytoplasm and cellular enzymes are released. The released substances – either in dissolved or colloidal form – are immediately available to anaerobic bacteria for biodegradation. 
To ensure the maximal release of organic matter from dead biomass cells, disintegration processes aim to achieve a particle size below 50 μm. It has been demonstrated in many research works, and in systems operating at technical scale, that immediately after substrate ultrasonication the content of organic matter (characterized by the COD, BOD5, and TOC indices) increases in the dissolved phase of the sedimentation water. This phenomenon points to the immediate sonolysis of the solid substances contained in the biomass and to the release of cell material, and consequently to an intensification of the hydrolytic phase of fermentation. The result is a significant reduction in fermentation time and an increased production of the gaseous metabolites of anaerobic bacteria. Because the disintegration of Virginia fanpetals biomass with ultrasound, applied in order to intensify its conversion, is a novel technique, it is often underestimated by operators of agricultural biogas plants. It has, however, many advantages that have a direct impact on its technological and economic superiority over the methods of biomass conversion applied thus far. As of now, ultrasound disintegrators for biomass conversion are not mass-produced, but are built by specialized groups in scientific or R&D centers. Therefore, their quality and effectiveness are determined to a large extent by their manufacturers' knowledge and skills in the fields of acoustics and electronic engineering.
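A common way to quantify the effect of such pre-treatment, not defined in the abstract itself, is the disintegration degree based on soluble COD release: the COD released by ultrasonication relative to the COD released by a complete reference (e.g., alkaline) disintegration. A minimal sketch, with all sample values hypothetical:

```python
def disintegration_degree(cod_sonicated, cod_untreated, cod_max):
    """Disintegration degree DD (%) from soluble COD (mg/L).

    cod_sonicated -- soluble COD after ultrasonic pre-treatment
    cod_untreated -- soluble COD of the raw substrate
    cod_max       -- soluble COD after complete (reference) disintegration
    """
    return 100.0 * (cod_sonicated - cod_untreated) / (cod_max - cod_untreated)

# Hypothetical example: sonication raises soluble COD from 800 to 2300 mg/L,
# while full reference disintegration yields 5800 mg/L.
dd = disintegration_degree(2300, 800, 5800)  # 30.0 (%)
```

A higher DD indicates that more intracellular material has been made directly available to the anaerobes, which is the mechanism the abstract credits for the shorter hydrolytic phase.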

Keywords: ultrasound disintegration, biomass, methane fermentation, biogas, Virginia fanpetals

Procedia PDF Downloads 354
1919 PolyScan: Comprehending Human Polymicrobial Infections for Vector-Borne Disease Diagnostic Purposes

Authors: Kunal Garg, Louise Theusen Hermansan, Kanoktip Puttaraska, Oliver Hendricks, Heidi Pirttinen, Leona Gilbert

Abstract:

The germ theory (one infectious determinant equals one disease) has unarguably advanced our capability to diagnose and treat infectious diseases over the years. Nevertheless, the advent of technology, climate change, and volatile human behavior have brought about drastic changes in our environment, leading us to question the relevance of the germ theory today: will vector-borne disease (VBD) sufferers produce multiple immune responses when tested for multiple microbes? VBD patients producing multiple immune responses to different microbes would evidently suggest human polymicrobial infections (HPI). Current diagnostic tools have not incorporated the research findings that would aid in diagnosing patients for polymicrobial infections. This shortcoming has caused misdiagnosis at very high rates, consequently diminishing patients' quality of life due to inadequate treatment. Equipped with state-of-the-art scientific knowledge, PolyScan intends to address the pitfalls in current VBD diagnostics. PolyScan is a multiplex and multifunctional enzyme-linked immunosorbent assay (ELISA) platform that can test for numerous VBD microbes and allows simultaneous screening for multiple types of antibodies. To validate PolyScan, Lyme borreliosis (LB) and spondyloarthritis (SpA) patient groups (n = 54 each) were tested for Borrelia burgdorferi, Borrelia burgdorferi round body (RB), Borrelia afzelii, Borrelia garinii, and Ehrlichia chaffeensis against IgM and IgG antibodies. LB serum samples were obtained from Germany and SpA serum samples from Denmark under the relevant ethical approvals. The SpA group represented the chronic LB stage because reactive arthritis (an SpA subtype) in the form of Lyme arthritis is linked to LB. It was hypothesized that patients from both groups would produce multiple immune responses, which would evidently suggest HPI.
It was also hypothesized that the proportion of multiple immune responses in the SpA patient group would be significantly larger than in the LB patient group for both antibodies. It was observed that 26% of LB patients and 57% of SpA patients produced multiple immune responses, in contrast to the 33% of LB patients and 30% of SpA patients that produced solitary immune responses, when tested against IgM. Similarly, 52% of LB patients and an astounding 73% of SpA patients produced multiple immune responses, in contrast to the 30% of LB patients and 8% of SpA patients that produced solitary immune responses, when tested against IgG. Interestingly, IgM immune dysfunction was also recorded in both patient groups. Atypically, 6% of the 18% of LB patients unresponsive with the IgG antibody produced multiple immune responses with the IgM antibody. Similarly, 12% of the 19% of SpA patients unresponsive with the IgG antibody produced multiple immune responses with the IgM antibody. Thus, the results not only supported the hypotheses but also suggested that IgM may atypically prevail longer than IgG. The PolyScan concept will aid clinicians in detecting early, persistent, late, polymicrobial, and immune dysfunction conditions linked to different VBDs. PolyScan provides a paradigm shift for the VBD diagnostic industry that will drastically shorten patients' time to receive adequate treatment.
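The group contrast reported above (e.g., 52% of 54 LB patients vs. 73% of 54 SpA patients with multiple IgG responses) can be checked with a standard two-proportion z-test. The abstract does not state which significance test was used, so the sketch below is purely illustrative, with counts rounded from the reported percentages:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal tail
    return z, p_value

# IgG multiple responses: ~52% of 54 LB patients (~28) vs ~73% of 54 SpA (~39)
z, p = two_proportion_ztest(28, 54, 39, 54)
```

With these rounded counts the SpA excess comes out significant at the conventional 0.05 level, consistent with the abstract's hypothesis for IgG.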

Keywords: diagnostics, immune dysfunction, polymicrobial, TICK-TAG

Procedia PDF Downloads 314
1918 Accessing Properties of Alkali Activated Ground Granulated Blast Furnace Slag Based Self Compacting Geopolymer Concrete Incorporating Nano Silica

Authors: Guneet Saini, Uthej Vattipalli

Abstract:

In a world with an increased demand for sustainable construction, the waste product of one industry can be a boon to another in reducing the carbon footprint. The use of industrial wastes such as fly ash and ground granulated blast furnace slag has become central to curbing the use of cement, one of the major contributors of greenhouse gases. In this paper, empirical studies have been conducted to develop an alkali-activated self-compacting geopolymer concrete (GPC) using ground granulated blast furnace slag (GGBS), incorporating 2% nano-silica by weight, through evaluation of its fresh and hardened properties. An experimental investigation of six mix designs, with alkaline solution molarities of 10 M, 12 M, and 16 M and binder contents of 450 kg/m³ and 500 kg/m³, has been carried out and juxtaposed with a GPC mix design composed of a 16 M alkaline solution and a 500 kg/m³ binder content without nano-silica. The sodium silicate to sodium hydroxide ratio (SS/SH), the alkaline activator liquid to binder ratio (AAL/B), and the water to binder ratio (W/B), which significantly affect the performance and mechanical properties of GPC, were fixed at 2.5, 0.45, and 0.4, respectively. To catalyze the early-stage geopolymerisation, oven curing was performed at a temperature of 60 °C. This paper also elucidates the test results for the fresh self-compacting concrete (SCC), conducted as per the EFNARC guidelines. The mechanical tests conducted were: compressive strength after 7, 28, 56, and 90 days; flexural strength; split tensile strength after 28, 56, and 90 days; X-ray diffraction to analyze the mechanical performance; and a sorptivity test for permeability. The study revealed that the sample with a 16 M alkaline solution and a 500 kg/m³ binder content containing 2% nano-silica produced the highest compressive, flexural, and split tensile strengths of 81.33 MPa, 7.875 MPa, and 6.398 MPa, respectively, at the end of 90 days.
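From the fixed ratios, the activator quantities for any binder content follow by simple arithmetic. A sketch for the strongest mix reported (500 kg/m³ binder, SS/SH = 2.5, AAL/B = 0.45); the code only restates the ratios given in the abstract, and the NaOH solids figure assumes the usual 40 g/mol molar mass for a 16 M solution:

```python
NAOH_MOLAR_MASS = 40.0  # g/mol

def activator_quantities(binder, aal_b=0.45, ss_sh=2.5):
    """Split the alkaline activator liquid (kg/m^3 of concrete) into
    sodium silicate (SS) and sodium hydroxide (SH) solution masses."""
    aal = aal_b * binder          # total activator liquid
    sh = aal / (1 + ss_sh)        # NaOH solution share
    ss = aal - sh                 # sodium silicate solution share
    return aal, ss, sh

aal, ss, sh = activator_quantities(500)   # 225.0, ~160.7, ~64.3 kg/m^3
naoh_solids_per_litre = 16 * NAOH_MOLAR_MASS  # 640 g NaOH per litre of 16 M solution
```

This kind of back-calculation is a routine check when reproducing a geopolymer mix design from its stated ratios.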

Keywords: alkaline activator liquid, geopolymer concrete, ground granulated blast furnace slag, nano silica, self compacting

Procedia PDF Downloads 130
1917 Field Evaluation of Different Aubergine Cultivars against Infestation of Brinjal Shoot and Fruit Borer

Authors: Ajmal Khan Kassi, Humayun Javed, Muhammad Asif Aziz

Abstract:

The response of different aubergine cultivars to the brinjal shoot and fruit borer (Leucinodes orbonalis Guenee) was evaluated at the research farm of PMAS Arid Agriculture University, Rawalpindi, during 2013. Field trials were conducted in a randomized complete block design with four replications for the screening of five cultivars of brinjal (Solanum melongena L.): Short Purpal, Singhnath 666, Brinjal Long 6275, Round Brinjal 86602, and Round Egg Plant White. The cultivar Round White Brinjal showed the maximum fruit infestation (54.44%), followed by Singhnath 666 (53.19%), while the minimum fruit infestation was observed in Round Brinjal 86602 (42.39%). The cultivar Short Purpal showed the maximum larval population (0.43), followed by Round White Brinjal (0.39), while the minimum larval population was observed in Round Brinjal 86602 (0.27). It was observed that the Round Brinjal 86602 cultivar showed a comparatively minimal L. orbonalis larval population per leaf. The correlation of brinjal fruit infestation and the larval population of L. orbonalis with different environmental factors showed that average relative humidity was positively and significantly correlated with fruit infestation on the cultivars. Average precipitation showed a positive but non-significant correlation on all the cultivars except Singhnath 666, for which the correlation (0.79) was positive and significant. Average temperature showed a non-significant negative correlation with Brinjal Long 6275, Round Brinjal 86602, and Singhnath 666, but a significant negative correlation with Short Purpal and Round White Brinjal. Maximum temperature showed a significant negative correlation with all five brinjal cultivars. Minimum temperature showed a negative but non-significant correlation with all the cultivars. Consequently, based on L. orbonalis larval density and brinjal fruit infestation, Round Brinjal 86602 proved the least susceptible cultivar and Short Purpal the most susceptible.
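The weather relationships above are Pearson correlation coefficients between environmental averages and infestation records. A self-contained sketch of the computation, with entirely hypothetical sample data (the abstract does not publish the raw observations):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical average relative humidity (%) vs. fruit infestation (%)
humidity = [62, 68, 71, 75, 80, 84]
infestation = [38, 41, 45, 47, 52, 55]
r = pearson_r(humidity, infestation)  # strongly positive, as the abstract reports
```

A positive r close to 1 corresponds to the significant humidity-infestation association reported; negative values correspond to the temperature relationships.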

Keywords: evaluation, Brinjal (Solanum melongena L), Cultivars, L. orbonalis

Procedia PDF Downloads 180
1916 A Methodological Approach to Digital Engineering Adoption and Implementation for Organizations

Authors: Sadia H. Syeda, Zain H. Malik

Abstract:

As systems continue to become more complex and the interdependencies of processes and sub-systems continue to grow and transform, the need for a comprehensive method of tracking and linking the lifecycle of systems in digital form becomes ever more critical. Digital Engineering (DE) provides an approach to managing an authoritative data source that links, tracks, and updates system data as it evolves throughout the system development lifecycle. DE enables developing, tracking, and sharing system data, models, and other related artifacts in a digital environment accessible to all necessary stakeholders. The DE environment provides an integrated electronic repository that enables traceability between design, engineering, and sustainment artifacts. The primary objective of DE activities is to develop a set of integrated, coherent, and consistent system models for the program. It is envisioned to provide a collaborative information-sharing environment for various stakeholders, including operational users, acquisition personnel, engineering personnel, and logistics and sustainment personnel. Examining the processes that DE can support in the systems engineering life cycle (SELC) is a primary step in the DE adoption and implementation journey. Through an analysis of the U.S. Department of Defense's (DoD) Office of the Secretary of Defense (OSD) Digital Engineering Strategy and its implementation, together with examples of DE implementation by industry and technical organizations, this paper will describe current DE processes and best practices for implementing DE across an enterprise. This will help identify the capabilities, environment, and infrastructure needed to develop a potential roadmap for implementing DE practices consistent with an organization's business strategy. A capability maturity matrix will be provided to assess an organization's DE maturity, emphasizing how all the SELC elements interlink to form a cohesive ecosystem.
If implemented, DE can increase efficiency and improve the quality and outcomes of the systems engineering processes.

Keywords: digital engineering, digital environment, digital maturity model, single source of truth, systems engineering life-cycle

Procedia PDF Downloads 79
1915 A Quasi-Experimental Study of the Impact of 5Es Instructional Model on Students' Mathematics Achievement in Northern Province, Rwanda

Authors: Emmanuel Iyamuremye, Jean François Maniriho, Irenee Ndayambaje

Abstract:

Mathematics is the foundational enabling discipline that underpins the science, technology, and engineering disciplines. Science, technology, engineering, and mathematics (STEM) subjects are seen as the engine of socio-economic transformation. Rwanda has undertaken education reforms aimed at empowering and preparing students for real-world jobs by providing career pathways in science, technology, engineering, and mathematics related fields. Even so, performance in mathematics has remained deplorable in both formative and national examinations. Therefore, this paper explores the extent to which the engage, explore, explain, elaborate, and evaluate (5Es) instructional model contributes to students' achievement in mathematics. The present study adopted a pre-test, post-test non-equivalent control group quasi-experimental design. The 5Es instructional model was applied to the experimental group, while the control group received instruction with the conventional teaching method, for eight weeks. One researcher-made instrument, a mathematics achievement test (MAT), was used for data collection. A pre-test was given to students before the intervention to ensure that both groups had equivalent characteristics. At the end of the experimental period, the two groups underwent a post-test to ascertain the contribution of the 5Es instructional model. Descriptive statistics and analysis of covariance (ANCOVA) were used for the analysis of the study. To determine the improvement in mathematics, Hake's method of calculating normalized gain was used to analyze the pre-test and post-test scores. Results showed that students exposed to the 5Es instructional model achieved significantly better performance in mathematics than students instructed using the conventional teaching method. It was also found that the 5Es instructional model made lessons more interesting and easier, and created friendship among students.
Thus, the 5Es instructional model was recommended as a close substitute for the conventional teaching method in teaching mathematics in lower secondary schools in Rwanda.
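Hake's normalized gain measures what fraction of the available improvement a group realized: g = (post − pre) / (100 − pre) for scores expressed as percentages. A minimal sketch; the abstract does not report the actual scores, so the example values are hypothetical:

```python
def hake_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

# Hypothetical class means: pre-test 40%, post-test 70%
g = hake_gain(40, 70)  # 0.5, a "medium" gain in Hake's classification
```

Because the gain is normalized by how much room each group had to improve, it allows the experimental and control groups to be compared even if their pre-test means were not identical.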

Keywords: 5Es instructional model, achievement, conventional teaching method, mathematics

Procedia PDF Downloads 91