Search results for: significant risk factors in megaprojects
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26793


1323 Governance in the Age of Artificial Intelligence and E-Government

Authors: Mernoosh Abouzari, Shahrokh Sahraei

Abstract:

Electronic government is a way for governments to use new technology to provide people with convenient access to government information and services, to improve the quality of those services, and to offer broad opportunities to participate in democratic processes and institutions. It makes government services available to citizens around the clock, increasing people's satisfaction and their participation in political and economic activities. The expansion of e-government services and their movement towards intelligent systems can reshape the relationship between the government and citizens, as well as the elements and components of government itself. Electronic government is the product of information and communication technology (ICT); implementing it at the government level transforms the efficiency and effectiveness of government systems and the way services are delivered, which in turn raises public satisfaction on a wide scale. The main level of e-government services has now become tangible through artificial intelligence systems: recent advances in AI represent a revolution in the use of machines to support predictive decision-making and the classification of data. With deep learning tools, artificial intelligence can significantly improve the delivery of services to citizens and uplift the work of public service professionals, while also inspiring a new generation of technocrats to enter government.
This smart revolution may set aside some functions of government and change its components; concepts such as governance, policymaking, and democracy will change in the face of artificial intelligence technology, and the top-down position in governance may face serious challenges. If governments delay in adopting artificial intelligence, the balance of power will shift: private companies, as pioneers in this field, will monopolize the technology, the world order will come to depend on wealthy multinational companies, and algorithmic systems will in effect become the ruling systems of the world. At present, the revolution in information technology and biotechnology is being driven by engineers, large companies, and scientists who are rarely aware of the political implications of their decisions and who certainly do not represent anyone. It therefore seems that if liberalism, nationalism, or any other ideology wants to organize the world of 2050, it must not only make sense of artificial intelligence and complex data algorithms but also weave them into a new and meaningful narrative. The changes that artificial intelligence brings to the political and economic order will thus lead to a major shift in how all countries deal with digital globalization. In this paper, while examining the role and performance of e-government, we discuss the efficiency and application of artificial intelligence in e-government and consider the resulting developments in the new world and in the concepts of governance.

Keywords: electronic government, artificial intelligence, information and communication technology, system

Procedia PDF Downloads 80
1322 Molecular Detection of E. coli in Treated Wastewater and Well Water Samples Collected from Al Riyadh Governorate, Saudi Arabia

Authors: Hanouf A. S. Al Nuwaysir, Nadine Moubayed, Abir Ben Bacha, Islem Abid

Abstract:

Consumption of contaminated wastewater continues to cause significant health problems in both developed and developing countries. Many regulations have been implemented by different world authorities to control water quality, with coliforms used as standard indicators of water quality deterioration and, historically, as a leading health-protection concept. In this study, the European directive for the detection of Escherichia coli, ISO 9308-1, was applied to examine and monitor coliforms in water samples collected from Wadi Hanifa and neighboring wells, Riyadh governorate, Kingdom of Saudi Arabia, where the water is used for irrigation and industrial purposes. Samples were taken from different locations for 8 consecutive months; chlorine concentration, ranging from 0.1-0.4 mg/l, was determined using the DPD free chlorine HACH kit. Water samples were then analyzed following the ISO protocol, which relies on the membrane filtration technique (0.45 µm pore-size membrane filter) and TTC, a chromogenic, lactose-based medium used for the detection and enumeration of total coliforms and E. coli. Data showed that the number of bacterial isolates ranged from 60 to 300 colonies/100 ml for well and surface water samples, respectively, with the higher counts attributed to the surface samples. Organisms that apparently ferment lactose on TTC agar plates, appearing as orange colonies, were selected and additionally cultured on EMB and MacConkey agar for further differentiation between E. coli and other coliform bacteria. Two additional biochemical tests (cytochrome oxidase and indole from tryptophan) were also used to detect and differentiate E. coli from other coliforms; E.
coli was identified in an average of 5 to 7 colonies among 25 selected colonies. In addition, a more rapid, specific and sensitive molecular method, single-colony PCR targeting the hha gene, was performed to detect E. coli, giving a more accurate and time-saving identification of colonies considered presumptively to be E. coli. Comparative methodologies, such as ultrafiltration and direct DNA extraction from membrane filters (MoBio, Germany), were also applied; however, their results were not as accurate as those of membrane filtration, making the latter the technique of choice for the detection and enumeration of water coliforms, followed by a sufficiently specific enzymatic confirmatory stage.
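The colony counts above are reported per 100 ml, the convention used by ISO 9308-1 for membrane filtration. A minimal sketch of that conversion, with illustrative volumes (the abstract does not state the filtered volumes, so the numbers below are assumptions):

```python
def cfu_per_100ml(colony_count, volume_filtered_ml):
    """Convert a membrane-filter colony count to the CFU/100 ml
    convention used in water microbiology (ISO 9308-1 reporting)."""
    if volume_filtered_ml <= 0:
        raise ValueError("filtered volume must be positive")
    return colony_count * 100.0 / volume_filtered_ml

# e.g. 60 colonies from a 100 ml well-water sample -> 60 CFU/100 ml
print(cfu_per_100ml(60, 100))   # 60.0
# 75 colonies from a 25 ml surface-water aliquot -> 300 CFU/100 ml
print(cfu_per_100ml(75, 25))    # 300.0
```

Counting is done on the filter, so only the filtered volume (not the original sample size) enters the calculation.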

Keywords: coliform, cytochrome oxidase, hha primer, membrane filtration, single colony PCR

Procedia PDF Downloads 307
1321 Synthesis of MIPs towards Precursors and Intermediates of Illicit Drugs and Their Subsequent Application in a Sensing Unit

Authors: K. Graniczkowska, N. Beloglazova, S. De Saeger

Abstract:

The threat of synthetic drugs is one of the most significant current drug problems worldwide. The use of drugs of abuse has increased dramatically over the past three decades. Among them, amphetamine-type stimulants (ATS) are globally the second most widely used drugs after cannabis, exceeding the use of cocaine and heroin. ATS are potent central nervous system (CNS) stimulants, capable of inducing euphoric states similar to cocaine. Recreational use of ATS is widespread, even though irreversible damage to the CNS has been reported. ATS production also contributes to environmental pollution by discharging large volumes of liquid waste into the sewage system. There is therefore a demand for robust and sensitive sensors that can detect ATS and their intermediates in environmental water samples; a rapid and simple test is required. Because environmental water samples can be a harsh matrix, antibody-based tests cannot be applied. Molecularly imprinted polymers (MIPs), known as synthetic antibodies, were therefore chosen for this approach. MIPs are characterized by high mechanical and thermal stability and show chemical resistance over a broad pH range and in various organic or aqueous solvents. These properties make them the preferred type of receptor for the harsh conditions imposed by environmental samples. To the best of our knowledge, there are no existing MIP-based sensors for amphetamine and its intermediates, and few commercial MIPs for this application are available. The aim of this study was therefore to compare different techniques for obtaining MIPs with high specificity towards ATS and to characterize them for subsequent use in a sensing unit. MIPs against amphetamine and its intermediates were synthesized using several techniques, including electro-, thermo- and UV-initiated polymerization.
Different monomers, cross-linkers and initiators, in various ratios, were tested to obtain the best sensitivity and polymer properties. Specificity and selectivity were then compared with commercially available MIPs against amphetamine. Different linkers, such as lipoic acid, 3-mercaptopropionic acid and tyramine, were examined in combination with several immobilization techniques to select the best procedure for attaching particles to the sensor surface. These experiments allowed an optimal method to be chosen for the intended sensor application. The stability of the MIPs under extreme conditions, such as highly acidic or basic media, was determined. The results support the applicability of a MIP-based sensor for sewage-system testing.

Keywords: amphetamine-type stimulants, environment, molecularly imprinted polymers, MIPs, sensor

Procedia PDF Downloads 238
1320 Modelling Spatial Dynamics of Terrorism

Authors: André Python

Abstract:

To this day, terrorism persists as a worldwide threat, exemplified by the deadly attacks of January 2015 in Paris and the massacres perpetrated by ISIS in Iraq and Syria. In response to this threat, states deploy various counterterrorism measures, the cost of which could be reduced through effective prevention. To increase the efficiency of preventive measures, policy-makers may benefit from accurate predictive models that capture the complex spatial dynamics of terrorism at a local scale. Although empirical research at the country level has confirmed theories explaining the diffusion processes of terrorism across space and time, scholars have not yet assessed these diffusion theories at a local scale. Moreover, because scholars have not made the most of recent statistical modelling approaches, they have been unable to build predictive models that are accurate in both space and time. To address these shortcomings, this research proposes a novel approach to systematically assess theories of terrorism's diffusion at a local scale and provides a predictive model of the local spatial dynamics of terrorism worldwide. Focusing on lethal terrorist events after 9/11, this paper addresses the following question: why and how does lethal terrorism diffuse in space and time? Based on geolocated data on worldwide terrorist attacks and covariates gathered from 2002 to 2013, a binomial spatio-temporal point process is used to model the probability of terrorist attacks on a sphere (the world), the surface of which is discretised into Delaunay triangles and refined in areas of specific interest. Within a Bayesian framework, the model is fitted through integrated nested Laplace approximation (INLA), a recent fitting approach that computes fast and accurate estimates of posterior marginals.
Hence, for each location in the world, the model provides a probability of encountering a lethal terrorist attack, along with measures of volatility that indicate the model's predictability. Diffusion processes are visualised through interactive maps that highlight space-time variations in the probability and volatility of encountering a lethal attack from 2002 to 2013. Based on the previous twelve years of observation, the location and lethality of terrorist events in 2014 are accurately predicted. Throughout the global scope of this research, local diffusion processes such as escalation and relocation are systematically examined: the former describes an expansion from areas with a high concentration of lethal terrorist events (hotspots) to neighbouring areas, while the latter is characterised by changes in the location of hotspots. Controlling for geographical, economic and demographic variables, the results suggest that the diffusion processes of lethal terrorism are jointly driven by contagious and non-contagious factors operating on a local scale, as predicted by theories of diffusion. Moreover, by providing a quantitative measure of predictability, the model prevents policy-makers from making decisions based on highly uncertain predictions. Ultimately, this research may provide important complementary tools to enhance the efficiency of policies that aim to prevent and combat terrorism.
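The binomial model described above links covariates to an attack probability through a logistic function evaluated per cell of the discretised surface. A heavily simplified sketch of that likelihood structure (no spatial random field, no INLA; cell covariates and coefficients below are hypothetical):

```python
import math

def attack_probability(beta0, betas, covariates):
    """Logistic link: P(lethal attack in a cell) = 1 / (1 + exp(-eta)),
    where eta is a linear predictor over cell covariates. A stand-in for
    the paper's binomial spatio-temporal point process, without the
    latent spatial field that INLA would estimate."""
    eta = beta0 + sum(b * x for b, x in zip(betas, covariates))
    return 1.0 / (1.0 + math.exp(-eta))

def bernoulli_log_lik(y, p):
    """Log-likelihood of observed attack indicators y given cell probabilities p."""
    return sum(math.log(pi) if yi else math.log(1.0 - pi)
               for yi, pi in zip(y, p))

# Two hypothetical cells with (population density, GDP per capita) covariates
cells = [(0.9, 0.2), (0.1, 0.8)]
p = [attack_probability(-1.0, [2.0, -1.5], c) for c in cells]
print(p)                          # higher probability in the dense, poorer cell
print(bernoulli_log_lik([1, 0], p))
```

In the actual model, the linear predictor additionally carries a spatio-temporal random effect defined on the Delaunay mesh, and the posterior of all parameters is approximated with INLA rather than maximised directly.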

Keywords: diffusion process, terrorism, spatial dynamics, spatio-temporal modeling

Procedia PDF Downloads 337
1319 Differentiation of Drug Stereoisomers by Their Stereostructure-Selective Membrane Interactions as One of Pharmacological Mechanisms

Authors: Maki Mizogami, Hironori Tsuchiya, Yoshiroh Hayabuchi, Kenji Shigemi

Abstract:

Since drugs exhibit significant structure-dependent differences in activity and toxicity, differentiating them on the basis of their mechanism of action has implications for comparative drug efficacy and safety. We aimed to differentiate drug stereoisomers by the stereostructure-selective membrane interactions underlying their pharmacological and toxicological effects. Biomimetic lipid bilayer membranes were prepared with phospholipids and sterols (either cholesterol or epicholesterol) to mimic the lipid compositions of neuronal and cardiomyocyte membranes and to provide these membranes with chirality. The membrane preparations were treated with different classes of stereoisomers at clinically and pharmacologically relevant concentrations (25-200 μM), followed by fluorescence polarization measurements to determine the ability of the drugs to change the physicochemical properties of the membranes. All the tested drugs acted on lipid bilayers to increase or decrease membrane fluidity. Drug stereoisomers could not be differentiated when interacting with membranes consisting of phospholipids alone. However, they interacted stereostructure-selectively with neuro-mimetic and cardio-mimetic membranes containing 40 mol% cholesterol ((3β)-cholest-5-en-3-ol), with relative potencies of local anesthetic R(+)-bupivacaine > rac-bupivacaine > S(‒)-bupivacaine, α2-adrenergic agonist D-medetomidine > rac-medetomidine > L-medetomidine, β-adrenergic antagonist R(+)-propranolol > rac-propranolol > S(–)-propranolol, NMDA receptor antagonist S(+)-ketamine > rac-ketamine, analgesic monoterpenoid (+)-menthol > (‒)-menthol, non-steroidal anti-inflammatory S(+)-ibuprofen > rac-ibuprofen > R(‒)-ibuprofen, and bioactive flavonoid (+)-epicatechin > (‒)-epicatechin. All of these orders of membrane interactivity correlated with the orders of beneficial and adverse effects of the tested stereoisomers.
In contrast, the membranes prepared with epicholesterol ((3α)-cholest-5-en-3-ol), an epimeric form of cholesterol, reversed the rank order of membrane interactivity to S(‒)-enantiomeric > racemic > R(+)-enantiomeric bupivacaine, L-enantiomeric > racemic > D-enantiomeric medetomidine, S(–)-enantiomeric > racemic > R(+)-enantiomeric propranolol, racemic > S(+)-enantiomeric ketamine, (‒)-enantiomeric > (+)-enantiomeric menthol, R(‒)-enantiomeric > racemic > S(+)-enantiomeric ibuprofen, and (‒)-enantiomeric > (+)-enantiomeric epicatechin. The opposite sterol configuration thus allows drug molecules to interact with chiral sterol membranes enantiomer-selectively. From these comparative results, it is speculated that the 3β-hydroxyl group of cholesterol is responsible for the enantioselective interactions of drugs. In conclusion, differentiating drug stereoisomers by their stereostructure-selective membrane interactions would be useful for designing and predicting drugs with higher activity and/or lower toxicity.
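The membrane-interactivity measurements above rest on the standard steady-state fluorescence polarization formula, P = (I∥ − I⊥) / (I∥ + I⊥). A minimal sketch with hypothetical intensity readings (the abstract reports no raw intensities):

```python
def fluorescence_polarization(i_parallel, i_perpendicular):
    """Steady-state fluorescence polarization P = (I_par - I_perp) / (I_par + I_perp).
    In membrane probes, higher P indicates a more ordered (less fluid) bilayer."""
    return (i_parallel - i_perpendicular) / (i_parallel + i_perpendicular)

# Hypothetical intensities before and after drug treatment
p_control = fluorescence_polarization(150.0, 100.0)   # 0.2
p_treated = fluorescence_polarization(140.0, 110.0)   # 0.12
print(p_control, p_treated)
# A drop in P after treatment indicates membrane fluidization by the drug
print("fluidized" if p_treated < p_control else "more ordered")
```

The stereoisomer comparisons in the abstract then amount to ranking the magnitude of such polarization changes across enantiomers at matched concentrations.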

Keywords: chiral membrane, differentiation, drug stereoisomer, enantioselective membrane interaction

Procedia PDF Downloads 210
1318 Digitization and Economic Growth in Africa: The Role of Financial Sector Development

Authors: Abdul Ganiyu Iddrisu, Bei Chen

Abstract:

Digitization is the process of transforming analog material into digital form, especially for storage and use in a computer. The significant development of information and communication technology (ICT) over the past years has encouraged many researchers to investigate its contribution to promoting economic growth and reducing poverty. Yet compelling empirical evidence on the effects of digitization on economic growth remains weak, particularly in Africa, because the extant studies that explicitly evaluate the digitization-economic growth nexus are mostly reports and desk reviews. This points to an empirical knowledge gap in the literature. Hypothetically, digitization influences financial sector development, which in turn influences economic growth. Digitization has changed the financial sector and its operating environment: obstacles to access to financing, such as physical distance, minimum balance requirements, and low income flows, can be circumvented; savings have increased; micro-savers have opened bank accounts; and banks are now able to price short-term loans. This has the potential to develop the financial sector. However, empirical evidence on the digitization-financial development nexus is scarce. On the other hand, a number of studies maintain that financial sector development greatly influences the growth of economies. We therefore argue that financial sector development is one of the transmission mechanisms through which digitization affects economic growth. Employing macro country-level data from African countries and using fixed effects, random effects and Hausman-Taylor estimation approaches, this paper contributes to the literature by analysing economic growth in Africa, focusing on the role of digitization and financial sector development. First, we assess how digitization influences financial sector development in Africa.
From an economic policy perspective, it is important to identify the digitization determinants of financial sector development so that action can be taken to reduce the economic shocks associated with financial sector distortions. This nexus is rarely examined empirically in the literature. Second, we examine the effect on economic growth of domestic credit to the private sector and stock market capitalization as a percentage of GDP, used to proxy financial sector development. Digitization is represented by the volume of digital/ICT equipment imported, and GDP growth is used to proxy economic growth. Finally, we examine the effect of digitization on economic growth in the light of financial sector development. The following key results were found. First, digitalization propels financial sector development in Africa. Second, financial sector development enhances economic growth. Finally, contrary to our expectation, the results also indicate that digitalization conditioned on financial sector development tends to reduce economic growth in Africa. However, the net-effect results suggest that digitalization, overall, improves economic growth in Africa. We therefore conclude that digitalization in Africa not only develops the financial sector but also unconditionally contributes to the growth of the continent's economies.
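The fixed-effects approach mentioned above removes time-invariant country heterogeneity by demeaning each variable within a country before pooling. A minimal one-regressor sketch of that within estimator (the panel data below are hypothetical; the paper's actual estimation also covers random effects and Hausman-Taylor, which this does not attempt):

```python
from statistics import mean

def within_estimator(panel):
    """One-regressor fixed-effects (within) slope: demean x and y inside
    each country to absorb country intercepts, then run pooled OLS on the
    demeaned data. `panel` maps country -> list of (x, y) observations."""
    num = den = 0.0
    for obs in panel.values():
        xbar = mean(x for x, _ in obs)
        ybar = mean(y for _, y in obs)
        for x, y in obs:
            num += (x - xbar) * (y - ybar)
            den += (x - xbar) ** 2
    return num / den

# Hypothetical data: x = digitization index, y = GDP growth, per country.
# Both countries have within-slope 1 but very different intercepts.
panel = {
    "A": [(1.0, 2.0), (2.0, 3.0), (3.0, 4.0)],
    "B": [(1.0, 5.0), (2.0, 6.0), (3.0, 7.0)],
}
print(within_estimator(panel))   # 1.0, unaffected by the country intercepts
```

Pooled OLS on the raw data would mix the between-country level differences into the slope; demeaning removes exactly that source of bias, which is the point of the fixed-effects design.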

Keywords: digitalization, financial sector development, Africa, economic growth

Procedia PDF Downloads 121
1317 Security Issues in Long Term Evolution-Based Vehicle-To-Everything Communication Networks

Authors: Mujahid Muhammad, Paul Kearney, Adel Aneiba

Abstract:

The ability of vehicles to communicate with other vehicles (V2V), the physical (V2I) and network (V2N) infrastructures, pedestrians (V2P), etc., collectively known as V2X (Vehicle to Everything), will enable a broad and growing set of applications and services within the intelligent transport domain for improving road safety, alleviating traffic congestion and supporting autonomous driving. The telecommunication research and industry communities and standardization bodies (notably 3GPP) have approved, in Release 14, cellular connectivity to support V2X communication (known as LTE-V2X). An LTE-V2X system will combine simultaneous connectivity across existing LTE network infrastructures via the LTE-Uu interface with direct device-to-device (D2D) communications. For V2X services to function effectively, a robust security mechanism is needed to ensure legal and safe interaction among authenticated V2X entities in the LTE-based V2X architecture. The characteristics of vehicular networks, and the nature of most V2X applications, which involve human safety, make it critical to protect V2X messages from attacks that can result in catastrophically wrong decisions or actions, including ones affecting road safety. Attack vectors include impersonation, modification, masquerading, replay, man-in-the-middle (MitM), and Sybil attacks. In this paper, we focus on LTE-based V2X security and access control mechanisms. The current LTE-A security framework provides its own access authentication scheme, the AKA protocol, for mutual authentication and other essential cryptographic operations between UEs and the network. V2N systems can leverage this protocol to achieve mutual authentication between vehicles and the mobile core network.
However, this protocol faces technical challenges, such as high signaling overhead, lack of synchronization, handover delay and potential control-plane signaling overloads, as well as privacy-preservation issues, and therefore cannot satisfy the security requirements of the majority of LTE-based V2X services. This paper examines these challenges and points to possible ways in which they can be addressed. One possible solution is the implementation of a distributed peer-to-peer LTE security mechanism based on the Bitcoin/Namecoin framework, allowing security operations with minimal overhead cost, which is desirable for V2X services. The proposed architecture can ensure fast, secure and robust V2X services over the LTE network while meeting V2X security requirements.
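The AKA scheme that V2N systems can reuse is, at its core, a challenge-response protocol over a shared secret: the network proves knowledge of K with a MAC over a random challenge, and the UE proves it by returning the expected response. A toy sketch of that core idea only (the functions f1/f2 here are HMAC stand-ins; real EPS AKA adds sequence numbers, anonymity keys, the f3-f5 functions and KASME derivation, none of which is modeled):

```python
import hmac
import hashlib
import os

def f1_mac(k, rand):
    """Toy stand-in for the AKA f1 function: network authentication token."""
    return hmac.new(k, b"AUTN" + rand, hashlib.sha256).digest()

def f2_res(k, rand):
    """Toy stand-in for the AKA f2 function: UE response RES."""
    return hmac.new(k, b"RES" + rand, hashlib.sha256).digest()

def aka_round(network_k, ue_k):
    """One mutual-authentication round: True only if both sides hold the
    same long-term key K."""
    rand = os.urandom(16)                 # network challenge RAND
    autn = f1_mac(network_k, rand)        # network -> UE: RAND, AUTN
    # UE side: verify the network before answering
    if not hmac.compare_digest(autn, f1_mac(ue_k, rand)):
        return False                      # UE rejects an impersonated network
    res = f2_res(ue_k, rand)              # UE -> network: RES
    # Network side: compare RES against its expected XRES
    return hmac.compare_digest(res, f2_res(network_k, rand))

k = b"16-byte-demo-key"
print(aka_round(k, k))               # True: mutual authentication succeeds
print(aka_round(k, b"wrong-key!"))   # False: authentication fails
```

The constant-time `compare_digest` matters even in a sketch: naive byte-by-byte comparison of RES/AUTN leaks timing information an attacker could exploit.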

Keywords: authentication, long term evolution, security, vehicle-to-everything

Procedia PDF Downloads 154
1316 Support Services in Open and Distance Education: An Integrated Model of Open Universities

Authors: Evrim Genc Kumtepe, Elif Toprak, Aylin Ozturk, Gamze Tuna, Hakan Kilinc, Irem Aydin Menderis

Abstract:

Support services are significant elements for all educational institutions in general; for distance learners, however, these services are even more essential than for their traditional (face-to-face) counterparts. One of the most important reasons for this is that learners and instructors do not share the same physical environment, and distance learning settings generally require intrapersonal interactions rather than interpersonal ones. Some learners in distance learning programs feel isolated. Furthermore, some fail to develop a sense of belonging to the institution because of a lack of self-management skills, low motivation, and an unmet need for socialization, and are therefore more likely to fail or drop out of an online class. To overcome these problems, support services have emerged as a critical element of an effective and sustainable distance education system. Within the context of distance education support services, it is natural to include technology-based and web-based services and the related materials. Moreover, institutions in the education sector are expected to use information and communication technologies effectively in order to succeed in their educational activities and programs. For the sustainability of the system, an institution should provide distance education services through ICT-enabled processes that support all stakeholders in the system, particularly distance learners. In this study, we develop a model based on the current support services literature in the field of open and distance learning and on the practices of distance higher education institutions. Specifically, content analysis is used to evaluate the existing literature on distance education support services, the information published on institutional websites, and the practices of distance higher education institutions across the world.
A total of 60 institutions met the inclusion criteria, namely language option (English) and availability of materials on their websites. Six field experts contributed to a brainstorming process to develop and extract codes for the coding scheme. During the coding process, these preset and emergent codes were used to conduct the analyses. Two coders independently reviewed and coded each assigned website to ensure that all coders interpreted the data the same way and to establish inter-coder reliability. Once each web page had been included in the descriptive and relational analyses, a model of support services was developed by examining the generated codes and themes. It is believed that such a model will serve as a quality guide for future institutions as well as current ones.
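Inter-coder reliability for two independent coders, as described above, is commonly quantified with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch (the abstract does not name its reliability statistic, so kappa is an assumption, and the category labels below are hypothetical):

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders over the same items:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from each coder's marginal label frequencies."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    labels = set(coder_a) | set(coder_b)
    p_e = sum((coder_a.count(l) / n) * (coder_b.count(l) / n) for l in labels)
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical codes assigned to six website features by two coders
a = ["support", "tech", "support", "admin", "tech", "support"]
b = ["support", "tech", "admin",   "admin", "tech", "support"]
print(round(cohens_kappa(a, b), 3))   # 0.75
```

Values above roughly 0.6-0.8 are conventionally read as substantial agreement, which is why kappa (rather than raw percent agreement) is the usual basis for accepting a coding scheme.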

Keywords: support services, open education, distance learning, support model

Procedia PDF Downloads 184
1315 Immersive Environment as an Occupant-Centric Tool for Architecture Criticism and Architectural Education

Authors: Golnoush Rostami, Farzam Kharvari

Abstract:

In recent years, developments in the field of architectural education have resulted in a shift from conventional teaching methods to alternative, state-of-the-art approaches and strategies. Criticism in architecture has been a key player both in the profession and in education, but it has mostly been offered by renowned individuals. Hence, not only students and other professionals but also critics themselves may not have the opportunity to experience buildings and must rely on available 2D materials, such as images and plans, which may not support a holistic understanding and evaluation of buildings. Immersive environments, on the other hand, give students and professionals the opportunity to experience buildings virtually and to ground their evaluation in experience rather than in judgment based on 2D materials. Therefore, the aim of this study is to compare the effect of experiencing buildings in immersive environments versus through 2D drawings, including images and plans, on architecture criticism and architectural education. Three buildings with parametric brick facades were studied through 2D materials and in Unreal Engine v. 24 as an immersive environment among 22 architecture students, who were selected using convenience sampling and divided into two equal groups using simple random sampling. The study used a mixed-methods design: the quantitative section was carried out with a questionnaire, and in-depth interviews were used for the qualitative section. The questionnaire measured three constructs: privacy regulation based on Altman's theory, the sufficiency of illuminance levels in the building, and the visual quality of the view (accounting for obstructions that the facades may cause). Furthermore, participants had the opportunity to express their understanding and evaluation of the buildings in individual interviews.
Accordingly, the collected questionnaire data were analyzed using an independent t-test and descriptive analyses in IBM SPSS Statistics v. 26, and the interviews were analyzed using the content analysis method. The results of the interviews showed that the participants who experienced the buildings in the immersive environment were able to make a more thorough and precise evaluation of the buildings than those who studied them through 2D materials. Moreover, the analyses of the questionnaires revealed statistically significant differences between the two groups in the measured constructs. The outcome of this study suggests that integrating immersive environments into the profession and into architectural education as an effective and efficient tool for architecture criticism is vital, since these environments allow users to make a holistic evaluation of buildings and thus support rigorous and sound criticism.
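The independent t-test applied above compares the two groups' construct scores; with two groups of 11, the Welch variant (which does not assume equal variances) is a reasonable choice. A minimal sketch with hypothetical 5-point scores (SPSS reports both variants; the abstract does not say which was used):

```python
import math
from statistics import mean, variance

def welch_t(group1, group2):
    """Welch's t statistic and approximate degrees of freedom for two
    independent samples (unequal variances allowed)."""
    m1, m2 = mean(group1), mean(group2)
    v1, v2 = variance(group1), variance(group2)   # sample variances
    n1, n2 = len(group1), len(group2)
    se2 = v1 / n1 + v2 / n2
    t = (m1 - m2) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Hypothetical privacy-regulation scores (5-point scale), 11 students each
immersive = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 5]
flat_2d   = [3, 3, 4, 2, 3, 4, 3, 2, 3, 4, 3]
t, df = welch_t(immersive, flat_2d)
print(round(t, 2), round(df, 1))   # a large |t| suggests a group difference
```

The t value would then be compared against the t distribution with the computed df to obtain the p-value SPSS reports.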

Keywords: immersive environments, architecture criticism, architectural education, occupant-centric evaluation, pre-occupancy evaluation

Procedia PDF Downloads 117
1314 Use of Psychiatric Services and Psychotropics in Children with Atopic Dermatitis

Authors: Mia Schneeweiss, Joseph Merola

Abstract:

Atopic dermatitis (AD) is a chronic inflammatory skin condition affecting 9.6 million children under the age of 18 in the US, 3.2 million of whom suffer from severe AD. AD has significant effects on quality of life and psychiatric comorbidity in affected patients. We sought to quantify the use of psychotropic medications and mental health services in children. We used longitudinal claims data from commercially insured patients in the US between 2003 and 2016 to identify children aged 18 or younger with a diagnosis of AD associated with an outpatient or inpatient encounter. A 180-day enrollment period was required before the first diagnosis of AD. Among those diagnosed, we computed the use of psychiatric services and the dispensing of psychotropic medications during the following 6 months. Among 1.6 million children <18 years with a diagnosis of AD, most were infants (0-1 years: 17.6%), toddlers (1-2 years: 12.2%) and young children (2-4 years: 15.4%); 5.1% were in the 16-18 age group. Among younger children, 50% of patients were female; after the age of 14, about 60% were female. Among 16- to 18-year-olds, 6.4% had at least one claim with a recorded psychopathology during the 6-month baseline period: 4.6% had depression, 3.3% anxiety, 0.3% panic disorder, 0.6% psychotic disorder, and 0.1% anorexia. During the 6 months following the physician diagnosis of AD, 66% used high-potency topical corticosteroids, 3.5% used an SSRI, 0.3% an SNRI, 1.2% a tricyclic antidepressant, 1.4% an antipsychotic medication, and 5.2% an anxiolytic agent; 4.4% had an outpatient visit with a psychiatrist and 0.1% had been hospitalized with a psychiatric diagnosis. Among 14- to 16-year-olds, 4.7% had at least one claim with a recorded psychopathology during the 6-month baseline period: 3.3% had depression, 2.5% anxiety, 0.2% panic disorder, 0.5% psychotic disorder, and 0.1% anorexia.
During the 6 months following the physician diagnosis of AD, 68% used high-potency topical corticosteroids, 4.6% used an SSRI, 0.6% an SNRI, 1.5% a tricyclic antidepressant, 1.4% an antipsychotic medication, and 4.6% an anxiolytic agent; 4.7% had an outpatient visit with a psychiatrist and 0.1% had been hospitalized with a psychiatric diagnosis. Among 12- to 14-year-olds, 3.3% had at least one claim with a recorded psychopathology during the 6-month baseline period: 1.9% had depression, 2.2% anxiety, 0.1% panic disorder, 0.7% psychotic disorder, and 0.0% anorexia. During the 6 months following the physician diagnosis of AD, 67% used high-potency topical corticosteroids, 2.1% used an SSRI, 0.1% an SNRI, 0.7% a tricyclic antidepressant, 0.9% an antipsychotic medication, and 4.1% an anxiolytic agent; 3.8% had an outpatient visit with a psychiatrist and 0.05% had been hospitalized with a psychiatric diagnosis. In younger children, recorded psychopathologies were decreasingly common: ages 10-12: 2.8%; 8-10: 2.3%; 6-8: 1.3%; 4-6: 0.6%. In conclusion, there is substantial psychiatric comorbidity among children <18 years old with diagnosed atopic dermatitis in a US commercially insured population. Meaningful psychiatric medication use (>3%) starts as early as age 12.
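Each baseline percentage above is, structurally, the share of cohort patients with at least one qualifying claim in the 180-day window. A minimal sketch of that computation (the patient records and diagnosis codes below are entirely hypothetical; real claims analyses work over ICD code sets):

```python
def baseline_prevalence(claims, codes_of_interest):
    """Share of patients with >= 1 baseline-window claim carrying any code
    of interest. `claims` maps patient_id -> set of diagnosis codes seen in
    the 180-day baseline window (hypothetical structure for illustration)."""
    flagged = sum(1 for codes in claims.values() if codes & codes_of_interest)
    return flagged / len(claims)

# Toy cohort of five pediatric AD patients
claims = {
    "p1": {"AD", "DEP"},   # depression claim at baseline
    "p2": {"AD"},
    "p3": {"AD", "ANX"},   # anxiety claim at baseline
    "p4": {"AD"},
    "p5": {"AD"},
}
print(baseline_prevalence(claims, {"DEP", "ANX"}))   # 0.4
```

The 6-month follow-up utilization figures (SSRI, anxiolytic, psychiatrist visits) follow the same pattern, with dispensing or encounter records replacing diagnosis codes.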

Keywords: pediatric atopic dermatitis, psychotropic medication use, psychiatric comorbidity, claims database

Procedia PDF Downloads 163
1313 Model of Pharmacoresistant Blood-Brain Barrier In-vitro for Prediction of Transfer of Potential Antiepileptic Drugs

Authors: Emílie Kučerová, Tereza Veverková, Marina Morozovová, Eva Kudová, Jitka Viktorová

Abstract:

The blood-brain barrier (BBB) is a key element regulating the transport of substances between the blood and the central nervous system (CNS). The BBB protects the CNS from potentially harmful substances and maintains a suitable environment for nervous activity, but at the same time, it represents a significant obstacle to the entry of drugs into the CNS. Pharmacoresistant epilepsy is a form of epilepsy that cannot be suppressed using two (or more) appropriately chosen antiepileptic drugs. In many cases, pharmacoresistant epilepsy is characterized by an increased concentration of efflux pumps on the luminal side of the endothelial cells that form the BBB and an increased number of drug-metabolizing enzymes in BBB cells, thereby preventing the effective transport of antiepileptic drugs into the CNS. Currently, a number of scientific groups are focusing on the preparation and improvement of in vitro BBB models in order to study cell interactions or transport mechanisms. However, in pathological conditions such as pharmacoresistant epilepsy, there are changes in BBB structure, and current BBB models are insufficient for related research. Our goal is to develop a suitable in vitro BBB model of pharmacoresistant epilepsy and use it to test the transfer of potential antiepileptic drugs. This model is created by co-culturing immortalized human cerebral microvascular endothelial cells, human vascular pericytes, and immortalized human astrocytes. The BBB model is cultivated in a 2D transwell format, and the integrity of the barrier is verified by measuring transendothelial electrical resistance (TEER). From the current results, a contact cell arrangement, with endothelial cells cultivated on the upper side of the insert and astrocytes and pericytes co-cultivated on the lower side, has been selected as the most promising for BBB model cultivation.
The pharmacoresistance of the BBB model is achieved by long-term cultivation of endothelial cells in an increasing concentration of selected antiepileptic drugs, which should lead to increased production of efflux pumps and drug-metabolizing enzymes. The pharmacoresistant BBB model in vitro will be further used for the screening of substances that could act both as antiepileptics and at the same time as inhibitors of efflux pumps in endothelial cells. This project was supported by the Technology Agency of the Czech Republic (TACR), Personalized Medicine: Translational research towards biomedical applications, No. TN02000109 and by the Academy of Sciences of the Czech Republic (AS CR) – grant RVO 61388963.

Keywords: antiepileptic drugs, blood-brain barrier, efflux transporters, pharmacoresistance

Procedia PDF Downloads 48
1312 Development and Application of an Intelligent Masonry Modulation in BIM Tools: Literature Review

Authors: Sara A. Ben Lashihar

Abstract:

The heritage building information modelling (HBIM) of historical masonry buildings has expanded lately to meet urgent needs for conservation and structural analysis. Masonry structures are distinctive features of ancient architecture worldwide, with special cultural, spiritual, and historical significance. However, there is a research gap regarding the reliability of the HBIM modeling process for these structures. HBIM modeling of masonry structures faces significant challenges due to the inherent complexity and uniqueness of their structural systems. Most of these processes are based on tracing point clouds and rarely draw on documents, archival records, or direct observation. The results of these techniques are highly abstracted models whose accuracy does not exceed LOD 200. Masonry assemblages, especially curved elements such as arches, vaults, and domes, are generally modeled with standard BIM components or in-place models, and the brick textures are input graphically. Hence, future investigation is necessary to establish a methodology for automatically generating parametric masonry components. Such components would be developed algorithmically according to mathematical and geometric accuracy and the validity of the survey data. The main aim of this paper is to provide a comprehensive review of the state of the art of existing research on HBIM modeling of masonry structural elements and of the latest approaches to achieving parametric models that combine visual fidelity with high geometric accuracy. The paper reviewed more than 800 articles, proceedings papers, and book chapters focused on the keywords "HBIM and Masonry" from 2017 to 2021. The studies were downloaded from well-known, trusted bibliographic databases such as Web of Science, Scopus, Dimensions, and Lens. As a starting point, a scientometric analysis was carried out using VOSviewer software.
This software extracts the main keywords from these studies to retrieve the relevant works and calculates the strength of the relationships between these keywords. Subsequently, an in-depth qualitative review was carried out on the studies with the highest frequency of occurrence and the strongest links to the topic, according to the VOSviewer results. The qualitative review focused on the latest approaches and the future directions proposed in these studies. The findings of this paper can serve as a valuable reference for researchers and BIM specialists seeking to build more accurate and reliable HBIM models of historic masonry buildings.
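The keyword co-occurrence logic that such scientometric tools apply can be illustrated with a minimal sketch. The keyword lists below are invented for illustration only and are not the corpus of this review; the "link strength" here is the simple shared-paper count:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author-keyword lists, one list per paper (illustrative only).
papers = [
    ["HBIM", "masonry", "point cloud"],
    ["HBIM", "masonry", "parametric modeling"],
    ["HBIM", "point cloud", "LOD"],
    ["masonry", "parametric modeling"],
]

# Keyword frequency of occurrence across the corpus.
occurrence = Counter(kw for kws in papers for kw in kws)

# Pairwise co-occurrence: one unit of link strength per paper sharing the pair.
cooccurrence = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        cooccurrence[(a, b)] += 1
```

Real tools additionally normalize these raw counts and cluster the resulting network, but the frequency/link-strength bookkeeping is the same.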

Keywords: HBIM, masonry, structure, modeling, automatic, approach, parametric

Procedia PDF Downloads 149
1311 Multicollinearity and MRA in Sustainability: Application of the Raise Regression

Authors: Claudia García-García, Catalina B. García-García, Román Salmerón-Gómez

Abstract:

Much economic-environmental research includes the analysis of possible interactions by using Moderated Regression Analysis (MRA), a specific application of multiple linear regression analysis. This methodology analyzes how the effect of one independent variable is moderated by a second independent variable by adding a cross-product term between them as an additional explanatory variable. Due to the very specification of the methodology, the moderating term is often highly correlated with its constitutive terms, so serious multicollinearity problems arise. The appearance of strong multicollinearity in a model has important consequences: the variances of the estimators may be inflated; regressors may appear individually non-significant even when they probably are relevant, alongside a very high coefficient of determination; coefficients may show incorrect signs; and the results become highly sensitive to small changes in the dataset. Finally, the strong relationship among explanatory variables makes it difficult to isolate the individual effect of each one on the model under study. Carried over to moderated analysis, these consequences may imply that it is not worthwhile to include an interaction term that may be distorting the model. Thus, it is important to manage the problem with a methodology that yields reliable results. A review of the works applying MRA in the ten top journals of the field makes clear that multicollinearity is mostly disregarded: less than 15% of the reviewed works take potential multicollinearity problems into account. To overcome the issue, this work studies the possible application of recent methodologies to MRA. In particular, raise regression is analyzed.
This methodology mitigates collinearity from a geometrical point of view: the collinearity problem arises because the variables under study are very close geometrically, so by separating the variables, the problem can be mitigated. Raise regression keeps the available information and modifies the problematic variables instead of, for example, deleting them. Furthermore, the global characteristics of the initial model are maintained (sum of squared residuals, estimated variance, coefficient of determination, global significance test, and prediction). The proposal is applied to data from countries of the European Union for the last available year on greenhouse gas emissions, per capita GDP, and a dummy variable representing the topography of the country. The use of a dummy variable as the moderator is a special variant of MRA, sometimes called "subgroup regression analysis." The main conclusion of this work is that applying new techniques to the field can substantially improve the results of the analysis. In particular, raise regression mitigates severe multicollinearity problems, so the researcher can rely on the interaction term when interpreting the results of a particular study.
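The geometric idea can be sketched numerically on a simplified two-regressor model: the problematic regressor is shifted along its component orthogonal to the other regressor, which leaves the column space (and hence the global fit and R²) unchanged while deflating the variance inflation factor. The simulated data and the raising factor below are illustrative choices, not the paper's EU dataset or its selection procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.10 * rng.normal(size=n)   # nearly collinear with x1
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

def r_squared(X, y):
    """Coefficient of determination of an OLS fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

def vif(a, b):
    """Variance inflation factor for a two-regressor model: 1 / (1 - r^2)."""
    r = np.corrcoef(a, b)[0, 1]
    return 1.0 / (1.0 - r ** 2)

# Component of x2 orthogonal to x1: the residual of x2 regressed on x1.
X1 = np.column_stack([np.ones(n), x1])
coef, *_ = np.linalg.lstsq(X1, x2, rcond=None)
e = x2 - X1 @ coef

lam = 3.0                     # raising factor (ad hoc choice for this sketch)
x2_raised = x2 + lam * e      # "raise" x2 geometrically away from x1

# Same column space, so the global fit is preserved...
r2_orig = r_squared(np.column_stack([np.ones(n), x1, x2]), y)
r2_raised = r_squared(np.column_stack([np.ones(n), x1, x2_raised]), y)

# ...while the collinearity diagnostic improves.
vif_orig, vif_raised = vif(x1, x2), vif(x1, x2_raised)
```

Because `x2_raised` is a linear combination of the original columns, the fitted values are identical; only the attribution of variance between the correlated regressors changes.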

Keywords: multicollinearity, MRA, interaction, raise

Procedia PDF Downloads 88
1310 Congenital Diaphragmatic Hernia Outcomes in a Low-Volume Center

Authors: Michael Vieth, Aric Schadler, Hubert Ballard, J. A. Bauer, Pratibha Thakkar

Abstract:

Introduction: Congenital diaphragmatic hernia (CDH) is a condition characterized by the herniation of abdominal contents into the thoracic cavity requiring postnatal surgical repair. Previous literature suggests improved CDH outcomes at high-volume regional referral centers compared to low-volume centers. The purpose of this study was to examine CDH outcomes at Kentucky Children's Hospital (KCH), a low-volume center, compared to the Congenital Diaphragmatic Hernia Study Group (CDHSG). Methods: A retrospective chart review was performed at KCH from 2007-2019 for neonates with CDH, subdivided into two cohorts: those requiring ECMO therapy and those not requiring ECMO therapy. Basic demographic data and measures of mortality and morbidity, including ventilator days and length of stay, were compared to the CDHSG. Measures of morbidity for the ECMO cohort, including duration of ECMO, clinical bleeding, intracranial hemorrhage, sepsis, need for continuous renal replacement therapy (CRRT), need for sildenafil at discharge, timing of surgical repair, and total ventilator days, were collected. Statistical analysis was performed using IBM SPSS Statistics version 28. One-sample t-tests and the one-sample Wilcoxon signed-rank test were utilized as appropriate. Results: There were a total of 27 neonatal patients with CDH at KCH from 2007-2019; 9 of the 27 required ECMO therapy. Birth weight and gestational age were similar between KCH and the CDHSG (2.99 kg vs 2.92 kg, p=0.655; 37.0 weeks vs 37.4 weeks, p=0.51). About half of the patients were inborn in both cohorts (52% vs 56%, p=0.676). The KCH cohort had significantly more Caucasian patients (96% vs 55%, p<0.001). Unadjusted mortality was similar in both groups (KCH 70% vs CDHSG 72%, p=0.857). Using ECMO utilization (KCH 78% vs CDHSG 52%, p=0.118) and need for surgical repair (KCH 95% vs CDHSG 85%, p=0.060) as proxies for severity, both groups' mortality was comparable.
No significant difference was noted for pulmonary outcomes such as average ventilator days (KCH 43.2 vs. CDHSG 17.3, p=0.078) and home oxygen dependency (KCH 44% vs. CDHSG 24%, p=0.108). Average length of hospital stay for patients treated at KCH was similar to the CDHSG (64.4 vs 49.2 days, p=1.000). Conclusion: Our study suggests that outcomes in CDH patients are independent of a center's case volume. Management of CDH with a standardized approach in a low-volume center can yield similar outcomes. These data support the treatment of patients with CDH at low-volume centers as opposed to transferring them to higher-volume centers.
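The benchmark comparisons above follow a one-sample pattern: a single center's observations are tested against a fixed registry value. A minimal sketch with synthetic lengths of stay and an illustrative benchmark figure (not the study's patient-level data, which are not public):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical lengths of stay (days) for one small center's cohort of 27.
center_los = rng.normal(loc=64.0, scale=30.0, size=27).round(1)

# Registry benchmark treated as a fixed population value to test against.
registry_mean_los = 49.2

# One-sample t-test: does the center's mean differ from the registry value?
t_stat, p_value = stats.ttest_1samp(center_los, popmean=registry_mean_los)
```

With a cohort this small and this variable, large mean differences can still fail to reach significance, which is why the abstract also reports a non-parametric alternative (the Wilcoxon signed-rank test) for skewed measures.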

Keywords: ECMO, case volume, congenital diaphragmatic hernia, congenital diaphragmatic hernia study group, neonate

Procedia PDF Downloads 77
1309 Structural-Lithological Conditions of Formation of Epithermal Gold Sulphide Satellite Deposits in the North Part of Chovdar Ore Area

Authors: Nabat Gojaeva, Mikayil Naghiyev, Sultan Jafarov, Gular Mikayilova

Abstract:

The Chovdar ore area is located at the contact of the Dashkesan caldera and the Shamkir horst-graben uplift, which, in terms of regional tectonics, comprises the central part of the Lok-Karabakh island arc of the South Caucasus metallogenic province. One of the main structural controls on the formation of the Mereh and Aghyokhush group of low-sulfidation epithermal gold deposits, located in the northern peripheral part of the ore area, is the intersection of ore-hosting and ore-forming Pan-Caucasian-trending, structurally compound faults with meridional, rhombically shaped faults. Another significant feature is the temporally two- or three-stage ore formation. In the first stage, an early phase of Upper Bathonian age, sulfides are the dominant minerals; in the second stage, the late 'productive' phase of Upper Bathonian age, mainly gold mineralization is formed. In addition, rarely encountered Cu-polymetallic ore formations of Upper Jurassic - Lower Cretaceous age are documented. Finally, in the last stage, re-dislocation of mineralization into the previously formed mineralization areas is envisaged. The faults, along their strike and dip directions, formed shearing, brecciation, and sulfide mineralization aureoles and hydrothermal alteration zones in the wall rocks, along with local depression blocks. The geological-structural analysis of the area shows that multiple, morphogenetically varied volcano-tectonic fault systems have developed in the area. These fault systems have acted as traps for ore formation at the intersections of the faults mentioned above. Thus, in the referred parts, the predominance of felsic volcanism and the metasomatic alteration (silicification, argillic alteration, etc.) of the wall rocks and of the products of this volcanism account for the channeling of hydrothermal ore-forming fluids along these faults.
A temporal and lithological-structural connection can be established between ore formation and the local depression blocks and faults that bound the products of felsic volcanism, based on the replacement of hydrothermal alteration zones by relatively low-temperature metasomatic alterations when moving from the felsic cores toward the margins, and on the non-ore-bearing character of the intermediate and intermediate-felsic magmatic facies.

Keywords: Aghyokhush, fault, gold deposit, Mereh

Procedia PDF Downloads 208
1308 Fresh Amnion Membrane Grafting for the Regeneration of Skin in Full Thickness Burn in Newborn - Case Report

Authors: Priyanka Yadav, Umesh Bnasal, Yashvinder Kumar

Abstract:

The placenta is an important structure that provides oxygen and nutrients to the growing fetus in utero. It is usually discarded after birth, but it has a therapeutic role in tissue regeneration. It is covered by the amniotic membrane, which can easily be separated into the amnion layer and the chorion layer; the amnion layer acts as a biofilm for the healing of burn wounds and non-healing ulcers. The freshly collected membrane contains stem cells, cytokines, and growth factors and has anti-inflammatory properties, which support its action as a biofilm for wound healing. It functions as a barrier that prevents heat and water loss and protects against bacterial contamination, thus supporting the healing process. The application of amnion membranes has been used successfully for wound care and reconstructive purposes for decades. It is a very cheap and simple procedure and has shown results superior to allografts and xenografts. However, there are very few case reports of amnion membrane grafting in newborns; we intend to highlight its therapeutic importance in burn injuries in newborns. We present a case of a 9-day-old male neonate who presented to the neonatal unit of Maulana Azad Medical College with a complaint of fluid-filled blisters and burn wounds on the body for six days. He was born outside the hospital at 38 weeks of gestation to a 24-year-old primigravida mother by vaginal delivery. The presentation was cephalic, and the amniotic fluid was clear. His birth weight was 2800 g, and APGAR scores were 7 and 8 at 1 and 5 minutes, respectively. His anthropometry was appropriate for gestational age. He developed respiratory distress after birth, requiring oxygen support by nasal prongs for three days. On day of life three, he developed blisters on his body, starting from the face, then over the back and perineal region.
At presentation on day of life nine, he had blisters and necrotic wounds on the right side of the face, back, right shoulder, and genitalia, affecting 60% of the body surface area with full-thickness loss of skin. He was started on intravenous antibiotics and fluid therapy. Pus culture grew Pseudomonas aeruginosa, for which culture-specific antibiotics were started. A plastic surgery referral was obtained, and regular wound dressing was done with antiseptics. He had a stormy course during the hospital stay. On day of life 35, when the baby was hemodynamically stable, amnion membrane grafting was done on the wound site. For the grafting, a fresh amnion membrane was removed under sterile conditions from a placenta obtained by caesarean section. It was then transported within half an hour in a sterile fluid to the plastic surgery unit, where the graft was applied over the infant's wound. Amnion membrane grafting was done twice in two weeks to cover the whole wound area. After successful uptake of the amnion membrane, skin from the thigh region was autografted over the whole wound area by the Meek technique in a single setting. The uptake of the autograft was excellent, and most of the areas healed. In some areas, there was patchy regeneration of skin, so dressing was continued. The infant was discharged after three months of hospital stay and was later followed up in the plastic surgery unit of the hospital.

Keywords: amnion membrane grafting, autograft, meek technique, newborn, regeneration of skin

Procedia PDF Downloads 152
1307 Real-Time Quantitative Polymerase Chain Reaction Assay for the Detection of microRNAs Using Bi-Directional Extension Sequences

Authors: Kyung Jin Kim, Jiwon Kwak, Jae-Hoon Lee, Soo Suk Lee

Abstract:

MicroRNAs (miRNAs) are a class of endogenous, single-stranded, small, non-protein-coding RNA molecules, typically 20-25 nucleotides long. They are thought to broadly regulate the expression of other genes by binding to the 3'-untranslated regions (3'-UTRs) of specific mRNAs. The detection of miRNAs is very important for understanding the function of these molecules and for the diagnosis of a variety of human diseases. However, miRNA detection is very challenging because of their short length and the high sequence similarity within miRNA families, so a simple-to-use, low-cost, and highly sensitive method for the detection of miRNAs is desirable. In this study, we demonstrate a novel bi-directional extension (BDE) assay. In the first step, a specific linear RT primer is hybridized to 6-10 base pairs at the 3'-end of a target miRNA molecule and then reverse transcribed to generate a cDNA strand. After reverse transcription, the cDNA was hybridized at its 3'-end to the BDE sequence, and the product served as the PCR template. The template was amplified in a SYBR Green-based quantitative real-time PCR. To prove the concept, we used human brain total RNA. It could be detected quantitatively over a range of seven orders of magnitude with excellent linearity and reproducibility. To evaluate the performance of the BDE assay, we compared its sensitivity and specificity against a commercially available poly(A) tailing method, using let-7e miRNA extracted from A549 human epithelial lung cancer cells. The BDE assay performed well compared with the poly(A) tailing method in terms of specificity and sensitivity: the CT values differed by 2.5, and the melting curve was sharper than that of the poly(A) tailing method. We have demonstrated an innovative, cost-effective BDE assay that allows improved sensitivity and specificity in the detection of miRNAs.
The dynamic range of the SYBR Green-based RT-qPCR for miR-145 was quantitative over 7 orders of magnitude, from 0.1 pg to 1.0 μg of human brain total RNA. Finally, the BDE assay for the detection of miRNA species such as let-7e shows good performance compared with the poly(A) tailing method in terms of specificity and sensitivity. Thus, BDE provides a simple, low-cost, and highly sensitive assay for various miRNAs and should contribute significantly to research on miRNA biology and to disease diagnostics with miRNAs as targets.
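A dynamic-range claim like this is conventionally backed by a standard curve: CT is regressed on log10 of the input amount, and the amplification efficiency follows from the slope (E = 10^(-1/slope) - 1, with a slope of about -3.32 corresponding to 100% efficiency). The dilution series below mirrors the reported 0.1 pg to 1.0 μg range, but the CT values are hypothetical, not the study's measurements:

```python
import numpy as np

# Illustrative 10-fold dilution series, 0.1 pg to 1.0 ug (= 1e6 pg) of
# total RNA, with hypothetical CT values (~3.3-3.4 cycles per decade).
input_pg = np.array([1e-1, 1e0, 1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
ct = np.array([36.8, 33.4, 30.1, 26.7, 23.3, 20.0, 16.6, 13.2])

# Standard curve: CT vs. log10(input).
slope, intercept = np.polyfit(np.log10(input_pg), ct, deg=1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to 100%

# Linearity over the whole range (R^2 of the standard curve).
pred = slope * np.log10(input_pg) + intercept
ss_res = np.sum((ct - pred) ** 2)
ss_tot = np.sum((ct - ct.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

An R² close to 1 across all dilutions is what "excellent linearity over seven orders of magnitude" amounts to in practice.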

Keywords: bi-directional extension (BDE), microRNA (miRNA), poly (A) tailing assay, reverse transcription, RT-qPCR

Procedia PDF Downloads 150
1306 Method of Complex Estimation of Text Perusal and Indicators of Reading Quality in Different Types of Commercials

Authors: Victor N. Anisimov, Lyubov A. Boyko, Yazgul R. Almukhametova, Natalia V. Galkina, Alexander V. Latanov

Abstract:

Modern commercials presented on billboards, TV, and the Internet contain a lot of information about the product or service in text form. However, this information cannot always be perceived and understood by consumers. Typical sociological focus-group studies often cannot reveal important features of how information read in text messages is interpreted and understood. In addition, there is no reliable method to determine the degree of understanding of the information contained in a text. The mere fact that a text was viewed does not mean that the consumer has perceived and understood its meaning. At the same time, tools based on marketing analysis allow only indirect estimation of the process of reading and understanding a text. Therefore, the aim of this work is to develop a valid method of recording objective indicators in real time for assessing the fact of reading and the degree of text comprehension. Psychophysiological parameters recorded during reading can form the basis for such an objective method. We studied the relationship between multimodal psychophysiological parameters and text comprehension during reading using correlation analysis. We used eye-tracking technology to record eye movement parameters as an estimate of visual attention, electroencephalography (EEG) to assess cognitive load, and polygraphic indicators (skin-galvanic reaction, SGR) that reflect the emotional state of the respondent during reading. We revealed reliable interrelations between perceiving the information and the dynamics of psychophysiological parameters while reading the text in commercials. Eye movement parameters reflected the difficulties respondents experienced in perceiving ambiguous parts of the text. EEG dynamics in the alpha band were related to the cumulative effect of cognitive load. SGR dynamics were related to the emotional state of the respondent, the meaning of the text, and the type of commercial.
Together, EEG and polygraph parameters also reflected the mental difficulties of respondents in understanding the text and showed significant differences between cases of low and high text comprehension. We also revealed differences in psychophysiological parameters across commercial types (static vs. video; financial vs. cinema vs. pharmaceutics vs. mobile communication, etc.). Conclusions: Our methodology allows multimodal evaluation of text perusal and of the quality of text reading in commercials. In general, our results indicate the possibility of designing an integral model that estimates comprehension of a commercial text on a percentage scale based on all observed markers.
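The correlation-screening step of such a multimodal analysis can be sketched as follows. The per-respondent features below are synthetic and only illustrate the analysis pattern (each modality contributes features whose linear relationship with a comprehension score is screened); they are not the study's recordings:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40  # hypothetical number of respondents

# Comprehension score: % of questions about the text answered correctly.
comprehension = rng.uniform(0, 100, size=n)

# Synthetic multimodal features (illustrative units and relationships):
fixation_ms = 250 - 1.2 * comprehension + rng.normal(0, 10, n)      # eye tracking
alpha_power = 0.5 + 0.004 * comprehension + rng.normal(0, 0.05, n)  # EEG alpha band
sgr_amp = rng.normal(1.0, 0.2, n)                                   # unrelated by design

features = {"fixation_ms": fixation_ms,
            "alpha_power": alpha_power,
            "sgr_amp": sgr_amp}

# Pearson correlation of each feature with the comprehension score.
corrs = {name: float(np.corrcoef(vals, comprehension)[0, 1])
         for name, vals in features.items()}
```

Features with strong correlations would then be candidates for the integral comprehension model the abstract proposes.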

Keywords: reading, commercials, eye movements, EEG, polygraphic indicators

Procedia PDF Downloads 148
1305 Assessing Acceptability and Preference of Printed Posters on COVID-19 Related Stigma: A Post-Test Study Among HIV-Focused Health Workers in Greater Accra Region of Ghana

Authors: Jerry Fiave, Dacosta Aboagye, Stephen Ayisi-Addo, Mabel Kissiwah Asafo, Felix Osei-Sarpong, Ebenezer Kye-Mensah, Renee Opare-Otoo

Abstract:

Background: Acceptability and preference of social and behaviour change (SBC) materials by target audiences is an important determinant of effective health communication outcomes. In Ghana, however, pre-test and post-test studies on the acceptability and preference of specific SBC materials for specific audiences are rare. The aim of this study was therefore to assess the acceptability and preference of printed posters on COVID-19 related stigma as suitable SBC materials for health workers, to influence behaviours that promote the uptake of HIV-focused services. Methods: A total of 218 health workers who provide HIV-focused services were purposively sampled in 16 polyclinics in the Greater Accra region of Ghana where the posters were distributed. Data were collected in March 2021 using an adapted self-administered questionnaire in Google Forms, deployed to participants via WhatsApp. The data were imported into SPSS version 27, where chi-square tests and regression analyses were performed to establish the association, and the strength of association, between variables. Results: A total of 142 participants (physicians, nurses, midwives, lab scientists, health promoters, disease control officers), comprising 85 (60%) females and 57 (40%) males, responded to the questionnaire, giving a response rate of 65.14%. Only 88 (61.97%) of the respondents were exposed to the posters. The majority of those exposed said the posters were informative [82 (93.18%)], relevant [85 (96.59%)], and attractive [83 (94.32%)]. They [82 (93.20%)] also rated the material as acceptable, with no statistically significant association between category of health worker and acceptability of the posters (χ²=1.631, df=5, p=0.898). However, participants' most preferred channels for material on COVID-19 related stigma were social media [38 (26.76%)], television [33 (23.24%)], SMS [19 (13.38%)], and radio [18 (12.70%)].
Clinical health workers were 4.88 times more likely to prefer online or electronic versions of SBC materials than nonclinical health workers [AOR=4.88 (95% CI=0.31-0.98), p=0.034]. Conclusions: Printed posters on COVID-19 related stigma are acceptable SBC materials for communicating behaviour change messages targeting health workers to promote the uptake of HIV-focused services. Posters are, however, not among the most preferred materials for health workers. It is therefore recommended that material assessment studies be conducted to inform the development of acceptable and preferred materials for target audiences.
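The association test reported above (six health-worker categories against acceptability, giving df = 5) follows the standard contingency-table recipe. A minimal sketch with hypothetical counts chosen only to illustrate the test, not reconstructed from the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: physician, nurse, midwife, lab scientist, health promoter,
# disease control officer. Columns: rated acceptable, not acceptable.
# Counts are hypothetical and illustrative only.
table = np.array([
    [14, 1],
    [30, 2],
    [12, 1],
    [10, 1],
    [9,  1],
    [7,  1],
])

# Chi-square test of independence between worker category and acceptability.
stat, p_value, dof, expected = chi2_contingency(table)
```

With a 6x2 table the degrees of freedom are (6-1)(2-1) = 5, matching the df reported in the abstract; a large p-value indicates no detectable association between category and acceptability.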

Keywords: acceptability, AIDS, HIV, posters, preference, SBC, stigma, social and behaviour change communication

Procedia PDF Downloads 80
1304 Boiler Ash as a Reducer of Formaldehyde Emission in Medium-Density Fiberboard

Authors: Alexsandro Bayestorff da Cunha, Débora Caline de Mello, Camila Alves Corrêa

Abstract:

In the production of fiberboards, an adhesive based on urea-formaldehyde resin is used, which has the advantages of low cost, homogeneous distribution, solubility in water, high reactivity in an acid medium, and high adhesion to wood. Its disadvantages are low resistance to humidity and the release of formaldehyde. The objective of the study was to determine the viability of adding industrial boiler ash to the urea-formaldehyde-based adhesive for the production of medium-density fiberboard. The raw material was composed of Pinus spp. fibers, urea-formaldehyde resin, paraffin emulsion, ammonium sulfate, and boiler ash. The experimental plan, consisting of 8 treatments, was completely randomized with a factorial arrangement: 0%, 1%, 3%, and 5% ash added to the adhesive, with and without the application of a catalyst. In each treatment, 4 panels were produced with a density of 750 kg·m⁻³, dimensions of 40 x 40 x 1.5 cm, 12% urea-formaldehyde resin, and 1% paraffin emulsion, hot pressed at a temperature of 180 ºC and a pressure of 40 kgf·cm⁻² for 10 minutes. The different compositions of the adhesive were characterized in terms of viscosity, pH, gel time, and solids content, and the panels by their physical and mechanical properties, in addition to evaluation using an IMAL DPX300 X-ray densitometer and formaldehyde emission by the perforator method. The results showed a significant reduction of all adhesive properties with the use of the catalyst, regardless of the treatment, while increasing the ash content increased the average values of viscosity, gel time, and solids and reduced the pH for the panels with a catalyst; for the panels without a catalyst, the behavior was the opposite, with the exception of solids.
For the physical properties, the results for density, compaction ratio, and thickness were equivalent and in accordance with the standard, while the moisture content was significantly reduced by the use of the catalyst, without influence of the ash percentage. The density profile for all treatments was characteristic of medium-density fiberboard, with surfaces more compacted and denser than the central layer. Thickness swelling was not influenced by the catalyst or the use of ash, presenting average values within the normalized parameters. For the mechanical properties, ash in the adhesive had a negative influence on the modulus of rupture from 1% onward and on the tensile test from 3% onward; however, only this last property, at 3% and 5%, fell below the minimum limit of the standard. The use of the catalyst and of ash at 3% and 5% reduced the formaldehyde emission of the panels; however, only the panels whose adhesive contained the catalyst presented emissions below 8 mg of formaldehyde per 100 g of panel. Thus, boiler ash can be added at up to 1% to the adhesive with a catalyst without impairing the technological properties.

Keywords: reconstituted wood panels, formaldehyde emission, technological properties of panels, perforator

Procedia PDF Downloads 51
1303 Design Thinking and Project-Based Learning: Opportunities, Challenges, and Possibilities

Authors: Shoba Rathilal

Abstract:

High unemployment rates and a shortage of experienced and qualified employees appear to be a paradox that currently plagues most countries worldwide. In a developing country like South Africa, the rate of unemployment is reported to be approximately 35%, the highest recorded globally. At the same time, a countrywide deficit of experienced and qualified potential employees is reported in South Africa, causing fierce rivalry among firms. Employers report that graduates are very rarely able to meet the demands of the job, as there are gaps in their knowledge and conceptual understanding, and in the other 21st-century competencies, attributes, and dispositions required to successfully negotiate the multiple responsibilities of employees in organizations. In addition, the rates of unemployment and the suitability of graduates appear to be skewed by race and social class, the continued effects of a legacy of inequitable educational access. Higher education in the current technologically advanced and dynamic world needs to serve as an agent of transformation, aspiring to develop graduates who are creative, flexible, critical, and possessed of entrepreneurial acumen. This requires a re-envisioning of the selection, sequencing, and pacing of learning, teaching, and assessment in higher education curricula and pedagogy. At a particular higher education institution in South Africa, Design Thinking (DT) and Project-Based Learning (PBL) are being adopted as two approaches that aim to enhance the student experience through the provision of a "distinctive education" that brings together disciplinary knowledge, professional engagement, technology, innovation, and entrepreneurship. These methodologies require students to solve applied, real-world problems using various forms of knowledge and to find innovative solutions that can result in new products and services.
The intention is to promote the development of skills for self-directed learning, facilitate the development of self-awareness, and contribute to students being active partners in the application and production of knowledge. These approaches emphasize active and collaborative learning, teamwork, conflict resolution, and problem-solving through effective integration of theory and practice. In principle, both these approaches are extremely impactful. However, at the institution in this study, the implementation of PBL and DT was not as “smooth” as anticipated. This presentation reports on the analysis of the implementation of these two approaches within higher education curricula at a particular university in South Africa. The study adopts a qualitative case study design. Data were generated through surveys, evaluation feedback at workshops, and content analysis of project reports, and were analyzed using document, content, and thematic analysis. Initial analysis shows that the forces constraining the implementation of PBL and DT range from the capacity of both staff and students to engage with DT and PBL, to the contextual realities of higher education institutions, administrative processes, and resources. At the same time, the implementation of DT and PBL was enabled through the allocation of strategic funding and capacity development workshops. These enablers, however, could not achieve maximum impact. In addition, recommendations on how DT and PBL could be adapted for differing contexts will be explored.

Keywords: design thinking, project based learning, innovative higher education pedagogy, student and staff capacity development

Procedia PDF Downloads 60
1302 The Effects of Collaborative Videogame Play on Flow Experience and Mood

Authors: Eva Nolan, Timothy Mcnichols

Abstract:

Gamers collectively spend over 3 billion hours a week playing video games, which is arguably not nearly enough time to indulge in the many benefits gaming has to offer. Much of the previous research on video gaming is centered on the effects of playing violent video games and the negative impacts they have on the individual. However, there is a dearth of research in the area of non-violent video games, specifically the emotional and cognitive benefits playing non-violent games can offer individuals. Current research in the area of video game play suggests there are many benefits to playing for an individual, such as decreasing symptoms of depression, decreasing stress, increasing positive emotions, inducing relaxation, decreasing anxiety, and particularly improving mood. One suggestion as to why video games may offer such benefits is that they possess ideal characteristics to create and maintain flow experiences: the subjective state of heightened, absorbed engagement an individual attains while performing a task in which a balance of challenge and skill is found. Many video games offer a platform for collaborative gameplay, which can enhance the emotional experience of gaming through the feeling of social support and social inclusion. The present study was designed to examine the effects of collaborative gameplay and flow experience on participants’ perceived mood. To investigate this phenomenon, a between-subjects design was used in which forty participants were randomly divided into two groups engaging in either solo or collaborative gameplay. Each group contained an even number of frequent and non-frequent gamers. Each participant played ‘The Lego Movie Videogame’ on the PlayStation 4 console. Each participant’s levels of flow experience and perceived mood were measured with the Flow State Scale (FSS) and the Positive and Negative Affect Schedule (PANAS). The following research hypotheses were investigated: (i.) 
participants in the collaborative gameplay condition will experience higher levels of flow experience and higher levels of mood than those in the solo gameplay condition; (ii.) participants who are frequent gamers will experience higher levels of flow experience and higher levels of mood than non-frequent gamers; and (iii.) there will be a significant positive relationship between flow experience and mood. If these hypotheses are supported, the findings would suggest that engaging in collaborative gameplay can be beneficial for an individual’s mood and that experiencing a state of flow can also enhance an individual’s mood. Hence, collaborative gaming can be beneficial in promoting positive emotions (higher levels of mood) by engaging an individual’s flow state.
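The planned analyses, a group comparison plus a flow-mood correlation, can be sketched in a few lines. The formulas below are generic textbook statistics, not the authors' actual analysis pipeline, and the sample scores are invented for illustration:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances allowed)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def pearson_r(x, y):
    """Pearson correlation between paired scores, e.g. flow (FSS) and mood (PANAS)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented FSS flow scores for collaborative vs. solo players (hypothesis i)
collab = [4.1, 4.5, 3.9, 4.8, 4.2]
solo = [3.5, 3.8, 3.2, 4.0, 3.6]
t = welch_t(collab, solo)  # positive t favours the collaborative group

# Invented paired flow and mood scores (hypothesis iii)
r = pearson_r([4.1, 4.5, 3.9, 4.8, 4.2], [32, 35, 30, 38, 33])
```

The t statistic would then be referred to a t distribution (or the analysis run as an ANOVA) to obtain p-values; the sketch only shows the direction of each effect.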

Keywords: collaborative gameplay, flow experience, mood, games, positive emotions

Procedia PDF Downloads 319
1301 Anaerobic Co-Digestion of Pressmud with Bagasse and Animal Waste for Biogas Production Potential

Authors: Samita Sondhi, Sachin Kumar, Chirag Chopra

Abstract:

The increase in population has resulted in excessive feedstock production, which has in turn led to the accumulation of large amounts of waste from different sources such as crop residues, industrial waste, and municipal solid waste. This situation has raised the problem of waste disposal. A parallel problem, the depletion of natural fossil fuel resources, has motivated the generation of alternative energy from the waste of different industries to resolve the two issues concurrently. Biogas is a carbon-neutral fuel with applications in transportation, heating, and power generation. India has an agriculture-based economy, and agro-residues are a significant source of organic waste. The sugarcane industry, the second largest agro-based industry, produces large quantities of sugar along with waste byproducts such as bagasse, press mud, vinasse, and wastewater. Currently, no efficient disposal methods have been adopted at large scale. In line with sustainability objectives, anaerobic digestion can be considered a method to treat organic wastes. Press mud is a lignocellulosic biomass; prior investigations indicated that it has potential for biogas production, but because of its biological and elemental complexity, mono-digestion was not successful. Owing to its imbalanced C/N ratio and wax content, it must be co-digested with another fibrous material to be digested properly under suitable conditions. In the first batch of mono-digestion of press mud, biogas production was low. Co-digestion of press mud with bagasse, which has the desired C/N ratio, will therefore be performed to optimize the mixing ratio for maximum biogas yield from press mud. In addition, with respect to sustainability, the main considerations are the economic value of the product and environmental concerns. 
The work is designed in such a way that the waste from the sugar industry will be digested for maximum biogas generation, and the digestate remaining after digestion will be characterized for its use as a bio-fertilizer for soil conditioning. Given the effectiveness demonstrated by the studied mono-digestion and co-digestion setups, this approach can be considered a viable alternative for lignocellulosic waste disposal and for agricultural applications. Biogas produced from press mud can be used either for power generation or for transportation. In addition, the work initiated towards the development of waste disposal for energy production will demonstrate the economic sustainability of the process.
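The C/N balancing step behind co-digestion can be illustrated with a small calculation. The carbon and nitrogen contents used here are assumed round numbers for press mud and bagasse, not measurements from this study; a minimal sketch of solving for the blend fraction that hits a target C/N ratio:

```python
def mix_for_target_cn(c1, n1, c2, n2, target):
    """
    Dry-mass fraction x of substrate 2 (e.g. bagasse) to blend with
    substrate 1 (e.g. press mud) so the mix reaches a target C/N ratio.
    c_i, n_i: carbon and nitrogen content (% of dry mass).
    Solves (c1*(1-x) + c2*x) / (n1*(1-x) + n2*x) = target for x.
    """
    num = target * n1 - c1
    den = (c2 - c1) - target * (n2 - n1)
    return num / den

# Assumed compositions: press mud ~35% C, 2.3% N (C/N ~ 15);
# bagasse ~45% C, 0.45% N (C/N ~ 100). Target C/N ~ 30 for stable digestion.
x = mix_for_target_cn(35.0, 2.3, 45.0, 0.45, 30.0)
print(round(x, 3))  # fraction of bagasse in the dry-mass blend
```

The closed-form solution comes from requiring the blended carbon and nitrogen masses to satisfy the target ratio; the optimal target itself would still be confirmed experimentally, as the abstract describes.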

Keywords: anaerobic digestion, carbon neutral fuel, press mud, lignocellulosic biomass

Procedia PDF Downloads 156
1300 Optimization of Geometric Parameters of Microfluidic Channels for Flow-Based Studies

Authors: Parth Gupta, Ujjawal Singh, Shashank Kumar, Mansi Chandra, Arnab Sarkar

Abstract:

Microfluidic devices have emerged as indispensable tools across various scientific disciplines, offering precise control and manipulation of fluids at the microscale. Their efficacy in flow-based research, spanning engineering, chemistry, and biology, relies heavily on the geometric design of microfluidic channels. This work introduces a novel approach to optimize these channels through Response Surface Methodology (RSM), departing from the conventional practice of addressing one parameter at a time. Traditionally, optimizing microfluidic channels involved isolated adjustments to individual parameters, limiting the comprehensive understanding of their combined effects. In contrast, our approach considers the simultaneous impact of multiple parameters, employing RSM to efficiently explore the complex design space. The outcome is an innovative microfluidic channel that consumes an optimal sample volume and minimizes flow time, enhancing overall efficiency. The relevance of geometric parameter optimization in microfluidic channels extends significantly into biomedical engineering. The flow characteristics of porous materials within these channels depend on many factors, including fluid viscosity, environmental conditions (such as temperature and humidity), and specific design parameters like sample volume, channel width, channel length, and substrate porosity. This intricate interplay directly influences the performance and efficacy of microfluidic devices, which, if not optimized, can lead to increased costs and errors in disease testing and analysis. In the context of biomedical applications, the proposed approach addresses the critical need for precision in fluid flow. It mitigates manufacturing costs associated with trial-and-error methodologies by optimizing multiple geometric parameters concurrently. The resulting microfluidic channels offer enhanced performance and contribute to a streamlined, cost-effective process for testing and analyzing diseases. 
A key highlight of our methodology is its consideration of the interconnected nature of geometric parameters. For instance, the volume of the sample, when optimized alongside channel width, length, and substrate porosity, creates a synergistic effect that minimizes errors and maximizes efficiency. This holistic optimization approach ensures that microfluidic devices operate at their peak performance, delivering reliable results in disease testing.
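The multi-parameter optimization described above can be sketched with a second-order response-surface fit. The factors, ranges, and flow-time responses below are invented for illustration; the study's actual design used Minitab, so this NumPy least-squares fit is only a minimal stand-in for the RSM workflow:

```python
import numpy as np

# Invented screening data: channel width (mm), channel length (mm) -> flow time (s)
X = np.array([[0.2, 10], [0.2, 20], [0.4, 10], [0.4, 20], [0.3, 15],
              [0.3, 10], [0.3, 20], [0.2, 15], [0.4, 15]])
y = np.array([42.0, 55.0, 38.0, 48.0, 35.0, 37.0, 46.0, 44.0, 39.0])

w, l = X[:, 0], X[:, 1]
# Full second-order RSM model: y = b0 + b1*w + b2*l + b3*w^2 + b4*l^2 + b5*w*l
A = np.column_stack([np.ones_like(w), w, l, w**2, l**2, w * l])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(width, length):
    """Flow time predicted by the fitted response surface."""
    return coef @ np.array([1.0, width, length, width**2, length**2, width * length])

# Search the fitted surface for the settings that minimise flow time
grid = [(wi, li) for wi in np.linspace(0.2, 0.4, 21) for li in np.linspace(10, 20, 21)]
best = min(grid, key=lambda p: predict(*p))
```

A real RSM study would use a designed experiment (e.g. central composite design) and check model adequacy before trusting the fitted optimum; the sketch shows only the fit-then-optimize structure.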

Keywords: microfluidic device, Minitab, statistical optimization, response surface methodology

Procedia PDF Downloads 39
1299 Leptospira Lipl32-Specific Antibodies: Therapeutic Property, Epitopes Characterization and Molecular Mechanisms of Neutralization

Authors: Santi Maneewatchararangsri, Wanpen Chaicumpa, Patcharin Saengjaruk, Urai Chaisri

Abstract:

Leptospirosis is a globally neglected disease that continues to be a significant public health and veterinary burden, with millions of cases reported each year. Early and accurate differential diagnosis of leptospirosis from other febrile illnesses and the development of broad-spectrum leptospirosis vaccines are needed. The LipL32 outer membrane lipoprotein is a member of the Leptospira adhesive matrix proteins and has been found to exert hemolytic activity against erythrocytes in vitro. Therefore, LipL32 is regarded as a potential target for diagnosis, for broad-spectrum leptospirosis vaccines, and for passive immunotherapy. In this study, we established LipL32-specific mouse monoclonal antibodies, mAbLPF1 and mAbLPF2, and their respective mouse- and humanized-engineered single chain variable fragments (ScFv). The antibodies' neutralizing activities against Leptospira-mediated hemolysis in vitro and the therapeutic efficacy of the mAbs in hamsters infected with heterologous Leptospira were demonstrated. The epitope peptide of mAbLPF1 was mapped to a non-contiguous carboxy-terminal β-turn and amphipathic α-helix of the LipL32 structure contributing to phospholipid/host cell adhesion and membrane insertion. We found that the mAbLPF2 epitope was located on the interacting loop of the peptide-binding groove of the LipL32 molecule responsible for interactions with host constituents. The epitope sequences are highly conserved among Leptospira spp. and are absent from the LipL32 superfamily of other microorganisms. Both epitopes are surface-exposed, readily accessible by mAbs, and immunogenic. However, they are less dominant when probed with LipL32-specific immunoglobulins from leptospirosis-patient sera and rabbit hyperimmune serum raised against whole Leptospira. Our study also demonstrated mAb-mediated inhibition of LipL32 adhesion to host membrane components and cells, as well as the anti-hemolytic activity of the respective antibodies. 
The therapeutic antibodies, particularly the humanized ScFv, have potential for further development as non-drug therapeutic agents for human leptospirosis, especially in subjects allergic to antibiotics. The epitope peptides recognized by the two therapeutic mAbs have potential use as tools for structure-function studies. Finally, the protective peptides may be used as targets for epitope-based vaccines for the control of leptospirosis.

Keywords: leptospira lipl32-specific antibodies, therapeutic epitopes, epitopes characterization, immunotherapy

Procedia PDF Downloads 283
1298 Biology and Life Fertility of the Cabbage Aphid, Brevicoryne brassicae (L) on Cauliflower Cultivars

Authors: Mandeep Kaur, K. C. Sharma, P. L. Sharma, R. S. Chandel

Abstract:

Cauliflower is an important vegetable crop grown throughout the world and is attacked by a large number of insect pests at various stages of crop growth. Amongst them, the cabbage aphid, Brevicoryne brassicae (Linnaeus) (Hemiptera: Aphididae), is an important insect pest. Continued feeding by both nymphs and adults of this aphid causes yellowing, wilting, and stunting of plants. Amongst various management practices, the use of resistant cultivars is important and can be an effective method of reducing the population of this aphid. It is therefore imperative to have a complete record of the various biological parameters and life tables on specific cultivars. The biology and life fertility of the cabbage aphid were studied on five cauliflower cultivars, viz. Megha, Shweta, K-1, PSB-1 and PSBK-25, under controlled conditions of 20 ± 2°C, 70 ± 5% relative humidity and a 16:8 h (light:dark) photoperiod. For studying biology, apterous viviparous adults were picked from the laboratory cultures on all five cauliflower cultivars after rearing them for at least two generations and placed individually on plants of the respective cauliflower cultivars grown in pots, with ten replicates each. Daily records of the duration of the nymphal period, adult longevity, mortality at each stage, and the total number of progeny produced per female were made. These biological data were further used to construct a life fertility table for each cultivar. Statistical analysis showed that there was a significant difference (P < 0.05) between the different growth stages and the mean number of nymphs laid. The maximum and minimum growth periods were observed on the Shweta and Megha (at par with K-1) cultivars, respectively. The maximum number of nymphs was laid on the Shweta cultivar (26.40 nymphs per female) and the minimum on Megha (at par with K-1) (15.20 nymphs per female). 
The true intrinsic rate of increase (rm) was found to be maximum on Shweta (0.233 nymphs/female/day), followed by PSBK-25 (0.207 nymphs/female/day), PSB-1 (0.203 nymphs/female/day), Megha (0.166 nymphs/female/day) and K-1 (0.153 nymphs/female/day). The finite rate of natural increase (λ) was also found to be in the order K-1 < Megha < PSB-1 < PSBK-25 < Shweta, whereas the doubling time (DT) was in the order K-1 > Megha > PSB-1 > PSBK-25 > Shweta. Aphids reared on the K-1 cultivar had the lowest values of rm and λ and the highest value of DT, whereas aphids on the Shweta cultivar had the highest values of rm and λ and the lowest value of DT. On the basis of these studies, the K-1 cultivar was found to be the least suitable and the Shweta cultivar the most suitable for cabbage aphid population growth. Although the cauliflower cultivars used in different parts of the world may differ, the results of the present study indicate that the use of cultivars that affect multiplication rate and reproductive parameters could be a good solution for the management of the cabbage aphid.
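The reported parameters rm, λ, and DT are linked by standard life-table relations (λ = e^rm, DT = ln 2 / rm), with rm obtained from the Euler-Lotka equation. A minimal sketch with invented cohort data, since the study's actual age schedules are not reproduced in the abstract:

```python
import math

def life_table_params(ages, lx, mx):
    """
    Intrinsic rate of increase rm, finite rate of increase λ, and doubling
    time DT from an age-specific life table: ages x (days), age-specific
    survival lx, and age-specific fecundity mx.
    rm solves the Euler-Lotka equation sum(exp(-rm*x) * lx * mx) = 1,
    found here by bisection (the left side decreases as rm grows).
    """
    def euler(r):
        return sum(math.exp(-r * x) * l * m for x, l, m in zip(ages, lx, mx)) - 1.0
    lo, hi = -1.0, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if euler(mid) > 0:
            lo = mid
        else:
            hi = mid
    rm = (lo + hi) / 2
    lam = math.exp(rm)        # finite rate of increase, λ = e^rm
    dt = math.log(2) / rm     # doubling time, DT = ln 2 / rm
    return rm, lam, dt

# Invented cohort: reproduction from day 8 to day 13 with declining survival
ages = [8, 9, 10, 11, 12, 13]
lx = [1.0, 1.0, 0.9, 0.9, 0.8, 0.6]
mx = [2.0, 3.0, 3.0, 2.5, 2.0, 1.0]
rm, lam, dt = life_table_params(ages, lx, mx)
```

Because λ and DT are monotone functions of rm, the opposite orderings of λ and DT across cultivars reported above follow directly from the ordering of rm.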

Keywords: biology, cauliflower, cultivars, fertility

Procedia PDF Downloads 168
1297 Secondary Prisonization and Mental Health: A Comparative Study with Elderly Parents of Prisoners Incarcerated in Remote Jails

Authors: Luixa Reizabal, Inaki Garcia, Eneko Sansinenea, Ainize Sarrionandia, Karmele Lopez De Ipina, Elsa Fernandez

Abstract:

Although the effects of incarceration in prisons close to prisoners’ and their families’ residences have been studied, little is known about the effects of remote incarceration. The present study shows the impact of secondary prisonization on the mental health of elderly parents of Basque prisoners who are incarcerated in prisons located far away from the prisoners’ and their families’ residences. Secondary prisonization refers to the effects that the imprisonment of a family member has on relatives. In the study, psychological effects are analyzed by means of comparative methodology. Specifically, levels of psychopathology (depression, anxiety, and stress) and positive mental health (psychological, social, and emotional well-being) are studied in a sample of parents over 65 years old of prisoners incarcerated in prisons located a long distance from the Basque Country (some less than 400 km away, others farther than 400 km). The dataset consists of data collected through a questionnaire and from a spontaneous speech recording. The statistical and automatic analyses show that the levels of psychopathology and positive mental health of elderly parents of prisoners incarcerated in remote jails are affected by the incarceration of their sons or daughters. Concretely, these parents show higher levels of depression, anxiety, and stress, and lower levels of emotional (but not psychological or social) well-being, than parents with no imprisoned daughters or sons. These findings suggest that parents with imprisoned sons or daughters suffer the impact of secondary prisonization on their mental health. 
When comparing parents whose sons or daughters are incarcerated within 400 kilometers of home with parents whose sons or daughters are incarcerated farther than 400 kilometers from home, the latter present higher levels of psychopathology, but also higher levels of positive mental health (although the difference between the two groups is not statistically significant). These findings might be explained by resilience: in traumatic situations, people can develop the strength to cope and may even show posttraumatic growth. Bearing all these findings in mind, it can be concluded that secondary prisonization causes suffering for elderly parents whose sons or daughters are incarcerated in remote jails and, consequently, that changes in the penitentiary policy applied to Basque prisoners are required to end this suffering.

Keywords: automatic spontaneous speech analysis, elderly parents, machine learning, positive mental health, psychopathology, remote incarceration, secondary prisonization

Procedia PDF Downloads 264
1296 Standardized Testing of Filter Systems regarding Their Separation Efficiency in Terms of Allergenic Particles and Airborne Germs

Authors: Johannes Mertl

Abstract:

Our surrounding air contains various particles. Besides typical representatives of inorganic dust, such as soot and ash, particles originating from animals, microorganisms, or plants also float through the air: so-called bioaerosols. The group of bioaerosols consists of a broad spectrum of particles of different sizes, including fungi, bacteria, viruses, spores, and tree, flower, and grass pollen of high relevance to allergy sufferers. Depending on the environmental climate and the season, these allergenic particles can be found in enormous numbers in the air and are inhaled by humans via the respiratory tract, with a potential for inflammatory diseases of the airways, such as asthma or allergic rhinitis. As a consequence, air filter systems of ventilation and air conditioning devices are required to meet very high standards to prevent, or at least lower, the number of allergens and airborne germs entering the indoor air. Still, filter systems are merely classified for their separation rates using well-defined mineral test dust, while no appropriately standardized test methods for bioaerosols exist. However, separation rates determined for mineral test particles of a certain size cannot simply be transferred to bioaerosols, as the separation efficiency of particularly fine and respirable particles (< 10 microns) depends not only on their shape and particle diameter but also on their density and physicochemical properties. For this reason, the OFI developed a test method that enables direct testing of filters and filter media for their separation rates on bioaerosols, as well as a classification of filters. Besides allergens from intact or fractured tree or grass pollen, allergenic proteins bound to particulates, as well as allergenic fungal spores (e.g. Cladosporium cladosporioides) or bacteria, can be used to classify filters regarding their separation rates. 
Allergens passing through the filter can then be detected by highly sensitive immunological assays (ELISA) or, in the case of fungal spores, by microbiological methods, which allow the detection of even a single spore passing the filter. The test procedure, which is carried out at laboratory scale, was furthermore validated for its ability to cover real-life situations by upscaling to air conditioning devices, which showed strong conformity in terms of separation rates. Additionally, a clinical study with allergy sufferers was performed to verify the analytical results. Several different air conditioning filters from the car industry have been tested, showing significant differences in their separation rates.
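The separation rate itself reduces to a simple count ratio. A minimal sketch, with invented upstream/downstream particle counts standing in for the assay read-outs described above:

```python
def separation_efficiency(upstream, downstream):
    """Fractional separation efficiency from particle (or allergen) counts
    measured before and after the filter."""
    if upstream <= 0:
        raise ValueError("upstream count must be positive")
    return 1.0 - downstream / upstream

# Invented counts of allergen-laden particles per cubic metre of test air
eff = separation_efficiency(upstream=120_000, downstream=1_800)
print(f"{eff:.1%}")  # → 98.5%
```

In practice, the downstream term would come from an ELISA signal or a spore count per test volume; the single-spore sensitivity mentioned above matters because it sets the smallest downstream count the method can resolve.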

Keywords: airborne germs, allergens, classification of filters, fine dust

Procedia PDF Downloads 241
1295 The Relationship between Violence against Women and Levels of Self-Esteem in Urban Informal Settlements of Mumbai, India: A Cross-Sectional Study

Authors: A. Bentley, A. Prost, N. Daruwalla, D. Osrin

Abstract:

Background: This study aims to investigate the relationship between experiences of violence against women in the family, and levels of self-esteem in women residing in informal settlement (slum) areas of Mumbai, India. The authors hypothesise that violence against women in Indian households extends beyond that of intimate partner violence (IPV), to include other members of the family and that experiences of violence are associated with lower levels of self-esteem. Methods: Experiences of violence were assessed through a cross-sectional survey of 598 women, including questions about specific acts of emotional, economic, physical and sexual violence across different time points, and the main perpetrator of each. Self-esteem was assessed using the Rosenberg self-esteem questionnaire. A global score for self-esteem was calculated and the relationship between violence in the past year and Rosenberg self-esteem score was assessed using multivariable linear regression models, adjusted for years of education completed, and clustering using robust standard errors. Results: 482 (81%) women consented to interview. On average, they were 28.5 years old, had completed 6 years of education and had been married 9.5 years. 88% were Muslim and 46% lived in joint families. 44% of women had experienced at least one act of violence in their lifetime (33% emotional, 22% economic, 24% physical, 12% sexual). Of the women who experienced violence after marriage, 70% cited a perpetrator other than the husband for at least one of the acts. 5% had low self-esteem (Rosenberg score < 15). For women who experienced emotional violence in the past year, the Rosenberg score was 2.6 points lower (p < 0.001). It was 1.2 points lower (p = 0.03) for women who experienced economic violence. For physical or sexual violence in the past year, no statistically significant relationship with Rosenberg score was seen. 
However, for a one-unit increase in the number of different acts of each type of violence experienced in the past year, a decrease in Rosenberg score was seen (-0.62 for emotional, -0.76 for economic, -0.53 for physical and -0.47 for sexual; p < 0.05 for all). Discussion: The high prevalence of lifetime experiences of violence was likely due to the detailed assessment of violence and the inclusion of perpetrators within the family other than the husband. Experiences of emotional or economic violence in the past year were associated with lower Rosenberg scores and therefore lower self-esteem, but no relationship was seen between experiences of physical or sexual violence and Rosenberg score overall. For all types of violence in the past year, a greater number of different acts was associated with a decrease in Rosenberg score. Emotional violence showed the strongest relationship with self-esteem, but for all types of violence, the more complex the pattern of perpetration, with different methods used, the lower the levels of self-esteem. Due to the cross-sectional nature of the study, causal directionality cannot be attributed. Further work to investigate the relationship between the severity of violence and self-esteem, and whether self-esteem mediates relationships between violence and poorer mental health, would be beneficial.
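The adjusted associations reported above come from a multivariable linear model of Rosenberg score on violence exposure, adjusted for education. A minimal ordinary-least-squares sketch with invented data (the study additionally used cluster-robust standard errors, which are omitted here):

```python
import numpy as np

# Invented data: Rosenberg score, emotional violence in the past year (1 = yes),
# and completed years of education as the adjustment covariate.
rosenberg = np.array([22, 14, 25, 17, 20, 12, 24, 15, 21, 13], float)
emot_viol = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1], float)
education = np.array([8, 6, 10, 7, 9, 5, 11, 6, 8, 4], float)

# Multivariable linear model: score ~ intercept + violence + education
X = np.column_stack([np.ones_like(emot_viol), emot_viol, education])
beta, *_ = np.linalg.lstsq(X, rosenberg, rcond=None)

# beta[1] is the education-adjusted difference in Rosenberg score
# for women reporting emotional violence in the past year
print(round(beta[1], 2))
```

In the real analysis, the coefficient's standard error would be recomputed with robust standard errors clustered on the sampling unit before interpreting the p-value; plain OLS is shown only to illustrate the model structure.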

Keywords: family violence, India, informal settlements, Rosenberg self-esteem scale, self-esteem, violence against women

Procedia PDF Downloads 114
1294 Chemical Fabrication of Gold Nanorings: Controlled Reduction and Optical Tuning for Nanomedicine Applications

Authors: Mehrnaz Mostafavi, Jalaledin Ghanavi

Abstract:

This research investigates the production of gold nanoring structures through a chemical reduction approach, exploring gradual reduction processes assisted by reductant agents that lead to the formation of these specialized nanorings. The study focuses on the controlled reduction of metal atoms within these agents, which is crucial for shaping the nanoring structures over time. The paper commences by highlighting the wide-ranging applications of metal nanostructures across fields like nanomedicine, nanobiotechnology, and advanced spectroscopy methods such as Surface Enhanced Raman Spectroscopy (SERS) and Surface Enhanced Infrared Absorption Spectroscopy (SEIRA). In particular, gold nanoparticles, especially in the nanoring configuration, have gained significant attention due to their distinctive properties, offering accessible spaces suitable for sensing and spectroscopic applications. The methodology involves utilizing human serum albumin as a reducing agent to create gold nanoparticles through a chemical reduction process. This process involves the transfer of electrons from albumin's carboxylic groups, converting them into carbonyl groups, while AuCl4− acquires electrons to form gold nanoparticles. Various characterization techniques, including ultraviolet-visible spectroscopy (UV-Vis), atomic force microscopy (AFM), and transmission electron microscopy (TEM), were employed to examine and validate the creation and properties of the gold nanoparticles and nanorings. The findings suggest that precise and gradual reduction processes, in conjunction with optimal pH conditions, play a pivotal role in generating nanoring structures. Experiments manipulating optical properties revealed distinct responses in the visible and infrared spectra, demonstrating the tunability of these nanorings. Detailed examinations of the morphology confirmed the formation of gold nanorings, elucidating their size, distribution, and structural characteristics. 
These nanorings, characterized by an empty volume enclosed by uniform walls, exhibit promising potential in the realms of Nanomedicine and Nanobiotechnology. In summary, this study presents a chemical synthesis approach using organic reducing agents to produce gold nanorings. The results underscore the significance of controlled and gradual reduction processes in crafting nanoring structures with unique optical traits, offering considerable value across diverse nanotechnological applications.

Keywords: nanoring structures, chemical reduction approach, gold nanoparticles, spectroscopy methods, nanomedicine applications

Procedia PDF Downloads 94