Exploring Attachment Mechanisms of Sulfate-Reducing Bacteria Biofilm to X52 Carbon Steel and Effective Mitigation Through Moringa Oleifera Extract
Authors: Hadjer Didouh, Mohammed Hadj Melliani, Izzeddine Sameut Bouhaik
Abstract:
Corrosion is a serious problem in industrial installations and metallic transport pipes. Corrosion is an interfacial process controlled by several parameters, and the presence of microorganisms affects its kinetics. This type of corrosion is often referred to as bio-corrosion or microbiologically influenced corrosion (MIC). A microorganism or bacterium acts through the formation of a biofilm following its attachment to the metal surface. The biofilm isolates the metal surface from its environment and allows the bacteria to control the parameters of the metal/bacteria interface. Biofilm formation by sulfate-reducing bacteria (SRB) on X52 steel poses substantial challenges in the Algerian oil and gas industry (SONATRACH). This research delves into the complex attachment mechanisms employed by SRB biofilm on X52 carbon steel and investigates innovative strategies for effective mitigation using biocides. The exploration commences by elucidating the mechanisms underlying SRB biofilm adhesion to X52 carbon steel, considering factors such as surface morphology, electrostatic interactions, and microbial extracellular substances. Advanced microscopy and spectroscopic techniques provide insight into the attachment processes, laying the foundation for targeted mitigation strategies. The use of 100 ppm of Moringa oleifera extract is then evaluated as a promising biocide approach to control and prevent SRB biofilm formation on X52 carbon steel surfaces. Green extracts undergo evaluation for their effectiveness in disrupting biofilm development while ensuring the integrity of the steel substrate. Systematic analysis is conducted on the biocide’s impact on the biofilm’s structural integrity, microbial viability, and overall attachment strength.
This two-pronged investigation aims to deepen our comprehension of SRB biofilm dynamics and contribute to the development of effective strategies for mitigating its impact on X52 carbon steel.
Keywords: attachment, bio-corrosion, biofilm, metal/bacteria interface
Procedia PDF Downloads 76
In vitro Evaluation of Prebiotic Potential of Wheat Germ
Authors: Lígia Pimentel, Miguel Pereira, Manuela Pintado
Abstract:
Wheat germ is a by-product of wheat flour refining. Although this by-product is a source of proteins, lipids, fibres and complex carbohydrates, and consequently a valuable ingredient for the food industry, only a few applications have been studied. The main goal of this study was to assess the potential prebiotic effect of natural wheat germ. The prebiotic potential was evaluated by in vitro assays with individual microbial strains (Lactobacillus paracasei L26 and Lactobacillus casei L431). A simulated model of gastrointestinal digestion was also used, including the conditions present in the mouth (artificial saliva), oesophagus–stomach (artificial gastric juice), duodenum (artificial intestinal juice) and ileum. The effect of natural wheat germ and wheat germ after digestion on the growth of lactic acid bacteria was studied by growing those microorganisms in de Man, Rogosa and Sharpe (MRS) broth (with 2% wheat germ and 1% wheat germ after digestion) and incubating at 37 ºC for 48 h with stirring. A negative control consisting of MRS broth without glucose was used, and the substrate was also compared to a commercial prebiotic, fructooligosaccharides (FOS). Samples were taken at 0, 3, 6, 9, 12, 24 and 48 h for bacterial cell counts (CFU/mL) and pH measurement. The results showed that wheat germ has a stimulatory effect on the bacteria tested, with results similar to (or even higher than) those of FOS when compared to the culture medium without glucose. This was demonstrated by the viable cell counts and also by the decrease in the medium pH. Both L. paracasei L26 and L. casei L431 could use these compounds as a substitute for glucose with an enhancement of growth. In conclusion, we have shown that wheat germ stimulates the growth of probiotic lactic acid bacteria. In order to understand whether the composition of gut bacteria is altered and whether wheat germ could be used as a potential prebiotic, further studies including faecal fermentations should be carried out.
Nevertheless, wheat germ seems to have the potential to be a valuable compound for use in the food industry, mainly in the bakery industry.
Keywords: by-products, functional ingredients, prebiotic potential, wheat germ
Procedia PDF Downloads 491
Biochemical Characterization of CTX-M-15 from Enterobacter cloacae and Designing a Novel Non-β-Lactam-β-Lactamase Inhibitor
Authors: Mohammad Faheem, M. Tabish Rehman, Mohd Danishuddin, Asad U. Khan
Abstract:
The worldwide dissemination of CTX-M type β-lactamases is a threat to human health. Previously, we have reported the spread of the blaCTX-M-15 gene in different clinical strains of Enterobacteriaceae from the hospital settings of Aligarh in north India. In view of the varying resistance pattern against cephalosporins and other β-lactam antibiotics, we intended to understand the correlation between MICs and the catalytic activity of CTX-M-15. In this study, steady-state kinetic parameters and MICs were determined on E. coli DH5α transformed with the blaCTX-M-15 gene, which was cloned from an Enterobacter cloacae (EC-15) strain of clinical background. The effect of conventional β-lactamase inhibitors (clavulanic acid, sulbactam and tazobactam) on CTX-M-15 was also studied. We found that tazobactam is the best of these inhibitors against CTX-M-15. The inhibition characteristic of tazobactam is defined by its very low IC50 value (6 nM), high affinity (Ki = 0.017 µM) and better acylation efficiency (k+2/K′ = 0.44 µM⁻¹s⁻¹). It forms an acyl-enzyme covalent complex, which is quite stable (k+3 = 0.0057 s⁻¹). Since increasing resistance has been reported against conventional β-lactam antibiotic-inhibitor combinations, we aspire to design a β-lactamase inhibitor with a non-β-lactam core. For this, we screened the ZINC database and performed molecular docking to identify a potential non-β-lactam-based inhibitor (ZINC03787097). The MICs of cephalosporin antibiotics in combination with this inhibitor gave promising results. Steady-state kinetics and molecular docking studies showed that ZINC03787097 is a reversible inhibitor that binds non-covalently to the active site of the enzyme through hydrogen bonds and hydrophobic interactions. Although its IC50 (180 nM) is much higher than that of tazobactam, it has good affinity for CTX-M-15 (Ki = 0.388 µM).
This study concludes that the ZINC03787097 compound can be used as a seed molecule to design a more efficient non-β-lactam β-lactamase inhibitor that could evade pre-existing bacterial resistance mechanisms.
Keywords: ESBL, non-β-lactam β-lactamase inhibitor, bioinformatics, biomedicine
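As an aside on how an IC50 relates to the Ki values reported above: for a competitive inhibitor, the two are linked by the Cheng–Prusoff correction, Ki = IC50 / (1 + [S]/Km). The sketch below is illustrative only; the substrate concentration and Km are hypothetical assumptions, not values from this study.

```python
def ki_from_ic50(ic50, substrate_conc, km):
    """Cheng-Prusoff relation for a competitive inhibitor:
    Ki = IC50 / (1 + [S]/Km). Units of ic50 carry through to Ki."""
    return ic50 / (1.0 + substrate_conc / km)

# Hypothetical assay conditions: [S] = 100 uM substrate, Km = 50 uM.
# The IC50 of 180 nM is the value reported for ZINC03787097.
ki = ki_from_ic50(180.0, 100.0, 50.0)
print(ki)  # 60.0 nM under these assumed conditions
```

Note that when [S] is well below Km, the correction factor approaches 1 and IC50 approximates Ki directly.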
Procedia PDF Downloads 241
Linguistic Analysis of Holy Scriptures: A Comparative Study of Islamic Jurisprudence and the Western Hermeneutical Tradition
Authors: Sana Ammad
Abstract:
The traditions of linguistic analysis in Islam and Christianity have developed independently of each other, in view of the social developments specific to their historical contexts. Recently, however, an increasing number of Muslim academics educated in the West have tried to apply the Western tradition of linguistic interpretation to the Qur’anic text while completely disregarding the Islamic linguistic tradition used and developed by traditional scholars over the centuries. The aim of the paper is to outline the linguistic tools and methods used by traditional Islamic scholars for interpreting the Holy Qur’an and to shed light on how they contribute to a better understanding of the text compared to their Western counterparts. This paper carries out a descriptive-comparative study of the linguistic tools developed and perfected by traditional scholars in Islam for the textual analysis of the Qur’an, as described in the authentic works of Usul Al Fiqh (jurisprudence), and the principles of textual analysis employed by the Western hermeneutical tradition for the study of the Bible. First, it briefly outlines the independent historical development of the two traditions, emphasizing the final normative shape that they have taken. Then it draws a comparison of the two traditions, highlighting the similarities and differences between them. In the end, the paper demonstrates the level of academic excellence achieved by the traditional linguistic scholars in their efforts to develop appropriate tools of textual interpretation and shows how these tools are more suitable for interpreting the Qur’an than the Western principles. Since the aim of interpreters in both traditions is to attain an objective understanding of the Scriptures, the emphasis of the paper shall be to highlight how well the Islamic method of linguistic interpretation contributes to an objective understanding of the Qur’anic text.
The paper concludes with the following findings: the Western hermeneutical tradition of linguistic analysis developed within the Western historical context, whereas the Islamic method of linguistic analysis is much more highly developed and complex and better serves the purpose of an objective understanding of the Holy text.
Keywords: Islamic jurisprudence, linguistic analysis, textual interpretation, western hermeneutics
Procedia PDF Downloads 335
Effect of Print Orientation on the Mechanical Properties of Multi Jet Fusion Additively Manufactured Polyamide-12
Authors: Tyler Palma, Praveen Damasus, Michael Munther, Mehrdad Mohsenizadeh, Keivan Davami
Abstract:
The advancement of additive manufacturing, in both research and commercial realms, is highly dependent upon continuing innovations and creativity in materials and designs. Additive manufacturing shows great promise towards revolutionizing various industries, due largely to the fact that design data can be used to create complex products and components, on demand and from the raw materials, for the end user at the point of use. However, it is critical that the material properties of additively made parts for engineering purposes be fully understood. As Multi Jet Fusion (MJF) is a relatively new additive manufacturing method, the response of the properties of MJF-produced parts to different printing parameters has not been well studied. In this work, testing of the mechanical and tribological properties of MJF-printed Polyamide 12 parts was performed to determine whether printing orientation in this method results in significantly different part performance. Material properties were studied at the macro- and nanoscales. Tensile tests, in combination with tribology tests including steady-state wear, were performed. Results showed a significant difference in part characteristics based on whether they were printed in a vertical or horizontal orientation. The tensile performance of vertically and horizontally printed samples varied, both in ultimate strength and strain. Tribology tests showed that printing orientation has notable effects on the resulting mechanical and wear properties of tested surfaces, due largely to layer orientation and the presence of unfused powder grain inclusions. This research advances the understanding of how print orientation affects the mechanical properties of additively manufactured structures, and also how print orientation can be exploited in future engineering design.
Keywords: additive manufacturing, indentation, nano mechanical characterization, print orientation
Procedia PDF Downloads 144
Educational Engineering Tool on Smartphone
Authors: Maya Saade, Rafic Younes, Pascal Lafon
Abstract:
This paper explores the transformative impact of smartphones on pedagogy and presents a smartphone application developed specifically for engineering problem-solving and educational purposes. The widespread availability and advanced capabilities of smartphones have revolutionized the way we interact with technology, including in education. The ubiquity of smartphones allows learners to access educational resources anytime and anywhere, promoting personalized and self-directed learning. The first part of this paper discusses the overall influence of smartphones on pedagogy, emphasizing their potential to improve learning experiences through mobile technology. In the context of engineering education, this paper focuses on the development of a dedicated smartphone application that serves as a powerful tool for both engineering problem-solving and education. The application features an intuitive and user-friendly interface, allowing engineering students and professionals to perform complex calculations and analyses on their smartphones. The smartphone application primarily focuses on beam calculations and serves as a comprehensive beam calculator tailored to engineering education. It caters to various engineering disciplines by offering interactive modules that allow students to learn key concepts through hands-on activities and simulations. With a primary emphasis on beam analysis, this application empowers users to perform calculations for statically determinate beams, statically indeterminate beams, and beam buckling phenomena. Furthermore, the app includes a comprehensive library of engineering formulas and reference materials, facilitating a deeper understanding and practical application of the fundamental principles in beam analysis. 
By offering a wide range of features specifically tailored to beam calculation, this application provides an invaluable tool for engineering students and professionals looking to enhance their understanding and proficiency in this crucial aspect of structural engineering.
Keywords: mobile devices in education, solving engineering problems, smartphone application, engineering education
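To illustrate the kind of calculation such a beam calculator performs for statically determinate cases, the maximum mid-span deflection of a simply supported beam under a uniformly distributed load is given by the standard formula δ = 5wL⁴/(384EI). The sketch below uses illustrative values (a hypothetical steel beam), not data from the paper, and is not the app's actual implementation.

```python
def max_deflection_simply_supported(w, L, E, I):
    """Maximum mid-span deflection of a simply supported beam under
    a uniformly distributed load w (force per unit length):
    delta_max = 5 * w * L**4 / (384 * E * I)."""
    return 5.0 * w * L**4 / (384.0 * E * I)

# Illustrative values in SI units (assumed, for demonstration only):
w = 10e3     # 10 kN/m distributed load
L = 6.0      # 6 m span
E = 210e9    # Young's modulus of steel, Pa
I = 8.0e-5   # second moment of area, m^4
delta = max_deflection_simply_supported(w, L, E, I)
print(f"{delta * 1000:.2f} mm")  # mid-span deflection, prints "10.04 mm"
```

Statically indeterminate beams and buckling, also covered by the app, require different closed-form solutions or numerical methods.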
Procedia PDF Downloads 67
A Collaborative Problem Driven Approach to Design an HR Analytics Application
Authors: L. Atif, C. Rosenthal-Sabroux, M. Grundstein
Abstract:
The requirements engineering process is a crucial phase in the design of complex systems. The purpose of our research is to present a collaborative, problem-driven requirements engineering approach that aims at improving the design of a Decision Support System (DSS) as an analytics application. This approach has been adopted to design a human resource management DSS. The requirements engineering process is presented as a series of guidelines for activities that must be implemented to ensure that the final product satisfies end-users’ requirements and takes into account the limitations identified. For this, we know that a well-posed statement of the problem is “a problem whose crucial character arises from collectively produced estimation and a formulation found to be acceptable by all the parties”. Moreover, we know that DSSs were developed to help decision-makers solve their unstructured problems. We thus base our research on the assumption that developing a DSS, particularly for helping poorly structured or unstructured decisions, cannot be done without considering end-user decision problems, how to represent them collectively, the content of decisions, their meaning, and the decision-making process; the design issues thus arise in a multidisciplinary perspective. Our approach is a problem-driven and collaborative approach to designing DSS technologies: common end-user problems are reflected in the upstream design phase, and in the downstream phase these problems determine the design choices and the potential technical solution. We thus rely on a categorization of HR problems for a development mirroring the analytics solution. This brings out a data-driven DSS typology: descriptive analytics, explicative or diagnostic analytics, predictive analytics, and prescriptive analytics.
In our research, identifying the problem takes place together with the design of the solution, so we must resort to significant transformations of the representations associated with the HR analytics application to build an increasingly detailed representation of the goal to be achieved. Here, the collective cognition is reflected in the establishment of transfer functions of representations throughout the design process.
Keywords: DSS, collaborative design, problem-driven requirements, analytics application, HR decision making
Procedia PDF Downloads 297
Identification of microRNAs in Early and Late Onset of Parkinson’s Disease Patient
Authors: Ahmad Rasyadan Arshad, A. Rahman A. Jamal, N. Mohamed Ibrahim, Nor Azian Abdul Murad
Abstract:
Introduction: Parkinson’s disease (PD) is a complex and initially asymptomatic disease in which patients are usually diagnosed at a late stage, when about 70% of the dopaminergic neurons are lost. Therefore, identification of molecular biomarkers is crucial for the early diagnosis of PD. MicroRNAs (miRNAs) are short non-coding RNAs which regulate gene expression post-transcriptionally. The involvement of these miRNAs in neurodegenerative diseases includes maintenance of neuronal development, necrosis, mitochondrial dysfunction and oxidative stress. Thus, miRNAs could be potential biomarkers for the diagnosis of PD. Objective: This study aims to identify the miRNAs involved in Late Onset PD (LOPD) and Early Onset PD (EOPD) compared to controls. Methods: This is a case-control study involving PD patients at the Chancellor Tunku Muhriz Hospital at the UKM Medical Centre. miRNA samples were extracted using the miRNeasy serum/plasma kit from Qiagen. The quality of the extracted miRNA was determined using the Agilent RNA 6000 Nano kit in the Bioanalyzer. miRNA expression profiling was performed using the GeneChip miRNA 4.0 array from Affymetrix. Microarray was performed in EOPD (n = 7), LOPD (n = 9) and healthy controls (n = 11). Expression Console and Transcriptome Analysis Console were used to analyze the microarray data. Results: miR-129-5p was significantly downregulated in EOPD compared to LOPD with a -4.2 fold change (p < 0.05). miR-301a-3p was upregulated in EOPD compared to healthy controls (fold = 10.3, p < 0.05). In LOPD versus healthy controls, miR-486-3p (fold = 15.28, p < 0.05), miR-29c-3p (fold = 12.21, p < 0.05) and miR-301a-3p (fold = 10.01, p < 0.05) were upregulated. Conclusion: Several miRNAs have been identified as differentially expressed in EOPD compared to LOPD and in PD versus controls. These miRNAs could serve as potential biomarkers for the early diagnosis of PD.
However, these miRNAs need to be validated in a larger sample size.
Keywords: early onset PD, late onset PD, microRNA (miRNA), microarray
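The negative fold changes reported above (e.g. -4.2 for miR-129-5p) follow the common signed-fold-change convention in microarray analysis tools, where an expression ratio below 1 is reported as the negative reciprocal. A minimal sketch of that convention (assuming this is the convention used; the paper does not state it explicitly):

```python
def signed_fold_change(ratio):
    """Convert an expression ratio (case/control) to a signed fold
    change: ratios >= 1 are reported as-is, ratios < 1 as the
    negative reciprocal (so a ratio of 0.25 is reported as -4.0)."""
    if ratio <= 0:
        raise ValueError("expression ratio must be positive")
    return ratio if ratio >= 1 else -1.0 / ratio

print(signed_fold_change(10.3))   # upregulation, reported as 10.3
print(signed_fold_change(0.25))   # downregulation, reported as -4.0
```

Under this convention a -4.2 fold change means the miRNA is expressed at roughly 1/4.2 of the comparison group's level.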
Procedia PDF Downloads 261
Dosimetric Dependence on the Collimator Angle in Prostate Volumetric Modulated Arc Therapy
Authors: Muhammad Isa Khan, Jalil Ur Rehman, Muhammad Afzal Khan Rao, James Chow
Abstract:
Purpose: This study investigates the dose-volume variations in the planning target volume (PTV) and organs-at-risk (OARs) using different collimator angles for SmartArc prostate volumetric modulated arc therapy (VMAT). Awareness of the collimator angle for PTV and OAR sparing is essential for the planner, because optimization contains numerous treatment constraints, producing a complex, unstable and computationally challenging problem throughout its search for an optimal plan in a reasonable time. Materials and Methods: Single-arc VMAT plans at different collimator angles, varied systematically (0°-90°), were performed on a Harold phantom, and a new treatment plan was optimized for each collimator angle. We analyzed the conformity index (CI), homogeneity index (HI), gradient index (GI), monitor units (MUs), dose-volume histogram, and mean and maximum doses to the PTV. We also explored OARs (e.g. bladder, rectum and femoral heads), dose-volume criteria in the treatment plan (e.g. D30%, D50%, V30Gy and V38Gy of bladder and rectum; D5%, V14Gy and V22Gy of femoral heads), dose-volume histogram, and mean and maximum doses for SmartArc VMAT at different collimator angles. Results: There was no significant difference found in VMAT optimization at any of the studied collimator angles. However, if 0.5% accuracy is concerned, then a collimator angle of 45° provides a higher CI and a lower HI. A collimator angle of 15° also provides lower HI values, like the 45° angle. A collimator angle of 75° was established as a good choice for rectum and right femur sparing. Collimator angles of 90° and 30° were found to be good for rectum and left femur sparing, respectively. The PTV dose coverage statistics for each plan are comparatively independent of the collimator angle.
Conclusion: It is concluded that this study will give the planner the freedom to choose any collimator angle from 0°-90° for PTV coverage and to select a suitable collimator angle to spare OARs.
Keywords: VMAT, dose-volume histogram, collimator angle, organs-at-risk
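The homogeneity and conformity indices compared across collimator angles can be defined in several ways; one common pair of definitions (the ICRU Report 83 homogeneity index and a simple PTV-coverage conformity index — the paper's exact definitions are not stated here, so these are illustrative assumptions) is sketched below.

```python
def homogeneity_index(d2, d98, d50):
    """ICRU Report 83 homogeneity index: HI = (D2% - D98%) / D50%.
    Lower values indicate a more homogeneous PTV dose."""
    return (d2 - d98) / d50

def conformity_index(v_ref_in_ptv, ptv_volume):
    """A simple conformity index: fraction of the PTV covered by the
    reference isodose. Values closer to 1 indicate better conformity."""
    return v_ref_in_ptv / ptv_volume

# Illustrative dose-volume numbers (Gy and cm^3), not from the paper.
hi = homogeneity_index(d2=81.0, d98=74.0, d50=78.0)
ci = conformity_index(v_ref_in_ptv=95.0, ptv_volume=100.0)
print(round(hi, 3), ci)  # 0.09 0.95
```

In practice D2%, D98% and D50% are read off the cumulative dose-volume histogram of each optimized plan.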
Procedia PDF Downloads 514
A Pragmatic Approach of Memes Created in Relation to the COVID-19 Pandemic
Authors: Alexandra-Monica Toma
Abstract:
Internet memes are an element of computer-mediated communication and an important part of online culture that combines text and image in order to generate meaning. This term, coined by Richard Dawkins, refers to more than a mere way to briefly communicate ideas or emotions, naming a complex and intensely perpetuated phenomenon in the virtual environment. This paper approaches memes as a cultural artefact and a virtual trope that mirrors societal concerns and issues, and analyses the pragmatics of their use. Memes have to be analysed in series, usually relating to some image macros, which is proof of the interplay between imitation and creativity in the meme-writing process. We believe that their potential to become viral relates to three key elements: adaptation to context, reference to a successful meme series, and humour (jokes, irony, sarcasm), with various pragmatic functions. The study also uses the concept of multimodality and stresses how the memes’ text interacts with the image, discussing three types of relations: symmetry, amplification, and contradiction. Moreover, the paper proves that memes can be employed as speech acts with illocutionary force when the interaction between text and image is enriched through the connection to a specific situation. The features mentioned above are analysed in a corpus that consists of memes related to the COVID-19 pandemic. This corpus shows them to be highly adaptable to context, which helps build a feeling of connection and belonging in an otherwise tremendously fragmented world. Some of them are created based on well-known image macros, and their humour results from an intricate dialogue between texts and contexts. Memes created in relation to the COVID-19 pandemic can be considered speech acts and are often used as such, as proven in the paper.
Consequently, this paper tackles the key features of memes, makes a thorough analysis of the memes’ sociocultural, linguistic, and situational context, and emphasizes their intertextuality, with special accent on their illocutionary potential.
Keywords: context, memes, multimodality, speech acts
Procedia PDF Downloads 206
Congenital Heart Defect (CHD), “The Silent Crises”: The Need for New Innovative Ways to Save the Ghanaian Child - A Retrospective Study
Authors: Priscilla Akua Agyapong
Abstract:
Background: In a country of nearly 34 million people, Ghana suffers from rapidly growing pediatric CHD caseloads and not enough pediatric specialists to attend to the burgeoning needs of these children. Most of the cases are either missed or diagnosed late, resulting in increased mortality. According to the National Cardiothoracic Centre, 1 in every 100,000 births in Ghana has CHD; however, there is limited data on the clinical presentation and its management, one of the many reasons I decided to do this case study, coupled with the loss of my 2-month-old niece to multiple ventricular septal defects 3 years ago due to late diagnosis. Method: A retrospective cohort study was performed at the child health clinic of one of Ghana’s public tertiary institutions using data from its electronic health record (EHR) from February 2021 to April 2022. All suspected or provisionally diagnosed cases were included in the analysis. Results: Records of over 3000 children were reviewed, with an approximate male-to-female ratio of 1:1. 53 cases were diagnosed during the period of study, most of whom were less than 5 years of age. 25 cases had complete clinical records, with acyanotic septal defects being the most diagnosed. 62.5% of the cases were ventricular septal defects, followed by patent ductus arteriosus (23%) and atrial septal defects (4.5%). Tetralogy of Fallot was the most predominant and complex cyanotic CHD, at 10%. Conclusion: The indeterminate coronary anatomy of infants makes it difficult to use only echocardiography and other conventional clinical methods in screening for CHDs. There are rising modernizations and new innovative ways that can be employed in Ghana for early detection, hence preventing the delay of a potential surgical repair.
It is, therefore, imperative to create the needed awareness about these “SILENT CRISES” and help save the Ghanaian child’s life.
Keywords: congenital heart defect (CHD), ventricular septal defect (VSD), atrial septal defect (ASD), patent ductus arteriosus (PDA)
Procedia PDF Downloads 90
Using Machine Learning to Classify Different Body Parts and Determine Healthiness
Authors: Zachary Pan
Abstract:
Our general mission is to solve the problem of classifying images into different body part types and deciding whether each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest: we will detect pneumonia in X-ray scans of those chest images. Doctors can use this type of AI as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split it into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Then, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply complex algorithms to the models, like multiplicative weight update. For the second part of the problem, determining whether the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split it into test and training sets. We then train another neural network on those training set images and use the testing set to measure its accuracy. We will do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification.
In classifying the images, the logistic regression model, the neural network, the neural network with multiplicative weight update, the neural network with the black box algorithm, and the convolutional neural network achieved 96.83 percent, 97.33 percent, 97.83 percent, 96.67 percent, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines whether the images are healthy or not is around 78.37 percent.
Keywords: body part, healthcare, machine learning, neural networks
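The train/test workflow described above — fit a model on a training split, then report accuracy on held-out images — can be sketched with a toy nearest-centroid classifier in plain Python. The dataset and classifier here are illustrative stand-ins; the paper's actual models are logistic regression and (convolutional) neural networks trained on image data.

```python
def nearest_centroid_fit(X, y):
    """Compute one centroid (feature-wise mean) per class label."""
    centroids = {}
    for label in set(y):
        pts = [x for x, l in zip(X, y) if l == label]
        centroids[label] = [sum(col) / len(pts) for col in zip(*pts)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Assign x to the class whose centroid is closest (squared Euclidean)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(centroids[c], x)))

# Toy 2-D "features" for two body-part classes, split into train and test.
X_train = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
y_train = ["chest", "chest", "hand", "hand"]
X_test = [[0.1, 0.0], [1.1, 0.9]]
y_test = ["chest", "hand"]

model = nearest_centroid_fit(X_train, y_train)
preds = [nearest_centroid_predict(model, x) for x in X_test]
accuracy = sum(p == t for p, t in zip(preds, y_test)) / len(y_test)
print(accuracy)  # 1.0 on this trivially separable toy set
```

The key point the abstract makes holds regardless of model: accuracy must be measured on examples the model has never seen, or the estimate is optimistically biased.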
Procedia PDF Downloads 110
Polymer Mixing in the Cavity Transfer Mixer
Authors: Giovanna Grosso, Martien A. Hulsen, Arash Sarhangi Fard, Andrew Overend, Patrick. D. Anderson
Abstract:
In many industrial applications and, in particular, in the polymer industry, the quality of mixing between different materials is fundamental to guarantee the desired properties of finished products. However, properly modelling and understanding polymer mixing often presents noticeable difficulties because of the variety and complexity of the physical phenomena involved. This is the case of the Cavity Transfer Mixer (CTM), for which a clear understanding of the mixing mechanisms is still missing, as well as clear guidelines for system optimization. This device, invented and patented by Gale at Rapra Technology Limited, is an add-on to be mounted downstream of existing extruders in order to improve distributive mixing. It consists of two concentric cylinders, the rotor and stator, both provided with staggered rows of hemispherical cavities. The inner cylinder (rotor) rotates, while the outer (stator) remains still. At the same time, the pressure load imposed upstream pushes the fluid through the CTM. Mixing processes are driven by the flow field generated by the complex interaction between the moving geometry, the imposed pressure load and the rheology of the fluid. In such a context, the present work proposes a complete and accurate three-dimensional model of the CTM and the results of a broad range of simulations assessing the impact on mixing of several geometrical and operating parameters. Among them we find: the number of cavities per row, the number of rows, the size of the mixer, the rheology of the fluid and the ratio between the rotation speed and the fluid throughput. The model is composed of a flow part and a mixing part: a finite element solver computes the transient velocity field, which is then used in the mapping method implementation in order to simulate the evolution of the concentration field. The results of the simulations are summarized in guidelines for the device optimization.
Keywords: mixing, non-Newtonian fluids, polymers, rheology
Procedia PDF Downloads 381
Training for Digital Manufacturing: A Multilevel Teaching Model
Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia
Abstract:
The changes observed in recent years in the fields of manufacturing and production engineering, popularly known as the "Fourth Industrial Revolution", utilize the achievements of different areas of computer science, introducing new solutions at almost every stage of the production process; just to mention such concepts as mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, or virtual models of measuring systems. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting individual stages of the production process and to raise the awareness and knowledge of the employees of individual sectors about the nature and specificity of work at other stages. Discovering and developing a suitable education method adapted to the specificities of each stage of the production process has become a crucial issue for properly exploiting the potential of the fourth industrial revolution. Because of this, the project "Train4Dim" (T4D) intends to develop comprehensive training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. T4D's main objective is to develop a multi-degree (apprenticeship up to master's degree studies) educational approach that can be adapted to different teaching levels. The process of creating the underlying methodology is also described. The paper shares the steps taken to achieve the aims of the project (a training model for digital manufacturing): 1) surveying the stakeholders, 2) defining the learning aims, 3) producing all contents and curricula, 4) training the tutors, and 5) piloting courses, testing and improving them.
Keywords: learning, Industry 4.0, active learning, digital manufacturing
Procedia PDF Downloads 101
1672 Effects of Gamma-Tocotrienol Supplementation on T-Regulatory Cells in Syngeneic Mouse Model of Breast Cancer
Authors: S. Subramaniam, J. S. A. Rao, P. Ramdas, K. R. Selvaduray, N. M. Han, M. K. Kutty, A. K. Radhakrishnan
Abstract:
The immune system is a complex system in which immune cells can respond to a wide range of challenges, including cancer progression. However, during cancer development, tumour cells trigger an immunosuppressive environment via activation of myeloid-derived suppressor cells and T regulatory (Treg) cells. Treg cells are a subset of CD4+ T lymphocytes known to have crucial roles in regulating immune homeostasis and promoting the establishment and maintenance of peripheral tolerance. Dysregulation of these mechanisms can lead to cancer progression and immune suppression. Recently, many studies have reported on the effects of natural bioactive compounds on immune responses against cancer. The tocotrienol-rich fraction, consisting of 70% tocotrienols and 30% α-tocopherol, is known to exhibit immunomodulatory as well as anti-cancer properties. Hence, this study was designed to evaluate the effects of gamma-tocotrienol (G-T3) supplementation on Treg cells in a syngeneic mouse model of breast cancer. Female BALB/c mice were divided into two groups and fed either soy oil (vehicle) or G-T3 for two weeks, followed by inoculation with tumour cells. All mice continued to receive the same supplementation until day 49. The results showed a significant reduction in tumour volume and weight in G-T3-fed mice compared to vehicle-fed mice. Lung and liver histology showed reduced evidence of metastasis in tumour-bearing G-T3-fed mice. In addition, flow cytometry analysis revealed that the T-helper cell population was increased and the Treg cell population was suppressed following G-T3 supplementation. Moreover, immunohistochemistry analysis showed a marked decrease in the expression of FOXP3 in G-T3-fed tumour-bearing mice. In conclusion, G-T3 supplementation improved prognosis in breast cancer by enhancing the immune response in tumour-bearing mice.
Therefore, G-T3 could be used as an immunotherapy agent for the treatment of breast cancer.
Keywords: breast cancer, gamma tocotrienol, immune suppression, supplement
Procedia PDF Downloads 225
1671 Spatial Planning and Tourism Development with Sustainability Model of the Territorial Tourist with Land Use Approach
Authors: Mehrangiz Rezaee, Zabih Charrahi
Abstract:
In the last decade, with increasing numbers of tourism destinations and continuing tourism growth, we are witnessing the widespread impacts of tourism on the economy, environment and society. Tourism and its related economy are now undergoing a transformation, and as one of the key pillars of business economics, tourism plays a vital role in the world economy. Activities related to tourism, and the provision of services appropriate to it in an area, like many economic sectors, require the necessary context at their origin. Given the importance of the tourism industry and the tourism potential of Yazd province in Iran, it is necessary to use a proper procedure for prioritizing different areas for proper and efficient planning. One of the most important goals of planning is foresight and the creation of balanced development across different geographical areas. This process requires an accurate study of the areas and their potential and actual talents, as well as evaluation and understanding of the relationships between the indicators affecting the development of the region. At the global and regional levels, the development of tourist resorts and the proper distribution of tourism destinations are needed to counter environmental impacts and risks. The main objective of this study is the sustainable development of suitable tourism areas. Given that tourism activities in different territorial areas require operational zoning, this study evaluates territorial tourism using concepts such as land use, suitability and sustainable development. It is essential to understand the structure of tourism development and the spatial development of tourism using land use patterns, spatial planning and sustainable development. Tourism spatial planning implements different approaches. However, the development of tourism, as well as its spatial development, is complex, since tourist activities can be carried out in different areas with different purposes.
Multipurpose areas are of great importance for tourism because they determine the flow of tourists. Therefore, in this paper, by studying tourism development and suitability in relation to spatial development, tourism spatial development can be planned through a model that describes the characteristics of tourism. The results of this research determine the suitability of multi-functional territorial tourism development in line with the spatial planning of tourism.
Keywords: land use change, spatial planning, sustainability, territorial tourist, Yazd
Procedia PDF Downloads 184
1670 A Computational Framework for Decoding Hierarchical Interlocking Structures with SL Blocks
Authors: Yuxi Liu, Boris Belousov, Mehrzad Esmaeili Charkhab, Oliver Tessmann
Abstract:
This paper presents a computational solution for designing reconfigurable interlocking structures assembled entirely from SL blocks. Formed by S-shaped and L-shaped tetracubes, the SL block is a specific type of interlocking puzzle. Analogous to molecular self-assembly, the aggregation of SL blocks builds a reversible, hierarchical and discrete system in which a single module can be replicated many times to compose semi-interlocking components that further align, wrap, and braid around each other to form complex high-order aggregations. These aggregations can be disassembled and reassembled, responding dynamically to design inputs and changes with a unique capacity for reconfiguration. To use these aggregations as architectural structures, we developed computational tools that automate the configuration of SL blocks based on architectural design objectives. There are three critical phases in our work. First, we revisit the hierarchy of the SL block system and devise a top-down design strategy. From this, we pose two key questions: 1) How can 3D polyominoes be translated into SL block assemblies? 2) How can a desired voxelized shape be decomposed into a set of 3D polyominoes with interlocking joints? These two questions can be considered as instances of the Hamiltonian path problem and the 3D polyomino tiling problem, respectively. We then derive a solution to each of them based on two methods. The first method constructs an optimal closed path in an undirected graph built from the voxelized shape and translates the node sequence of the resulting path into the assembly sequence of SL blocks. The second method describes the interlocking relationships of 3D polyominoes as a joint connection graph. Lastly, we formulate the desired shapes and leverage our methods to achieve their reconfiguration at different levels.
We show that our computational strategy will facilitate the efficient design of hierarchical interlocking structures with a self-replicating geometric module.
Keywords: computational design, SL-blocks, 3D polyomino puzzle, combinatorial problem
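The first method described in the abstract, constructing a path through an undirected graph built from the voxelized shape, can be sketched as a Hamiltonian-path search. The following Python sketch is a hypothetical simplification: the face-adjacency graph construction, the backtracking search, and the 2×2×2 test shape are our illustrative assumptions, not the authors' implementation (the paper seeks an optimal closed path, whereas this sketch finds any open Hamiltonian path).

```python
# Hypothetical sketch: derive an assembly order for blocks by finding a
# Hamiltonian path over the face-adjacency graph of a voxelized shape.

def hamiltonian_path(adjacency):
    """Return one path visiting every node exactly once, or None."""
    nodes = list(adjacency)

    def extend(path, visited):
        if len(path) == len(nodes):
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in visited:
                result = extend(path + [nxt], visited | {nxt})
                if result:
                    return result
        return None

    for start in nodes:
        found = extend([start], {start})
        if found:
            return found
    return None

def voxel_graph(voxels):
    """Face-adjacency graph of a set of (x, y, z) unit voxels."""
    voxels = set(voxels)
    graph = {v: [] for v in voxels}
    for (x, y, z) in voxels:
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            neighbour = (x + dx, y + dy, z + dz)
            if neighbour in voxels:
                graph[(x, y, z)].append(neighbour)
    return graph

# Illustrative 2x2x2 cube of 8 voxels; a Hamiltonian path exists here.
shape = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
path = hamiltonian_path(voxel_graph(shape))
```

The node sequence of `path` would then play the role the abstract assigns to the closed path: each visited voxel maps to the next block placed in the assembly sequence.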
Procedia PDF Downloads 132
1669 Exploring the Interplay of Attention, Awareness, and Control: A Comprehensive Investigation
Authors: Venkateswar Pujari
Abstract:
This study investigates the complex interplay between attention, awareness, and control in human cognitive processes. Attention, awareness, and control are fundamental elements of cognitive functioning that play a significant role in shaping perception, decision-making, and behavior. Understanding how they interact can help us better understand how our minds work and may advance cognitive science and its clinical applications. The study uses an empirical methodology to examine the relationships between attention, awareness, and control by integrating different experimental paradigms and neuropsychological tests. To ensure the generalizability of the findings, a wide sample of participants is chosen, including people of various ages and cognitive profiles. The study is structured into four primary phases, each focusing on one component of how attention, awareness, and control interact: 1. Evaluation of Attentional Capacity and Selectivity: In this phase, participants complete established attention tests, including the Stroop task and visual search tasks. 2. Evaluation of Awareness Levels: In the second phase, participants' levels of conscious and unconscious awareness are assessed using perceptual awareness tasks such as masked priming and binocular rivalry. 3. Investigation of Cognitive Control Mechanisms: In the third phase, response inhibition, cognitive flexibility, and working memory capacity are investigated using tasks such as the Wisconsin Card Sorting Test and the Go/No-Go paradigm. 4. Integration and Analysis of Results: Data from all phases are integrated and analyzed in the final phase. Correlational and regression analyses are carried out to investigate potential links and predictive relationships between attention, awareness, and control. The study's conclusions shed light on the intricate relationships between attention, awareness, and control in cognitive function.
The findings have implications for cognitive psychology, neuroscience, and clinical psychology by providing new understanding of cognitive dysfunctions linked to deficits in attention, awareness, and control systems.
Keywords: attention, awareness, control, cognitive functioning, neuropsychological assessment
Procedia PDF Downloads 92
1668 Dynamic Modeling of Advanced Wastewater Treatment Plants Using BioWin
Authors: Komal Rathore, Aydin Sunol, Gita Iranipour, Luke Mulford
Abstract:
Advanced wastewater treatment plants have complex biological kinetics, time-variant influent flow rates and long processing times. These factors complicate the modeling and operational control of advanced wastewater treatment plants. However, the development of a robust model has become necessary in order to increase the efficiency of the plants, reduce energy costs and meet the discharge limits set by the government. A dynamic model was designed for several wastewater treatment plants in Hillsborough County, Florida, using BioWin, a simulation platform from EnviroSim (Canada). Proper control strategies for parameters such as mixed liquor suspended solids, recycle activated sludge and waste activated sludge were developed so that the models matched plant performance. The models were tuned using influent and effluent data from the plants and their laboratories. Plant SCADA data were used to predict the influent wastewater flow rates and concentration profiles as a function of time. The kinetic parameters were tuned based on sensitivity analysis and trial-and-error methods. The dynamic models were validated using experimental data for influent and effluent parameters. Dissolved oxygen measurements were taken to validate the model by coupling them with Computational Fluid Dynamics (CFD) models. The BioWin models closely reproduced plant performance and predicted effluent behavior over extended periods. The models are useful for plant engineers and operators, who can make decisions in advance by using them to predict plant performance. One of the important findings from the model was the effect of the recycle and wastage ratios on the mixed liquor suspended solids. The model was also useful in determining the significant kinetic parameters for biological wastewater treatment systems.
Keywords: BioWin, kinetic modeling, flowsheet simulation, dynamic modeling
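As a loose illustration of the kind of dynamic mass balance a simulator such as BioWin solves numerically, the Python sketch below integrates a single completely mixed aeration tank with Monod growth kinetics using explicit Euler. All parameter values and the single-tank structure are our own illustrative assumptions and are far simpler than the plant models described above; it does, however, reproduce the qualitative finding that the wastage rate (here expressed via sludge age) controls the mixed liquor suspended solids.

```python
# Illustrative single-tank activated sludge balance (not BioWin itself).
# State: substrate S and biomass X in a completely mixed reactor.

def simulate_mlss(srt_days, days=60.0, dt=0.001):
    Q, V = 10000.0, 2500.0           # flow (m3/d) and tank volume (m3)
    S_in = 300.0                     # influent substrate (g COD/m3)
    mu_max, Ks, Y = 4.0, 100.0, 0.5  # assumed Monod kinetics and yield
    S, X = S_in, 1000.0              # initial substrate and biomass (g/m3)
    t = 0.0
    while t < days:
        growth = mu_max * S / (Ks + S) * X      # biomass growth rate
        S += dt * (Q / V * (S_in - S) - growth / Y)
        X += dt * (growth - X / srt_days)       # wasting sets sludge age
        t += dt
    return X  # quasi-steady MLSS proxy (g/m3)

# Less wasting (longer sludge age) retains more biomass in the tank.
mlss_low_srt = simulate_mlss(srt_days=5.0)
mlss_high_srt = simulate_mlss(srt_days=15.0)
```

The small time step keeps the explicit Euler scheme stable for these stiff kinetics; production simulators use implicit solvers and far richer biokinetic models (multiple biomass fractions, nutrient states, and aeration dynamics).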
Procedia PDF Downloads 157
1667 Deprivation of Adivasi People's Rights to Forest Resources: A Case Study from United Andhra Pradesh India
Authors: Anil Kumar Kursenge
Abstract:
In the State of united Andhra Pradesh, many Adivasi people live in areas rich in living and non-living resources, including forests that contain abundant biodiversity, water and minerals. Of united Andhra Pradesh's 76.2 million population, over five million are Adivasis living in forest landscapes. They depend on forests for a substantial part of their livelihoods and have a close cultural affinity with them. However, they are the most impoverished population of the State, and the high levels of poverty in Andhra Pradesh's forest landscapes are largely an outcome of historically rooted, institutionalised marginalisation. As the State appropriated forests and forest land for itself, it deprived local people of their customary rights in the forest. The local realities of these deprivations of forest rights are extremely complex, reflecting a century and a half of compounded processes. With growing population pressure and ever-increasing demands for natural and mineral resources, Adivasi peoples' lands, which are often relatively rich in resources, become more and more attractive to 'developers.' Development projects and institutionalised marginalisation have deprived Adivasi people of their rights over natural resources, with serious negative effects on the people and their lands. Historically, the desire to develop such resources has resulted in the removal, decimation, or extermination of many tribal communities. These deprivations have led to highly conflictual relations between the State and the Adivasi people in the forest areas of Andhra Pradesh. Today, the survival of the Adivasi peoples requires recognition of their rights to the forest resources found in the lands and territories on which they depend for their economic, cultural, spiritual and physical well-being and survival.
In this context, this paper discusses the issues of deprivation with regard to access to forest resources, and the development projects through which many Adivasis in the State have been uprooted from their homes and lands.
Keywords: tribal people, forest rights, livelihoods, deprivation, marginalisation, Andhra Pradesh
Procedia PDF Downloads 203
1666 Preparation and Characterization of Dendrimer-Encapsulated Ytterbium Nanoparticles to Produce a New Nano-Radio Pharmaceutical
Authors: Aghaei Amirkhizi Navideh, Sadjadi Soodeh Sadat, Moghaddam Banaem Leila, Athari Allaf Mitra, Johari Daha Fariba
Abstract:
Dendrimers are good candidates for preparing metal nanoparticles because they can act as structurally and chemically well-defined templates and robust stabilizers. Polyamidoamine (PAMAM) dendrimer-based multifunctional cancer therapeutic conjugates have been designed and synthesized in the pharmaceutical industry. In addition, encapsulated nanoparticle surfaces remain accessible to substrates, so catalytic reactions can be carried out. For the preparation of the dendrimer-metal nanocomposite, a dendrimer solution containing an average of 55 Yb³⁺ ions per dendrimer was prepared. Prior to reduction, the pH of this solution was adjusted to 7.5 using NaOH. NaBH₄ was used to reduce the dendrimer-encapsulated Yb³⁺ to the zerovalent metal. The pH of the resulting solution was then adjusted to 3 using HClO₄ to decompose excess BH₄⁻. The UV-Vis absorption spectra of the mixture were recorded to confirm the formation of the Yb-G5-NH₂ complex. High-resolution transmission electron microscopy (HRTEM) and size distribution results provide additional information about the shape, size, and size distribution of the dendrimer-metal nanocomposite particles. The resulting mixture was irradiated in the Tehran Research Reactor for 2 h at a neutron flux of 3×10¹¹ n/cm²·s, and the specific activity obtained was 7 MBq. Radiochemical, chemical, and radionuclidic quality control tests were carried out. Gamma spectroscopy, high-performance liquid chromatography (HPLC) and thin-layer chromatography (TLC) data were recorded. Injection of the resulting solution into solid tumors in mice showed that it could shrink the tumor. These studies of solid tumors and nanocomposites show that the ytterbium encapsulated-dendrimer radiopharmaceutical could be introduced as a new therapeutic for the treatment of solid tumors.
Keywords: nano-radio pharmaceutical, ytterbium, PAMAM, dendrimers
Procedia PDF Downloads 505
1665 The Impact of Legislation on Waste and Losses in the Food Processing Sector in the UK/EU
Authors: David Lloyd, David Owen, Martin Jardine
Abstract:
Introduction: European weight regulations for food products require a full understanding of the regulatory guidelines to assure compliance. It is suggested that the complexity of the regulation leads to practices that result in the overfilling of food packages by food processors. Purpose: To establish current practices among food processors and the financial, sustainability and societal impacts of ineffective food production practices on the food supply chain. Methods: An analysis of food packing controls with 10 companies across varying food categories, and quantitative research with a further 15 food processors on confidence in the weight control analysis of finished food packs within their organisations. Results: A process floor analysis of manufacturing operations focussing on 10 products found overfill of packages ranging from 4.8% to 20.2%. Standard deviation figures for all products showed potential for reducing the average pack weight whilst still retaining the legal status of the product. In 20% of cases, an automatic weight analysis machine was in situ; however, packs were still significantly overweight. Collateral impacts noted included the effect of overfill on raw material purchasing and on added food miles, often on a global basis, with one raw material alone creating 10,000 extra food miles due to the poor weight control of the processing unit. Case studies of a meat product and a bakery product will be discussed, showing the impact of poor controls resulting from complex legislation. The case studies will highlight extra energy costs in production and the impact of the extra weight on fuel usage. If successful, a risk assessment model used primarily for food safety but adapted to identify waste and sustainability risks will be discussed within the presentation.
Keywords: legislation, overfill, profile, waste
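The trade-off described in the results, lowering the average pack weight while retaining legal status, can be shown with a back-of-the-envelope calculation. The rule modeled below (batch mean at least the nominal quantity, plus a k-sigma safety margin against under-filled packs) is a deliberate simplification of the EU average-weight system, and all numbers are our own assumptions for illustration, not figures from the study:

```python
# Hypothetical overfill-reduction calculation: how far can the target fill
# weight drop while keeping the batch mean above nominal with headroom?

def achievable_mean(nominal_g, sigma_g, k=2.0):
    """Lowest target mean keeping mean >= nominal with a k-sigma margin."""
    return nominal_g + k * sigma_g

def overfill_saving(nominal_g, current_mean_g, sigma_g, packs_per_year, k=2.0):
    """Return (target mean in g, raw material saved in kg/year)."""
    target = achievable_mean(nominal_g, sigma_g, k)
    saving_per_pack = max(0.0, current_mean_g - target)
    return target, saving_per_pack * packs_per_year / 1000.0

# A 500 g pack currently filled at a 545 g mean (9% overfill), sigma = 5 g:
target, kg_saved = overfill_saving(500.0, 545.0, 5.0, packs_per_year=1_000_000)
```

With these assumed figures, tightening fill control to a 510 g target mean would save 35 g per pack, i.e. 35 tonnes of raw material per million packs, before the food-mile and energy effects the abstract describes are even counted. A real compliance check would also apply the tolerable-negative-error limits for individual packs.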
Procedia PDF Downloads 409
1664 Improving Lane Detection for Autonomous Vehicles Using Deep Transfer Learning
Authors: Richard O’Riordan, Saritha Unnikrishnan
Abstract:
Autonomous Vehicles (AVs) are incorporating an increasing number of ADAS features, including automated lane-keeping systems. In recent years, many research papers on lane detection algorithms have been published, ranging from computer vision techniques to deep learning methods. The transition from the lower levels of autonomy defined in the SAE framework to higher autonomy levels requires increasingly complex models and algorithms that must be highly reliable in their operation and functionality. Furthermore, these algorithms have no room for error when operating at high levels of autonomy. Current research details existing computer vision and deep learning algorithms, their methodologies and individual results, but it also details the challenges these algorithms face, the resources they need to operate, and the shortcomings experienced when detecting lanes in certain weather and lighting conditions. This paper explores these shortcomings and attempts to implement a lane detection algorithm that could be used to achieve improvements in AV lane detection systems. It uses a pre-trained LaneNet model to detect lane or non-lane pixels using binary segmentation as the base detection method, applied first to the existing BDD100K dataset and then to a custom dataset generated locally. The first selected roads are modern, well-laid roads with up-to-date infrastructure and lane markings, while the second road network is an older one, with infrastructure and lane markings reflecting its age. The performance of the proposed method is evaluated on the custom dataset and compared to its performance on the BDD100K dataset. In summary, this paper uses transfer learning to provide a fast and robust lane detection algorithm that can handle various road conditions and provide accurate lane detection.
Keywords: ADAS, autonomous vehicles, deep learning, LaneNet, lane detection
Procedia PDF Downloads 108
1663 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems
Authors: Lakhoua Najeh
Abstract:
Introduction: In most situations, an existing industrial system, conditioned by its history, its culture and its context, finds itself facing the necessity to restructure in an organizational and technological environment in perpetual evolution. This is why any restructuring operation first of all requires a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and, on the other hand, takes into consideration testimonies gathered through investigations, interviews and collective workshops; it also builds on observations made during site visits and on specific operations. The exploitation of this diagnosis then enables us to elaborate the restructuring project. Starting from system analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical and management documents, and targeted interviews, a focused study detailing the various levels of analysis was carried out according to a general methodology. Results: The methodology adopted to contribute to the restructuring of industrial systems, participative and systemic in character and leaning on wide consultation of both human and documentary resources, led to the proposal of various innovative actions. These actions fall within a TQM approach, requiring the quantification of applicable parameters and processing that adds value to information. The new management environment will enable us to institute an information and communication system, with the possibility of migration toward an ERP system.
Conclusion: Technological advancements in process monitoring, control and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper seeks to identify the principal characteristics of process monitoring, control and industrial automation in order to provide tools that help in the decision-making process.
Keywords: automation, supervision, SCADA, TQM
Procedia PDF Downloads 181
1662 Optimizing Perennial Plants Image Classification by Fine-Tuning Deep Neural Networks
Authors: Khairani Binti Supyan, Fatimah Khalid, Mas Rina Mustaffa, Azreen Bin Azman, Amirul Azuani Romle
Abstract:
Perennial plant classification plays a significant role in various agricultural and environmental applications, assisting in plant identification, disease detection, and biodiversity monitoring. Nevertheless, attaining high accuracy in perennial plant image classification remains challenging due to the complex variations in plant appearance, the diverse range of environmental conditions under which images are captured, and the inherent variability in image quality stemming from factors such as lighting conditions, camera settings, and focus. This paper proposes an adaptation approach to optimize perennial plant image classification by fine-tuning pre-trained DNN models. It explores the efficacy of fine-tuning prevalent architectures, namely VGG16, ResNet50, and InceptionV3, leveraging transfer learning to tailor the models to the specific characteristics of perennial plant datasets. A subset of the MYLPHerbs dataset, consisting of 6 perennial plant species with 13,481 images captured under various environmental conditions, was used in the experiments. Different strategies for fine-tuning, including adjusting learning rates, training set sizes, data augmentation, and architectural modifications, were investigated. The experimental outcomes underscore the effectiveness of fine-tuning deep neural networks for perennial plant image classification, with ResNet50 showcasing the highest accuracy of 99.78%. Despite ResNet50's superior performance, both VGG16 and InceptionV3 achieved commendable accuracies of 99.67% and 99.37%, respectively. The overall outcomes reaffirm the robustness of the fine-tuning approach across different deep neural network architectures, offering insights into strategies for optimizing model performance in the domain of perennial plant image classification.
Keywords: perennial plants, image classification, deep neural networks, fine-tuning, transfer learning, VGG16, ResNet50, InceptionV3
Procedia PDF Downloads 69
1661 Experimental Pain Study Investigating the Distinction between Pain and Relief Reports
Authors: Abeer F. Almarzouki, Christopher A. Brown, Richard J. Brown, Anthony K. P. Jones
Abstract:
Although relief is commonly assumed to be a direct reflection of pain reduction, it seems to be driven by complex emotional interactions in which pain reduction is only one component. For example, the termination of a painful or aversive event may be both relieving and rewarding. Accordingly, this study investigated whether terminating an aversive negative prediction of pain would be reflected in a greater experience of relief, with a view to separating the effects of the manipulation on pain and relief. We used an aversive conditioning paradigm to investigate the perception of relief in an aversive (threat) versus a positive context. Participants received positive predictors of a non-painful outcome, presented within either a congruent positive (non-painful) context or an incongruent threat (painful) context that had been previously conditioned; trials were followed by identical laser stimuli in both conditions. Participants were asked to rate the perceived intensity of pain as well as their perception of relief in response to the cue predicting the outcome. Results demonstrated that participants reported more pain in the aversive context compared to the positive context. Conversely, participants reported more relief in the aversive context compared to the positive context. The rating of relief in the threat context was not correlated with pain reports. These results suggest that relief is not dependent on pain intensity. Consistent with this, relief in the threat context was greater than in the positive expectancy condition, while the opposite pattern was obtained for the pain ratings. The value of relief in this study is better appreciated in the context of an impending negative threat, which is apparent in the higher pain ratings in the negative expectancy condition compared to the positive expectancy condition. Moreover, the more threatening the context (as manifested by higher unpleasantness and state anxiety scores), the more the relief is appreciated.
The study highlights the importance of monitoring relief and pain intensity separately when evaluating pain-related suffering. The results also illustrate that the perception of painful input may largely be shaped by the context and not necessarily by the stimulus itself.
Keywords: aversive context, pain, predictions, relief
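The reported dissociation, more pain and more relief in the threat context with relief uncorrelated with pain, can be illustrated numerically. The ratings below are fabricated for demonstration only and are not study data:

```python
# Toy illustration of the pain/relief dissociation with fabricated 0-10
# ratings of identical laser stimuli under two conditioned contexts.

def mean(xs):
    return sum(xs) / len(xs)

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pain_threat     = [6, 7, 5, 6, 8, 7]
pain_positive   = [4, 5, 3, 4, 5, 4]
relief_threat   = [8, 9, 7, 9, 8, 7]
relief_positive = [5, 6, 4, 5, 6, 5]

higher_pain_under_threat = mean(pain_threat) > mean(pain_positive)
higher_relief_under_threat = mean(relief_threat) > mean(relief_positive)
r_pain_relief = pearson_r(pain_threat, relief_threat)  # near zero here
```

In this invented sample, both pain and relief are higher under threat while the within-threat pain-relief correlation is weak, mirroring the pattern the abstract reports; a real analysis would of course test these effects inferentially across participants.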
Procedia PDF Downloads 140
1660 A Strategy for Reducing Dynamic Disorder in Small Molecule Organic Semiconductors by Suppressing Large Amplitude Thermal Motions
Authors: Steffen Illig, Alexander S. Eggeman, Alessandro Troisi, Stephen G. Yeates, John E. Anthony, Henning Sirringhaus
Abstract:
Large-amplitude intermolecular vibrations, in combination with complex-shaped transfer integrals, generate a thermally fluctuating energetic landscape. The resulting dynamic disorder, and its intrinsic presence in organic semiconductors, is one of the most fundamental differences from their inorganic counterparts. Dynamic disorder is believed to govern many of the unique electrical and optical properties of organic systems. However, the low-energy nature of these vibrations makes them difficult to access experimentally, and because of this we still lack clear molecular design rules to control and reduce dynamic disorder. Applying a novel technique based on electron diffraction, we encountered strong intermolecular thermal vibrations in every single organic material we studied (14 to date), indicating that a large degree of dynamic disorder is a universal phenomenon in organic crystals. In this paper, a new molecular design strategy to avoid dynamic disorder is presented. Small molecules that have their side chains attached to the long axis of their conjugated core were found to be less likely to suffer from dynamic disorder effects. In particular, we demonstrate that 2,7-dioctyl[1]benzothieno[3,2-b][1]benzothiophene (C8-BTBT) and 2,9-di-decyl-dinaphtho[2,3-b:2′,3′-f]thieno[3,2-b]thiophene (C10-DNTT) exhibit strongly reduced thermal vibrations in comparison to other molecules, and we relate their outstanding performance to their lower dynamic disorder. We rationalize the low degree of dynamic disorder in C8-BTBT and C10-DNTT by a better encapsulation of the conjugated cores in the crystal structure, which helps reduce large-amplitude thermal motions. The work presented in this paper provides a general strategy for the design of new classes of very high mobility organic semiconductors with low dynamic disorder.
Keywords: charge transport, C8-BTBT, C10-DNTT, dynamic disorder, organic semiconductors, thermal vibrations
Procedia PDF Downloads 401
1659 Neuropsychological Deficits in Drug-Resistant Epilepsy
Authors: Timea Harmath-Tánczos
Abstract:
Drug-resistant epilepsy (DRE) is defined as the persistence of seizures despite at least two syndrome-adapted antiseizure drugs (ASDs) used at efficacious daily doses. About a third of patients with epilepsy suffer from drug resistance. Cognitive assessment has a crucial role in the diagnosis and clinical management of epilepsy. Previous studies have addressed the clinical targets and indications for measuring neuropsychological functions; to the best of our knowledge, however, no studies have examined them in a Hungarian therapy-resistant population. To fill this gap, we applied the Hungarian diagnostic protocol to patients between 18 and 65 years of age. This study aimed to describe and analyze neuropsychological functions in patients with drug-resistant epilepsy and to identify factors associated with neuropsychological deficits. We performed a prospective case-control study comparing neuropsychological performance in 50 adult patients and 50 healthy individuals between March 2023 and July 2023. Neuropsychological functions were examined in both patients and controls using a full set of specific tests (general performance level, motor functions, attention, executive functions, verbal and visual memory, language, and visual-spatial functions). Potential risk factors for neuropsychological deficit were assessed in the patient group using multivariate analysis. The two groups did not differ in age, sex, dominant hand or level of education. Compared with the control group, patients with drug-resistant epilepsy performed worse on motor functions, visuospatial memory, sustained attention, inhibition and verbal memory. Neuropsychological deficits can therefore be systematically detected in patients with drug-resistant epilepsy in order to provide neuropsychological therapy and improve quality of life.
The analysis of the classical and complex indices of the specific neuropsychological tasks presented here can help in the investigation of normal and disrupted memory and executive functions in DRE.
Keywords: drug-resistant epilepsy, Hungarian diagnostic protocol, memory, executive functions, cognitive neuropsychology
Procedia PDF Downloads 77
1658 Combinational Therapeutic Targeting of BRD4 and CDK7 Synergistically Induces Anticancer Effects in Hepatocellular Carcinoma
Authors: Xinxiu Li, Chuqian Zheng, Yanyan Qian, Hong Fan
Abstract:
Objectives: In hepatocellular carcinoma (HCC), oncogenes are continuously and robustly transcribed due to aberrant expression of essential components of the trans-acting super-enhancer (SE) complex. Preclinical and clinical trials are now being conducted on small-molecule inhibitors that target core transcriptional components, such as bromodomain protein 4 (BRD4) and cyclin-dependent kinase 7 (CDK7), in a number of malignant tumors. This study aims to explore whether co-overexpression of BRD4 and CDK7 is a potential marker of worse prognosis and a combined therapeutic target in HCC. Methods: The expression patterns of BRD4 and CDK7 and their correlation with prognosis in HCC were analyzed using RNA sequencing and survival data of HCC patients from the TCGA and GEO datasets. The protein levels of BRD4 and CDK7 were determined by immunohistochemistry (IHC), and patient survival data were analyzed using the Kaplan-Meier method. The mRNA expression levels of genes in HCC cell lines were evaluated by quantitative PCR (qPCR). CCK-8 and colony formation assays were conducted to assess HCC cell proliferation upon treatment with the BRD4 inhibitor JQ1 and/or the CDK7 inhibitor THZ1. Results: Analysis of the TCGA and GEO datasets showed that BRD4 and CDK7 were often overexpressed in HCC and associated with poor prognosis. Overexpression of either BRD4 or CDK7 was related to a lower survival rate; notably, co-overexpression of CDK7 and BRD4 was an even worse prognostic factor in HCC. Treatment with JQ1 or THZ1 alone had an inhibitory effect on cell proliferation; however, when JQ1 and THZ1 were combined, suppression of cell growth was markedly stronger. At the same time, the combination of JQ1 and THZ1 synergistically suppressed the expression of HCC driver genes.
Conclusion: Our research revealed that co-overexpression of BRD4 and CDK7 can be a useful prognostic biomarker in HCC and that the combination of JQ1 and THZ1 is a promising therapeutic strategy against HCC.
Keywords: BRD4, CDK7, cell proliferation, combined inhibition
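The Kaplan-Meier method named above estimates the survival function as a product over distinct event times, S(t) = prod over t_i <= t of (1 - d_i/n_i), where d_i is the number of deaths and n_i the number at risk at t_i. A minimal pure-Python sketch of the estimator; the follow-up times and event flags used below are hypothetical illustrations, not data from this study (real analyses would use a statistics package such as lifelines or R's survival):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up durations for each subject
    events: 1 = event (e.g., death) observed, 0 = censored
    Returns a list of (event_time, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))  # order subjects by follow-up time
    s = 1.0                            # S(0) = 1
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        j = i
        # count events (not censorings) tied at time t
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        at_risk = len(data) - i        # subjects still under observation at t
        if deaths:
            s *= 1.0 - deaths / at_risk
            curve.append((t, s))
        i = j                          # skip past all subjects tied at t
    return curve
```

Censored subjects contribute to the at-risk counts before their censoring time but never trigger a drop in the curve, which is what distinguishes the estimator from a naive fraction-surviving calculation.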
Procedia PDF Downloads 55
1657 Ambulatory Care Utilization of Individuals with Cerebral Palsy in Taiwan: A Country with Universal Coverage and No Gatekeeper Regulation
Authors: Ming-Juei Chang, Hui-Ing Ma, Tsung-Hsueh Lu
Abstract:
Introduction: Because of advances in medical care (e.g., ventilation techniques and gastrostomy feeding), more and more children with CP survive to adulthood. However, little is known about the use of health care services by individuals with CP from childhood to adulthood. Patterns of ambulatory care utilization are heavily influenced by insurance coverage and primary-care gatekeeper regulation. The purpose of this study was to examine patterns of ambulatory care utilization among individuals with CP in Taiwan, a country with universal coverage and no gatekeeper regulation. Methods: A representative sample of one million patients (about 1/23 of the total population) covered by Taiwan's National Health Insurance was used to analyze ambulatory care utilization among individuals with CP. Data from 2000 to 2003 were analyzed for three age groups (children, youth, and adults). Participants were identified by a CP diagnosis made by pediatricians or physicians of physical and rehabilitation medicine and recorded at least three times in the claims data. Results: Annual rates of outpatient physician visits per 1,000 persons were 31,680 for children, 16,492 for youth, and 28,617 for adults with CP. Individuals with CP received over 50% of their outpatient care from hospital outpatient departments. Higher use of specialist physician services was found in children (54.7%) than in the other two age groups (28.4% in youth and 18.8% in adults). Diseases of the respiratory system were the most frequent diagnoses for visits among both children and youth with CP, whereas diseases of the circulatory system were the main reason (24.3%) that adults with CP visited hospital outpatient departments or clinics. Conclusion: This study showed different patterns of ambulatory care utilization among different age groups. It appears that youth and adults with CP continue to have complex health issues and rely heavily on the health care system.
Additional studies are needed to determine the factors which influence ambulatory care utilization among individuals with CP.
Keywords: cerebral palsy, health services, lifespan, universal coverage
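The visit-rate metric reported above (annual outpatient visits per 1,000 persons, by age group) amounts to a simple grouped computation over claims records. A sketch under stated assumptions; the record layout, group labels, and study window below are hypothetical, not the actual NHI claims schema:

```python
from collections import defaultdict

def annual_visit_rates(claims, group_sizes, n_years):
    """Annual outpatient visits per 1,000 persons, by age group.

    claims:      iterable of (age_group, n_visits) records over the study period
    group_sizes: dict mapping age_group -> number of persons in that group
    n_years:     length of the study period in years
    """
    totals = defaultdict(int)
    for group, visits in claims:
        totals[group] += visits           # sum visits within each age group
    # rate = total visits / (persons * years) * 1,000
    return {g: totals[g] / (group_sizes[g] * n_years) * 1000
            for g in group_sizes}
```

Normalizing by both group size and study-period length is what makes rates comparable across age groups of different sizes, as in the children/youth/adults comparison above.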
Procedia PDF Downloads 375