Search results for: volatility target
81 Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason behind using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains that are heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradient and low-efficiency problems when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In a previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
Keywords: transformers, generative AI, gene expression design, classification
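A minimal Python sketch of the workflow this abstract describes (k-mer tokenization plus fine-tuning a DNABERT-style encoder for sequence classification), using the Hugging Face transformers API; the checkpoint name, k-mer size, and binary label set are illustrative assumptions, not details from the paper.

```python
# Sketch: fine-tuning a DNABERT-style encoder for gene expression classification.
# Assumptions: a 6-mer DNABERT checkpoint on the Hugging Face hub and two labels.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

def to_kmers(seq: str, k: int = 6) -> str:
    """DNABERT-style models expect sequences as space-separated overlapping k-mers."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

tokenizer = AutoTokenizer.from_pretrained("zhihan1996/DNA_bert_6")  # assumed checkpoint
model = AutoModelForSequenceClassification.from_pretrained(
    "zhihan1996/DNA_bert_6", num_labels=2)  # e.g., expressed / not expressed

inputs = tokenizer(to_kmers("ATGCGTACGTTAGCAT"), return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()  # classification head is fine-tuned in practice
```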
Procedia PDF Downloads 59
80 Deep Learning for SAR Images Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. Solutions that augment dual polarimetric data to full polarimetric data would therefore combine full characterization and exploitation of the backscattered field with wider coverage and less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and experimented reconstruction techniques are undeniable, the existing methods are mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods.
The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learning, deep neural network
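A minimal PyTorch sketch of the kind of augmentation network and multi-term loss the abstract describes; the architecture, channel counts, and loss terms are illustrative assumptions, not the authors' published design.

```python
# Sketch (PyTorch): a CNN mapping 2-channel hybrid-polarimetric input to
# full-polarimetric output, trained with a weighted multi-term loss.
import torch
import torch.nn as nn

class PolAugmentCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 4, 3, padding=1),  # e.g., 4 reconstructed channels
        )

    def forward(self, x):
        return self.net(x)

def composite_loss(pred, target, alpha=1.0, beta=0.1):
    # Pixel fidelity term plus a term on total backscattered power (span),
    # standing in for the paper's scattering-property constraints.
    pixel = nn.functional.mse_loss(pred, target)
    span = nn.functional.mse_loss(pred.sum(dim=1), target.sum(dim=1))
    return alpha * pixel + beta * span

model = PolAugmentCNN()
x = torch.randn(8, 2, 64, 64)  # batch of dual/hybrid-pol patches
y = torch.randn(8, 4, 64, 64)  # corresponding full-pol patches
loss = composite_loss(model(x), y)
loss.backward()
```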
Procedia PDF Downloads 67
79 Educational Audit and Curricular Reforms in the Arabian Context
Authors: Irum Naz
Abstract:
In the Arabian higher education context, linguistic proficiency in the English language is considered crucial for the developmental sustainability, economic growth, and stability of communities and societies. Qatar's educational reforms package, through the 2030 vision, identifies the acquisition of English at K-12 as an essential survival communication tool for globalization, believing that Qatari students need better preparation to take on the responsibilities of leadership and to participate effectively in the country's surging economy. The idea of introducing Qatari students to modern curricula benchmarked to high-student-performance curricula in developed countries is one of the components of the reformatory design principles of the Education for a New Era reform project, which is mutually consented to and supported by the Office of Shared Services, the Communications Office, and the Supreme Education Council. In appreciation of the government's vision, the English Language Centre (ELC) at the Community College of Qatar ran an internal educational audit and conducted evaluative research to understand and appraise the value, impact, and practicality of the existing ELC language development program. This study sought to identify the type of change that could improve the quality of Foundation Program courses and the manner in which second language learners could be assisted to transition smoothly between ELC levels. Following the interpretivist paradigm and a mixed research method, the data was gathered through a bicyclic research model and a triangular design. The analyses of the data suggested that there was a need for improvement in the ELC program as a whole, particularly in terms of curriculum, student learning outcomes, and the general learning environment in the department. Key findings suggest that the target program would benefit from significant revisions, including narrowing the focus of the courses, providing sets of specific learning objectives, and preventing repetition between levels. Another promising finding concerned the assessment tools and process. The data suggested that a set of standardized assessments that more closely suited the programs of study should be devised. It was also recommended that students undergo a more comprehensive placement process to ensure that they begin the program at an appropriate level and get the maximum benefit from their learning experience. Although this ties into the idea of a curriculum revamp, it was expected that students could leave the ELC having had exposure to courses in English for specific purposes. The idea of a more reliable exit assessment for students was raised frequently, so that the ELC could regulate itself and ensure optimum learning outcomes. Another important recommendation was the provision of a Student Learning Center that would help students receive personalized tuition, differentiated instruction, and a self-driven and self-evaluated learning experience. In addition, it was recommended that an extra study level be added to the program to accommodate the different levels of English language proficiency represented among ELC students.
The evidence collected in the course of conducting the study suggests that significant change is needed in the structure of the ELC program, specifically regarding the curriculum, the program learning outcomes, and the learning environment in general.
Keywords: educational audit, ESL, optimum learning outcomes, Qatar's educational reforms, self-driven and self-evaluated learning experience, Student Learning Center
Procedia PDF Downloads 185
78 Decision Making on Smart Energy Grid Development for Availability and Security of Supply Achievement Using Reliability Merits
Authors: F. Iberraken, R. Medjoudj, D. Aissani
Abstract:
The development of the smart grid concept is built around two separate definitions, namely the European one, oriented towards sustainable development, and the American one, oriented towards reliability and security of supply. In this paper, we have investigated reliability merits enabling decision-makers to provide a high quality of service. It is based on modeling and forecasting system behavior using interruptions and failures on the one hand, and on the contribution of information and communication technologies (ICT) to mitigating catastrophic events such as blackouts on the other. It was found that the reliability-oriented concept has been adopted by developing and emerging countries for short- and medium-term planning, followed by the sustainability concept in long-term planning. This work has highlighted the reliability merits, namely benefits, opportunities, costs, and risks (BOCR), considered as consistent units for measuring power customer satisfaction. From the decision-making point of view, we have used the analytic hierarchy process (AHP) to achieve customer satisfaction, based on the reliability merits and the contribution of the various energy resources. Certainly, fossil and nuclear resources dominate energy production nowadays, but great advances have already been made toward cleaner ones. It was demonstrated that these resources are not only environmentally but also economically and socially sustainable. The paper is organized as follows: Section one is devoted to the introduction, where an implicit review of smart grid development is given for the two main concepts (for the USA and European countries). The AHP method and the BOCR developments of reliability merits against power customer satisfaction are developed in section two. The benefits were expressed by the high level of availability, the applicability of maintenance actions, and power quality. Opportunities were highlighted by the implementation of ICT in data transfer and processing, the mastering of peak demand control, the decentralization of production, and power system management in default conditions. Costs were evaluated using cost-benefit analysis, including the investment expenditures in network security, the network becoming a target for hackers and terrorists, and the profits of operating as decentralized systems with reduced energy not supplied, thanks to the availability of storage units issued from renewable resources and to current power lines (CPL) enabling the power dispatcher to manage load shedding optimally. For risks, we have raised the question of citizens' willingness to contribute financially to the system and to the utility restructuring. What is the degree of their agreement with the guarantees proposed by the managers about information integrity? From a technical point of view, do they have sufficient information and knowledge to adopt a smart home and a smart system? In section three, an application of the AHP method is made to achieve power customer satisfaction based on the main energy resources as alternatives, using knowledge from a country with great advances in its energy transition. Results and discussion are given in section four. We conclude that the choice of a given resource depends on the attitude of the decision-maker (prudent, optimistic, or pessimistic) and that the status quo is neither sustainable nor satisfactory.
Keywords: reliability, AHP, renewable energy resources, smart grids
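For readers unfamiliar with AHP, a minimal Python sketch of the standard priority-weight computation (principal eigenvector of a pairwise comparison matrix) and Saaty's consistency ratio; the judgment matrix below is invented for illustration, not the paper's data.

```python
# Sketch: AHP priority weights and consistency ratio for a pairwise
# comparison matrix, e.g., energy resources judged against the BOCR merits.
import numpy as np

A = np.array([[1,   3,   5 ],   # made-up judgments: renewable vs. fossil vs. nuclear
              [1/3, 1,   2 ],
              [1/5, 1/2, 1 ]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", weights, "CR:", ci / ri)  # CR < 0.1 => acceptable consistency
```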
Procedia PDF Downloads 442
77 Plasma Levels of Collagen Triple Helix Repeat Containing 1 (CTHRC1) as a Potential Biomarker in Interstitial Lung Disease
Authors: Rijnbout-St.James Willem, Lindner Volkhard, Scholand Mary Beth, Ashton M. Tillett, Di Gennaro Michael Jude, Smith Silvia Enrica
Abstract:
Introduction: Fibrosing lung diseases are characterized by changes in the lung interstitium and are classified based on etiology: 1) environmental/exposure-related, 2) autoimmune-related, 3) sarcoidosis, 4) interstitial pneumonia, and 5) idiopathic. Among the idiopathic forms of interstitial lung disease (ILD), idiopathic pulmonary fibrosis (IPF) is the most severe. The pathogenesis of IPF is characterized by an increased presence of proinflammatory mediators, resulting in alveolar injury, where injury to the alveolar epithelium precipitates an increase in collagen deposition, subsequently thickening the alveolar septum and decreasing gas exchange. Identifying biomarkers implicated in the pathogenesis of lung fibrosis is key to developing new therapies and improving the efficacy of existing therapies. Transforming growth factor-beta (TGF-B1), a mediator of tissue repair associated with WNT5A signaling, is partially responsible for fibroblast proliferation in ILD and is the target of Pirfenidone, one of the antifibrotic therapies used for patients with IPF. Canonical TGF-B signaling is mediated by the proteins SMAD 2/3, which are, in turn, indirectly regulated by Collagen Triple Helix Repeat Containing 1 (CTHRC1). In this study, we tested the following hypotheses: 1) CTHRC1 is more elevated in the ILD cohort compared to unaffected controls, and 2) CTHRC1 is differently expressed among ILD types. Material and Methods: CTHRC1 levels were measured by ELISA in 171 plasma samples from the deidentified University of Utah ILD cohort. The data represent a cohort of 131 ILD-affected participants and 40 unaffected controls. CTHRC1 samples were categorized by a pulmonologist based on affectation status and disease subtype: IPF (n = 45), sarcoidosis (n = 4), nonspecific interstitial pneumonia (n = 16), hypersensitivity pneumonitis (n = 7), interstitial pneumonia (n = 13), autoimmune (n = 15), other ILD, a category that includes undifferentiated ILD diagnoses (n = 31), and unaffected controls (n = 40). We conducted a single-factor ANOVA of plasma CTHRC1 levels to test whether the variance in CTHRC1 among affected and non-affected participants is statistically significant. In-silico analysis was performed with Ingenuity Pathway Analysis® to characterize the role of CTHRC1 in the pathway of lung fibrosis. Results: Statistical analyses of CTHRC1 in plasma samples indicate that the average CTHRC1 level is significantly higher in ILD-affected participants than in controls, with autoimmune ILD being higher than other ILD types, thus supporting our hypotheses. In-silico analyses show that CTHRC1 indirectly activates and phosphorylates SMAD3, which in turn cross-regulates TGF-B1. CTHRC1 may also regulate the expression and transcription of TGF-B1 via WNT5A and its regulatory relationship with CTNNB1. Conclusion: In-silico pathway analyses demonstrate that CTHRC1 may be an important biomarker in ILD. Analysis of plasma samples indicates that CTHRC1 expression is positively associated with ILD affectation, with autoimmune ILD having the highest average CTHRC1 values. While characterizing CTHRC1 levels in plasma can help to differentiate among ILD types and predict response to Pirfenidone, the extent to which plasma CTHRC1 level is a function of ILD severity or chronicity is unknown.
Keywords: interstitial lung disease, CTHRC1, idiopathic pulmonary fibrosis, pathway analyses
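A minimal Python sketch of the single-factor ANOVA described above, using synthetic placeholder values rather than the study's measurements.

```python
# Sketch: single-factor ANOVA of plasma CTHRC1 across diagnostic groups.
# Group sizes mirror the abstract; the values themselves are synthetic.
from scipy.stats import f_oneway
import numpy as np

rng = np.random.default_rng(0)
ipf        = rng.normal(30, 8, 45)   # n = 45
autoimmune = rng.normal(38, 8, 15)   # n = 15
controls   = rng.normal(22, 8, 40)   # n = 40

f_stat, p_value = f_oneway(ipf, autoimmune, controls)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 => group means differ
```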
Procedia PDF Downloads 191
76 Deep Learning Based Polarimetric SAR Images Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. Solutions that augment dual polarimetric data to full polarimetric data would therefore combine full characterization and exploitation of the backscattered field with wider coverage and less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and experimented reconstruction techniques are undeniable, the existing methods are mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods.
The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry
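Stated generically, a multi-term training loss of the kind both of these abstracts describe can be written as a weighted sum; the specific terms and weights below are illustrative, since the abstract does not publish the exact loss.

```latex
% Generic form of a multi-term training loss; the individual terms and
% weights are illustrative, not the paper's published cost function.
\mathcal{L}(\theta) = \lambda_1 \underbrace{\lVert \hat{X} - X \rVert_2^2}_{\text{pixel fidelity}}
 + \lambda_2 \underbrace{\lVert \mathrm{span}(\hat{X}) - \mathrm{span}(X) \rVert_2^2}_{\text{total power}}
 + \lambda_3 \underbrace{\lVert \hat{\rho} - \rho \rVert_2^2}_{\text{polarimetric coherence}}
```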
Procedia PDF Downloads 90
75 We Are the Earth That Defends Itself: An Exploration of Discursive Practices of Les Soulèvements De La Terre
Authors: Sophie Del Fa, Loup Ducol
Abstract:
This presentation will focus on the discursive practices of Les Soulèvements de la Terre (hereafter SdlT), a French environmentalist group mobilized against agribusiness. More specifically, we will use, as a case study, the violently repressed demonstration that took place in Sainte-Soline on March 25, 2023 (see below for details). The SdlT embodies the renewal of anti-capitalist and environmentalist struggles that began with Occupy Wall Street in 2011 and, in France, with the Nuit debout in 2016 and the yellow vests movement from 2018 to 2020. These struggles have three things in common: they are self-organized without official leaders, they rely mainly on occupations to reappropriate public places (squares, roundabouts, natural territories), and they are anti-capitalist. The SdlT was created in 2021 by activists coming from the Zone-to-Defend of Notre-Dame-des-Landes, a victorious ten-year-long occupation movement (2009 to 2018) against an airport near Nantes, France. The SdlT is not labeled as a formal association, nor as a constituted group, but as an anti-capitalist network of local struggles at the crossroads of ecology and social issues. Indeed, although they target agro-industry, land grabbing, soil artificialization, and ecology without transition, the SdlT considers ecological and social questions as interdependent. Moreover, they have an encompassing vision of ecology, which they consider a concern for the living as a whole, erasing the division between Nature and Culture. Their radicality is structured around three main elements: federative and decentralized dimensions, the rhetoric of living alliances, and creative militant strategies. The objective of this reflection is to understand how these three dimensions are articulated through the SdlT's discursive practices. To explore these elements, we take as a case study one specific event: the demonstration against the 'basins' held in Sainte-Soline on March 25, 2023, on the construction site of new water storage infrastructure for agricultural irrigation in western France. This event represents a turning point for the SdlT. Indeed, the protest was violently repressed: 5000 grenades were fired by the police, hundreds of people were injured, and one person was still in a coma at the time of writing. Moreover, following the Sainte-Soline events, the Minister of the Interior, Gérald Darmanin, threatened to dissolve the SdlT, thus adding fuel to the fire in an already tense social climate (with the ongoing strikes against the pensions reform). We anchor our reflection on three types of data: 1) our own experiences (inspired by ethnography) of the Sainte-Soline demonstration; 2) the collection of more than 500,000 tweets with the #SainteSoline hashtag; and 3) a press review of texts and articles published after the Sainte-Soline demonstration. The exploration of these data from a turning point in the history of the SdlT will allow us to analyze how the three dimensions highlighted earlier (federative and decentralized dimensions, the rhetoric of living alliances, and creative militant strategies) are materialized through the discursive practices surrounding the Sainte-Soline event. This will allow us to shed light on how a new contemporary movement implements contemporary environmental struggles.
Keywords: discursive practices, Sainte-Soline, ecology, radical ecology
Procedia PDF Downloads 71
74 The Istrian Istrovenetian-Croatian Bilingual Corpus
Authors: Nada Poropat Jeletic, Gordana Hrzica
Abstract:
Bilingual conversational corpora represent a meaningful and comprehensive data source for investigating genuine contact phenomena in non-monitored bilingual speech production. They can be particularly useful for bilingual research since some features of bilingual interaction can hardly be accessed with more traditional methodologies (e.g., elicitation tasks). The method of language sampling provides the resources for describing language interaction in a bilingual community and/or in bilingual situations (e.g., code-switching, the amount and number of languages used, etc.). To capture these phenomena in genuine communication situations, such sampling should be as close as possible to spontaneous communication. Bilingual spoken corpus design is methodologically demanding. Therefore, this paper aims at describing the methodological challenges that apply to the design of the conversational Istrian Istrovenetian-Croatian Bilingual Corpus. Croatian is the first official language of the Croatian-Italian officially bilingual Istria County, while Istrovenetian is a diatopic subvariety of Venetian, a long-lasting lingua franca in the Istrian peninsula, the mother tongue of the members of the Italian National Community in Istria, and the primary code of informal everyday communication among the Istrian Italophone population. Within the CLARIN infrastructure, TalkBank is being used, as it provides relevant procedures for designing and analyzing bilingual corpora. Furthermore, its public availability allows for easy replication of studies and cumulative progress as a research community builds up around the corpus, while the tools developed within the field of corpus linguistics enable easy retrieval and analysis of information. The method of language sampling employed is kept at the level of spontaneous communication, in order to maximise the naturalness of the collected conversational data. All speakers have provided written informed consent in which they agree to be recorded at a random point within the period of one month after signing the consent. Participants are administered a background questionnaire providing information about their socioeconomic status and the exposure to and usage of languages in the participants' social networks. Recording data are being transcribed, phonologically adapted within a standard-sized orthographic form, coded, and segmented (speech streams are being segmented into communication units based on syntactic criteria), and are being marked following the CHAT transcription system and its associated CLAN suite of programmes within the TalkBank toolkit. The corpus consists of transcribed sound recordings of 36 bilingual speakers, while the target is to publish the whole corpus by the end of 2020, by sampling spontaneous conversations among approximately 100 speakers from all the bilingual areas of Istria to ensure representativeness (the participants are being recruited across three generations of native bilingual speakers in all the bilingual areas of the peninsula). Conversational corpora are still rare in TalkBank, so the Corpus will contribute to BilingBank as a highly relevant and scientifically reliable resource for an internationally established and active research community.
Research on communities with societal bilingualism will contribute to the growing body of research on bilingualism and multilingualism, especially regarding topics of language dominance, language attrition and loss, interference, and code-switching.
Keywords: conversational corpora, bilingual corpora, code-switching, language sampling, corpus design methodology
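A minimal Python sketch of one measure such a corpus supports, counting code-switch points in language-tagged utterances; the word@language tag format here is an assumption for illustration, not the CHAT/CLAN convention itself.

```python
# Sketch: counting code-switch points in a language-tagged utterance.
# Tags ("word@lang") and the example utterance are illustrative assumptions.
def switch_count(utterance: list[str]) -> int:
    langs = [token.split("@")[1] for token in utterance]
    return sum(1 for a, b in zip(langs, langs[1:]) if a != b)

utt = ["gremo@iv", "domani@iv", "na@hr", "plažu@hr"]  # Istrovenetian -> Croatian
print(switch_count(utt))  # 1 switch point
```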
Procedia PDF Downloads 145
73 Exploitation Pattern of Atlantic Bonito in West African Waters: Case Study of the Bonito Stock in Senegalese Waters
Authors: Ousmane Sarr
Abstract:
The Senegalese coasts have high productivity of fishery resources due to the frequent intense upwelling that occurs along the coast, driven by the maritime trade winds, which makes its waters nutrient-rich. Fishing plays a primordial role in Senegal's socioeconomic plans and food security. However, a global diagnosis of the Senegalese maritime fishing sector has highlighted the challenges this sector encounters. Among these concerns, some significant stocks, a priority target for artisanal fishing, need further assessment. If no efforts are made in this direction, most stocks will be overexploited or even in decline. It is in this context that this research was initiated. This investigation aimed to apply a multi-model approach (LBB, the catch-only-based CMSY model and its most recent version (CMSY++), JABBA, and JABBA-Select) to assess the stock of Atlantic bonito, Sarda sarda (Bloch, 1793), in the Senegalese Exclusive Economic Zone (SEEZ). Available catch, effort, and size data for Atlantic bonito over 15 years (2004-2018) were used to calculate the nominal and standardized CPUE, size-frequency distribution, and lengths at retention (50% and 95% selectivity) of the species. These results were employed as input parameters for the stock assessment models mentioned above to define the stock status of this species in this region of the Atlantic Ocean. The LBB model indicated a healthy Atlantic bonito stock, with B/BMSY values ranging from 1.3 to 1.6 and B/B0 values varying from 0.47 to 0.61 across the main scenarios performed (BON_AFG_CL, BON_GN_Length, and BON_PS_Length). The results estimated by LBB are consistent with those obtained by CMSY. The CMSY model results demonstrate that the SEEZ Atlantic bonito stock is in a sound condition in the final year of the main scenarios analyzed (BON, BON-bt, BON-GN-bt, and BON-PS-bt), with sustainable relative stock biomass (B2018/BMSY = 1.13 to 1.3) and fishing pressure levels (F2018/FMSY = 0.52 to 1.43). The B/BMSY and F/FMSY results for the JABBA model ranged from 2.01 to 2.14 and from 0.47 to 0.33, respectively. In contrast, the estimated B/BMSY and F/FMSY for JABBA-Select ranged from 1.91 to 1.92 and from 0.52 to 0.54. The Kobe plot results of the base case scenarios showed a 75% to 89% probability of falling in the green area, indicating sustainable fishing pressure and a healthy Atlantic bonito stock size capable of producing high yields close to the MSY. Based on the stock assessment results, this study provides scientific advice for temporary management measures. Based on the results of the length-based models, this study suggests an improvement of the selectivity parameters of longlines and purse seines and a temporary prohibition of the use of sleeping nets in the fishery for the Atlantic bonito stock in the SEEZ. Although these actions are temporary, they can be essential to reduce or avoid intense pressure on the Atlantic bonito stock in the SEEZ. However, it is necessary to establish harvest control rules to provide coherent and solid scientific information that leads to appropriate decision-making for the rational and sustainable exploitation of Atlantic bonito in the SEEZ and the Eastern Atlantic Ocean.
Keywords: multi-model approach, stock assessment, Atlantic bonito, SEEZ
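For readers unfamiliar with the reference points B/BMSY and F/FMSY reported above, a minimal Python sketch of a Schaefer surplus-production projection, the dynamics underlying CMSY/JABBA-type models; r, k, and the catch series are invented for illustration, not the study's inputs.

```python
# Sketch: Schaefer surplus-production projection of relative stock status.
# B_{t+1} = B_t + r*B_t*(1 - B_t/k) - C_t, with BMSY = k/2 and FMSY = r/2.
r, k = 0.8, 10_000.0            # intrinsic growth rate, carrying capacity (invented)
bmsy, fmsy = k / 2, r / 2       # Schaefer reference points
catches = [900, 950, 1000, 980, 1020]

b = k * 0.6                     # assumed starting biomass
for c in catches:
    f_rel = (c / b) / fmsy      # F/FMSY, approximating F as C/B
    b = max(b + r * b * (1 - b / k) - c, 1e-6)
    print(f"B/BMSY = {b / bmsy:.2f}, F/FMSY = {f_rel:.2f}")
```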
Procedia PDF Downloads 62
72 Exploiting the Tumour Microenvironment in Order to Optimise Sonodynamic Therapy for Cancer
Authors: Maryam Mohammad Hadi, Heather Nesbitt, Hamzah Masood, Hashim Ahmed, Mark Emberton, John Callan, Alexander MacRobert, Anthony McHale, Nikolitsa Nomikou
Abstract:
Sonodynamic therapy (SDT) utilises ultrasound in combination with sensitizers, such as porphyrins, for the production of cytotoxic reactive oxygen species (ROS) and the confined ablation of tumours. Ultrasound can be applied locally, and the acoustic waves, at frequencies between 0.5 and 2 MHz, are transmitted efficiently through tissue. SDT does not require highly toxic agents, and the cytotoxic effect only occurs upon ultrasound exposure at the site of the lesion. Therefore, this approach is not associated with adverse side effects. Further highlighting the benefits of SDT, no cancer cell population has, to the authors' best knowledge, shown resistance to therapy-triggered ROS production or its cytotoxic effects. This is particularly important given the as-yet unresolved issues of radiation and chemoresistance. Another potential future benefit of this approach, considering its non-thermal mechanism of action, is its possible role as an adjuvant to immunotherapy. Substantial pre-clinical studies have demonstrated the efficacy and targeting capability of this therapeutic approach. However, SDT has yet to be fully characterised and appropriately exploited for the treatment of cancer. In this study, a formulation based on multistimulus-responsive sensitizer-containing nanoparticles that can accumulate in advanced prostate tumours and increase the therapeutic efficacy of SDT has been developed. The formulation is based on a polyglutamate-tyrosine (PGATyr) co-polymer carrying hematoporphyrin. The efficacy of SDT in this study was demonstrated using prostate cancer as the translational exemplar. The formulation was designed to respond to the microenvironment of advanced prostate tumours, such as the overexpression of the proteolytic enzymes cathepsin-B and prostate-specific membrane antigen (PSMA), which can degrade the nanoparticles and reduce their size, improving both diffusion throughout the tumour mass and cellular uptake. The therapeutic modality was initially tested in vitro using LNCaP and PC3 cells as target cell lines. The SDT efficacy was also examined in vivo, using male SCID mice bearing LNCaP subcutaneous tumours. We have demonstrated that the PGATyr co-polymer is digested by cathepsin-B and that digestion of the formulation by cathepsin-B, under tumour-mimicking conditions (acidic pH), leads to decreased nanoparticle size and subsequently increased cellular uptake. Sonodynamic treatment, under both normoxic and hypoxic conditions, demonstrated ultrasound-induced cytotoxic effects only for the nanoparticle-treated prostate cancer cells, while the toxicity of the formulation in the absence of ultrasound was minimal. Our in vivo studies in immunodeficient mice, using the hematoporphyrin-containing PGATyr nanoparticles for SDT, showed a 50% decrease in LNCaP tumour volumes within 24 h, following IV administration of a single dose. No adverse effects were recorded, and body weight was stable. The results described in this study clearly demonstrate the promise of SDT to revolutionize cancer treatment. They emphasize the potential of this therapeutic modality as a first-line treatment, or in combination treatments, for the elimination or downstaging of difficult-to-treat cancers, such as prostate, pancreatic, and advanced colorectal cancer.
Keywords: sonodynamic therapy, nanoparticles, tumour ablation, ultrasound
Procedia PDF Downloads 138
71 South African Breast Cancer Mutation Spectrum: Pitfalls to Copy Number Variation Detection Using Internationally Designed Multiplex Ligation-Dependent Probe Amplification and Next Generation Sequencing Panels
Authors: Jaco Oosthuizen, Nerina C. Van Der Merwe
Abstract:
The National Health Laboratory Service in Bloemfontein has been the diagnostic testing facility for 1830 familial breast cancer patients since 1997. From the cohort, 540 were comprehensively screened using High-Resolution Melting Analysis or Next Generation Sequencing for the presence of point mutations and/or indels. Approximately 90% of these patients still remain undiagnosed, as they are BRCA1/2 negative. Multiplex ligation-dependent probe amplification was initially added to screen for copy number variation, but with the introduction of next generation sequencing in 2017 it was substituted and is currently used as a confirmation assay. The aim was to investigate the viability of utilizing internationally designed copy number variation detection assays, based mostly on European/Caucasian genomic data, for use within a South African context. The multiplex ligation-dependent probe amplification technique is based on the hybridization and subsequent ligation of multiple probes to a targeted exon. The ligated probes are amplified using conventional polymerase chain reaction, followed by fragment analysis by means of capillary electrophoresis. The experimental design of the assay was performed according to the guidelines of MRC-Holland. For BRCA1 (P002-D1) and BRCA2 (P045-B3), both multiplex assays were validated, and results were confirmed using a secondary probe set for each gene. The next generation sequencing technique is based on target amplification via multiplex polymerase chain reaction, whereafter the amplicons are sequenced in parallel on a semiconductor chip. Amplified read counts are visualized as relative copy numbers to determine the median of the absolute values of all pairwise differences. Various experimental parameters, such as DNA quality, quantity, and signal intensity or read depth, were verified using positive and negative patients previously tested internationally. DNA quality and quantity proved to be the critical factors during the verification of both assays. The quantity influenced the relative copy number frequency directly, whereas the quality of the DNA and its salt concentration influenced denaturation consistency in both assays. Multiplex ligation-dependent probe amplification produced false positives due to ligation failure when a variant present within the ligation site inhibited ligation. Next generation sequencing produced false positives due to read dropout when primer sequences did not meet optimal multiplex binding kinetics due to population variants in the primer binding site. The analytical sensitivity and specificity for the South African population have been proven. Verification resulted in repeatable reactions with regard to the detection of relative copy number differences. Both the multiplex ligation-dependent probe amplification and next generation sequencing multiplex panels need to be optimized to accommodate South African polymorphisms present within the genetically diverse ethnic groups in order to reduce the false-positive copy number variation rate and increase performance efficiency.
Keywords: familial breast cancer, multiplex ligation-dependent probe amplification, next generation sequencing, South Africa
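A minimal Python sketch of the relative copy number calculation from amplicon read depth described above; the exon names, counts, and deletion/duplication thresholds are illustrative assumptions.

```python
# Sketch: relative copy number per exon from amplicon read depth, normalized
# within-sample and against a reference batch. All numbers are illustrative.
sample_counts = {"BRCA1_ex2": 480, "BRCA1_ex13": 210, "BRCA1_ex20": 505}
ref_counts    = {"BRCA1_ex2": 500, "BRCA1_ex13": 490, "BRCA1_ex20": 510}

s_total = sum(sample_counts.values())
r_total = sum(ref_counts.values())
for exon in sample_counts:
    ratio = (sample_counts[exon] / s_total) / (ref_counts[exon] / r_total)
    status = "deletion?" if ratio < 0.7 else "duplication?" if ratio > 1.3 else "normal"
    print(f"{exon}: relative CN = {ratio:.2f} ({status})")
```

Note that a variant under a probe or primer can depress a single ratio and mimic a deletion, which is exactly the false-positive mode the abstract warns about for diverse populations.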
Procedia PDF Downloads 231
70 Trends in Blood Pressure Control and Associated Risk Factors Among US Adults with Hypertension from 2013 to 2020: Insights from NHANES Data
Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei
Abstract:
Controlling blood pressure is critical to reducing the risk of cardiovascular disease. However, BP control rates (systolic BP < 140 mm Hg and diastolic BP < 90 mm Hg) have declined since 2013, warranting further analysis to identify contributing factors and potential interventions. This study investigates the factors associated with the decline in blood pressure (BP) control among U.S. adults with hypertension over the past decade. Data from the U.S. National Health and Nutrition Examination Survey (NHANES) were used to assess BP control trends between 2013 and 2020. The analysis included 18,927 U.S. adults with hypertension aged 18 years and older who completed study interviews and examinations. The dataset, obtained from the cardioStatsUSA and RNHANES R packages, was merged based on survey IDs. Key variables analyzed included demographic factors, lifestyle behaviors, hypertension status, BMI, comorbidities, antihypertensive medication use, and cardiovascular disease history. The prevalence of BP control declined from 78.0% in 2013-2014 to 71.6% in 2017-2020. Non-Hispanic Whites had the highest BP control prevalence (33.6% in 2013-2014), but this declined to 26.5% by 2017-2020. In contrast, BP control among Non-Hispanic Blacks increased slightly. Younger adults (aged 18-44) exhibited better BP control, but control rates declined over time. Obesity prevalence increased, contributing to poorer BP control. Antihypertensive medication use rose from 26.1% to 29.2% across the study period. Lifestyle behaviors, such as smoking and diet, also affected BP control, with nonsmokers and those with better diets showing higher control rates. Key findings indicate significant disparities in blood pressure control across racial/ethnic groups. Non-Hispanic Black participants had consistently higher odds (OR ranging from 1.84 to 2.33) of poor blood pressure control compared to Non-Hispanic Whites, while odds among Non-Hispanic Asians varied by cycle. Younger age groups (18-44 and 45-64) showed significantly lower odds of poor blood pressure control compared to those aged 75+, highlighting better control in younger populations. Men had consistently higher odds of poor control compared to women, though this disparity slightly decreased in 2017-2020. Medical comorbidities such as diabetes and chronic kidney disease were associated with significantly higher odds of poor blood pressure control across all cycles. Participants with chronic kidney disease had particularly elevated odds (OR=5.54 in 2015-2016), underscoring the challenge of managing hypertension in these populations. Antihypertensive medication use was also linked with higher odds of poor control, suggesting potential difficulties in achieving target blood pressure despite treatment. Lifestyle factors such as alcohol consumption and physical activity showed no consistent association with blood pressure control. However, dietary quality appeared protective, with those reporting an excellent diet showing lower odds (OR=0.64) of poor control in the overall sample. Increased BMI was associated with higher odds of poor blood pressure control, particularly in the 30-35 and 35+ BMI categories during 2015-2016. The study highlights a significant decline in BP control among U.S. adults with hypertension, particularly among certain demographic groups and those with increasing obesity rates. 
Lifestyle behaviors, antihypertensive medication use, and socioeconomic factors all played a role in these trends.
Keywords: diabetes, blood pressure, obesity, logistic regression, odds ratio
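A minimal Python sketch of a logistic regression with odds ratios in the spirit of the analysis above, on synthetic data; a faithful NHANES analysis would additionally apply the survey design weights.

```python
# Sketch: logistic regression of poor BP control with odds ratios and 95% CIs.
# Data are synthetic; variable names echo the abstract, not the real dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "poor_control": rng.integers(0, 2, 500),
    "age_group": rng.choice(["18-44", "45-64", "65-74", "75+"], 500),
    "ckd": rng.integers(0, 2, 500),            # chronic kidney disease flag
    "bmi": rng.normal(29, 5, 500),
})

# Reference category 75+, matching the comparisons reported above.
fit = smf.logit("poor_control ~ C(age_group, Treatment('75+')) + ckd + bmi",
                data=df).fit(disp=0)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```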
Procedia PDF Downloads 9
69 The Regulation of the Cancer Epigenetic Landscape Lies in the Realm of the Long Non-coding RNAs
Authors: Ricardo Alberto Chiong Zevallos, Eduardo Moraes Rego Reis
Abstract:
Pancreatic adenocarcinoma (PDAC) patients have a less than 10% 5-year survival rate. PDAC has no defined diagnostic and prognostic biomarkers. Gemcitabine is the first-line drug in PDAC and several other cancers. Long non-coding RNAs (lncRNAs) contribute to tumorigenesis and are potential biomarkers for PDAC. Although lncRNAs are not translated into proteins, they have important functions. LncRNAs can decoy or recruit proteins from the epigenetic machinery, act as microRNA sponges, participate in protein translocation through different cellular compartments, and even promote chemoresistance. The chromatin remodeling enzyme EZH2 is a histone methyltransferase that catalyzes the methylation of histone 3 at lysine 27, silencing local expression. EZH2 is ambivalent: it can also activate gene expression independently of its histone methyltransferase activity. EZH2 is overexpressed in several cancers and interacts with lncRNAs, being recruited to a specific locus. EZH2 can be recruited to activate an oncogene or silence a tumor suppressor. The misregulation of lncRNAs in cancer can result in the differential recruitment of EZH2 and in a distinct epigenetic landscape, promoting chemoresistance. The relevance of the EZH2-lncRNA interaction to chemoresistant PDAC was assessed by real-time quantitative PCR (RT-qPCR) and RNA immunoprecipitation (RIP) experiments with naïve and gemcitabine-resistant PDAC cells. The expression of several lncRNAs and EZH2 gene targets was evaluated, contrasting naïve and resistant cells. Candidate genes were selected by bioinformatic analysis and literature curation. Indeed, the resistant cell line showed higher expression of chemoresistance-associated lncRNAs and protein-coding genes. RIP detected lncRNAs interacting with EZH2 at varying intensity levels in the cell lines. During RIP, the nuclear fraction of the cells was incubated with an antibody against EZH2 and with magnetic beads. The RNA precipitated with the beads-antibody-EZH2 complex was isolated and reverse transcribed. The presence of candidate lncRNAs was detected by RT-qPCR, and the enrichment was calculated relative to INPUT (a total lysate control sample collected before RIP). The enrichment levels varied across the lncRNAs and cell lines. The EZH2-lncRNA interaction might be responsible for the regulation of chemoresistance-associated genes in multiple cancers. The relevance of the lncRNA-EZH2 interaction to PDAC was assessed by siRNA knockdown of a lncRNA, followed by analysis of EZH2 target expression by RT-qPCR. Chromatin immunoprecipitation (ChIP) of EZH2 and H3K27me3 followed by RT-qPCR with primers for EZH2 targets also assesses the specificity of EZH2 recruitment by the lncRNA. This is the first report of the interaction of EZH2 with the lncRNAs HOTTIP and PVT1 in chemoresistant PDAC. HOTTIP and PVT1 have been described as promoting chemoresistance in several cancers, but the role of EZH2 has not been clarified. For the first time, the lncRNA LINC01133 was detected in a chemoresistant cancer. The interactions of EZH2 with LINC02577, LINC00920, LINC00941, and LINC01559 have never been reported in any context. The novel lncRNA-EZH2 interactions regulate chemoresistance-associated genes in PDAC and might be relevant to other cancers.
Therapies targeting EZH2 alone were not successful, and a combinatorial approach also targeting the lncRNAs interacting with it might be key to overcoming chemoresistance in several cancers.
Keywords: epigenetics, chemoresistance, long non-coding RNAs, pancreatic cancer, histone modification
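A minimal Python sketch of the standard percent-input calculation used to express RIP-qPCR enrichment relative to INPUT; the Ct values and the 10% input fraction are illustrative assumptions, not the study's data.

```python
# Sketch: RIP-qPCR enrichment relative to INPUT via the percent-input method.
# The input Ct is first adjusted for the fraction of lysate saved as INPUT.
import math

def percent_input(ct_input: float, ct_ip: float, input_fraction: float = 0.10) -> float:
    ct_input_adj = ct_input - math.log2(1 / input_fraction)  # dilution correction
    return 100 * 2 ** (ct_input_adj - ct_ip)

print(f"{percent_input(ct_input=24.0, ct_ip=27.5):.2f}% of input")   # EZH2 RIP (assumed Cts)
print(f"{percent_input(ct_input=24.0, ct_ip=33.0):.3f}% of input")   # IgG control (assumed Cts)
```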
Procedia PDF Downloads 96
68 Predictive Analytics for Theory Building
Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim
Abstract:
Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods in statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows in a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations were plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors associated, for example, with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. Results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the hypotheses generated are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods utilizing a subset of observations, such as bootstrap resampling with an appropriate sample size.
Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building
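A minimal Python sketch of pairwise co-occurrence counting and the observed-versus-expected "surprise" ranking described above; the toy transactions stand in for the metabolic syndrome records, and the item names are invented.

```python
# Sketch: pairwise co-occurrence counts and a "surprise" score
# (observed vs. expected frequency under independence) for ranking pairs.
from itertools import combinations
from collections import Counter

transactions = [
    {"smoker", "high_bmi", "met_syndrome"},
    {"smoker", "high_bmi"},
    {"vaccinated", "met_syndrome"},
    {"smoker", "met_syndrome"},
]

n = len(transactions)
item_freq = Counter(i for t in transactions for i in t)
pair_freq = Counter(p for t in transactions for p in combinations(sorted(t), 2))

for (a, b), obs in pair_freq.items():
    expected = item_freq[a] * item_freq[b] / n   # expected count if independent
    print(f"{a}+{b}: observed={obs}, surprise={obs / expected:.2f}")
```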
Procedia PDF Downloads 276
67 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason behind using transformers is their innate ability (attention mechanism) to take into account the semantic context present in long DNA chains that are heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradient and low-efficiency problems when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, the traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers
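For reference, the scaled dot-product attention at the core of the bidirectional encoder mentioned above is

```latex
% Scaled dot-product attention (Vaswani et al., 2017): the all-pairs weighting
% that lets the encoder relate genes/tokens thousands of positions apart.
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
```

where Q, K, and V are the query, key, and value projections of the token embeddings and d_k is the key dimension; it is this mechanism, rather than sequential state compression, that avoids the RNN bottleneck the abstract describes.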
Procedia PDF Downloads 61
66 Intercultural Initiatives and Canadian Bilingualism
Authors: Muna Shafiq
Abstract:
Growth in international immigration is a reflection of increased migration patterns in Canada and in other parts of the world. Canada continues to promote itself as a bilingual country, yet the number of French-English bilinguals does not reflect this platform. Each province's integration policies focus only on second language learning of either English or French. Moreover, since English Canadians outnumber French Canadians, maintaining, much less increasing, English-French bilingualism appears unrealistic. One solution to increasing Canadian bilingualism requires creating intercultural communication initiatives between youth in Quebec and the rest of Canada. Specifically, the focus is on active, experiential learning, where intercultural competencies develop outside traditional classroom settings. The target groups are Generation Y Millennials and Generation Z Linksters, the next generations in the career and parenthood lines. Today, Canada's education system, like many others, must continually renegotiate lines between the programs it offers its immigrant and native communities. While some purists or right-wing nationalists would disagree, the survival of bilingualism in Canada has little to do with reducing immigration. Children and youth immigrants play a valuable role in increasing Canada's French- and English-speaking communities. For instance, a focus on more immersion, over core French education programs, for immigrant children and youth would not only increase bilingual rates; it would develop meaningful intercultural attachments between Canadians. Moreover, a vigilant increase in funding for French immersion programs is critical, as are new initiatives that focus on experiential language learning for students in French and English language programs. A favorable argument supports the premise that, other than French-speaking students in Québec and elsewhere in Canada, second- and third-generation immigrant students are excellent ambassadors to promote bilingualism in Canada. Most already speak another language at home and understand the value of speaking more than one language in their adopted communities. Their dialogue and participation in experiential language exchange workshops are necessary. If the proposed exchanges take place inter-provincially, the momentum to increase collective regional voices increases. This regional collectivity can unite Canadians differently than nation-targeted initiatives. The results from an experiential youth exchange organized in 2017 between students at the crossroads of Generation Y and Generation Z in Vancouver and Quebec City, respectively, offer a promising starting point for assessing the strength of bringing together different regional voices to promote bilingualism. Code-switching between the standard, international French that Vancouver students learn in the classroom and the more regional forms of Quebec French spoken locally created regional connectivity between students. The exchange was equally rewarding for both groups. Increasing their appreciation for each other's regional differences allowed them to contribute actively to their social and emotional development. Within a sociolinguistic frame, this proposed model of experiential learning does not focus on hands-on work experience. However, the benefits of such exchanges are as valuable as the work experience initiatives developed in experiential education.
Students who actively code-switch between French and English in real, not simulated, contexts appreciate bilingualism more meaningfully and experience its value in concrete terms.Keywords: experiential learning, intercultural communication, social and emotional learning, sociolinguistic code-switching
Procedia PDF Downloads 13865 Emotional State and Cognitive Workload during a Flight Simulation: Heart Rate Study
Authors: Damien Mouratille, Antonio R. Hidalgo-Muñoz, Nadine Matton, Yves Rouillard, Mickael Causse, Radouane El Yagoubi
Abstract:
Background: The monitoring of physiological activity related to mental workload (MW) in pilots will be useful for improving aviation safety by anticipating human performance degradation. The electrocardiogram (ECG) can reveal MW fluctuations due to cognitive workload and/or emotional state, since this measure exhibits autonomic nervous system modulations. Arguably, heart rate (HR) is one of its most intuitive and reliable parameters. It would be particularly interesting to analyze the interaction between cognitive requirements and emotion in ecological settings such as a flight simulator. This study aims to explore, by means of HR, the relation between cognitive demands and emotional activation. Presumably, the effects of cognition and emotion overloads are not necessarily cumulative. Methodology: Eight healthy volunteers in possession of the Private Pilot License were recruited (male; 20.8±3.2 years). The ECG signal was recorded throughout the whole experiment by placing two electrodes on the clavicle and left pectoral of the participants. The HR was computed within 4-minute segments. NASA-TLX and Big Five inventories were used to assess subjective workload and to consider the influence of individual personality differences. The experiment consisted of completing two dual-tasks of approximately 30 minutes' duration in an AL50 flight simulator. Each dual-task required the simultaneous accomplishment of both a pre-established flight plan and an additional task based on target stimulus discrimination inserted between Air Traffic Control instructions. This secondary task allowed us to vary the cognitive workload from low (LC) to high (HC) levels by combining auditory and visual numerical stimuli that had to be answered according to specific criteria. Regarding the emotional condition, the two dual-tasks were designed to ensure analogous difficulty in terms of solicited cognitive demands. The former was realized by the pilot alone, i.e., the low arousal (LA) condition. In contrast, the latter generated high arousal (HA), since the pilot was supervised by two evaluators, filmed, and involved in a mock competition with the rest of the participants. Results: Performance on the secondary task showed significantly faster reaction times (RT) for the HA compared to the LA condition (p=.003). Moreover, faster RT was found for LC compared to HC (p < .001). No interaction was found. Concerning the HR measure, despite the lack of main effects, an interaction between emotion and cognition was evidenced (p=.028). Post hoc analysis showed smaller HR for the HA compared to the LA condition only for LC (p=.049). Conclusion: The control of an aircraft is a very complex task involving strong cognitive demands, and it depends on the emotional state of pilots. According to the behavioral data, the experimental setup satisfactorily generated different emotional and cognitive levels. As suggested by the interaction found in the HR measure, these two factors do not seem to have a cumulative impact on the sympathetic nervous system. Apparently, low cognitive workload makes pilots more sensitive to emotional variations. These results hint at the independence between data processing and emotional regulation. Further physiological data are necessary to confirm and disentangle this relation. This procedure may be useful for objectively monitoring pilots' mental workload.Keywords: cognitive demands, emotion, flight simulator, heart rate, mental workload
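Since the abstract computes HR within 4-minute segments, here is a minimal sketch of that step, assuming R-peak timestamps (in seconds) have already been extracted from the ECG; the beat train below is synthetic, not study data.

    import numpy as np

    def mean_hr_per_segment(r_peaks_s, segment_s=240.0):
        """Average heart rate (bpm) in fixed-length segments from R-peak times (s)."""
        r_peaks_s = np.asarray(r_peaks_s)
        rr = np.diff(r_peaks_s)            # RR intervals in seconds
        inst_hr = 60.0 / rr                # instantaneous HR in beats per minute
        # Assign each RR interval to the segment containing its ending beat
        seg_idx = (r_peaks_s[1:] // segment_s).astype(int)
        n_seg = seg_idx.max() + 1
        return np.array([inst_hr[seg_idx == k].mean() for k in range(n_seg)])

    # Illustrative use: a regular 0.8 s beat train (75 bpm) over 8 minutes
    peaks = np.arange(0, 480, 0.8)
    print(mean_hr_per_segment(peaks))      # ~[75.0, 75.0]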
Procedia PDF Downloads 27564 Effectiveness of Essential Oils as Inhibitors of Quorum Sensing Activity Using Biomonitor Strain Chromobacterium Violaceum
Authors: Ivana Cabarkapa, Zorica Tomicic, Olivera Duragic
Abstract:
Antimicrobial resistance represents one of the major challenges facing humanity in recent decades. The increase in antibiotic-resistant pathogens indicates the need for the development of alternative antibacterial drugs and new treatment strategies. One of the innovative emerging treatments for overcoming multidrug-resistant pathogens is the inhibition of the quorum sensing system. For most food-borne pathogens, the expression of virulence depends on their capability to communicate with other members of the population by means of quorum sensing (QS). QS represents a specific way of bacterial intercellular communication, enabled by the bacteria's ability to detect and respond to cell population density through gene regulation. QS mechanisms control the pathogenesis, virulence, luminescence, motility, sporulation, and biofilm formation of many organisms by regulating gene expression. Therefore, research in this field is an attractive target for the development of new natural antibacterial agents. Anti-QS compounds are known to have the ability to prohibit bacterial pathogenicity. Considering the importance of quorum sensing during bacterial pathogenesis, this research focused on evaluating the anti-QS properties of four essential oils (EOs), Origanum heracleoticum, Origanum vulgare, Thymus vulgaris, and Thymus serpyllum, using the biomonitor strain Chromobacterium violaceum CV026. Tests were conducted on Luria-Bertani agar supplemented with N-hexanoyl-DL-homoserine lactone (HHL), 10 µl/50 ml of agar. The anti-QS potential of the EOs was assayed over a range of concentrations of 200–0.39 µl/ml using the disc diffusion method. EOs of T. vulgaris and T. serpyllum exhibited anti-QS activity, indicated by a non-pigmented ring, in a dilution-dependent manner. The lowest dilution of the EOs of T. vulgaris and T. serpyllum at which they exhibited visually detectable inhibition of violacein synthesis was 6.25 µl/ml for both tested EOs. EOs of O. heracleoticum and O. vulgare displayed different active principles, i.e., antimicrobial activity indicated by an inner clear ring and anti-QS activity indicated by an outer non-pigmented ring, in a concentration-dependent manner. The lowest dilutions of the EOs of O. heracleoticum and O. vulgare at which they exhibited visually detectable inhibition of violacein synthesis were 1.56 and 3.25 µl/ml, respectively. Considering that the main constituents of the tested EOs are monoterpenes (carvacrol, thymol, γ-terpinene, and p-cymene), the anti-QS properties of the tested EOs can be mainly attributed to their activity. In particular, according to the scientific literature, carvacrol and thymol show a sub-inhibitory effect against foodborne pathogens. Previous studies indicated that sub-lethal concentrations of carvacrol reduced the mobility of bacteria due to their ability to interfere with the QS mechanism between the bacterial cells, thereby reducing the ability to form biofilm. The precise mechanism by which carvacrol inhibits biofilm formation is still not fully understood. Our results indicated that the EOs displayed different active principles, i.e., antimicrobial activity indicated by an inner clear ring and anti-QS activity indicated by an outer non-pigmented ring with visually detectable inhibition of violacein.
Preliminary results suggest that EOs represent a promising alternative for effective control of the emergence and spread of resistant pathogens.Keywords: anti-quorum sensing activity, Chromobacterium violaceum, essential oils, violacein
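The 200–0.39 µl/ml range reported above corresponds to a two-fold serial dilution series; a small sketch, under that assumption, generating the tested concentrations and picking out the lowest one still showing inhibition (the inhibition threshold below is hypothetical, not the study's readings):

    # Two-fold serial dilution from 200 µl/ml down to ~0.39 µl/ml (10 steps)
    concentrations = [200 / 2**i for i in range(10)]    # 200, 100, ..., 0.390625

    # Hypothetical visual readings of violacein inhibition per concentration
    inhibited = {c: c >= 6.25 for c in concentrations}  # e.g., a T. vulgaris-like pattern

    lowest_active = min(c for c, ok in inhibited.items() if ok)
    print(f"Lowest dilution with detectable inhibition: {lowest_active:g} µl/ml")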
Procedia PDF Downloads 13863 Digitization and Morphometric Characterization of Botanical Collection of Indian Arid Zones as Informatics Initiatives Addressing Conservation Issues in Climate Change Scenario
Authors: Dipankar Saha, J. P. Singh, C. B. Pandey
Abstract:
The Indian Thar desert, the seventh largest in the world, is the country's main hot sand desert; it occupies nearly 385,000 km², about 9% of the area of the country, and harbours a flora of 682 species (63 introduced species) belonging to 352 genera and 87 families. The degree of endemism of plant species in the Thar desert is 6.4 percent, relatively higher than the degree of endemism in the Sahara desert, which is very significant for conservationists to envisage. The advent and development of computer technology for digitization and database management, coupled with the rapidly increasing importance of biodiversity conservation, resulted in the emergence of biodiversity informatics as a discipline of basic sciences with multiple applications. Aichi Target 19, as an outcome of the Convention on Biological Diversity (CBD), specifically mandates the development of an advanced and shared biodiversity knowledge base. Information on species distributions in space is the crux of effective management of biodiversity in a rapidly changing world. The efficiency of biodiversity management is being increased rapidly by various stakeholders, such as researchers, policymakers, and funding agencies, with the knowledge and application of biodiversity informatics. Herbarium specimens being a vital repository for biodiversity conservation, especially in a climate change scenario, the digitization process usually aims to improve access and to preserve delicate specimens, in doing so creating large sets of images as a part of the existing repository, an arid plant information facility for long-term future usage. Leaf characters are important for describing taxa and distinguishing between them, and they can be measured from herbarium specimens as well. As a part of this activity, laminar characterization (leaves being the most important characters in assessing climate change impact) initially resulted in the classification of more than a thousand collections belonging to ten families, such as Acanthaceae, Aizoaceae, Amaranthaceae, Asclepiadaceae, Anacardiaceae, Apocynaceae, Asteraceae, Aristolochiaceae, Burseraceae, and Bignoniaceae. Taxonomic diversity indices have also been worked out, this being one of the important domains of biodiversity informatics approaches. The digitization process also encompasses workflows that incorporate automated systems to enable us to expand and speed up digitization. The digitization workflows are built on a modular system that has the potential to be scaled up; they are being developed with a geo-referencing tool and additional quality-control elements, finally placing specimen images and data into a fully searchable, web-accessible database. Our effort in this paper is to elucidate the role of biodiversity informatics and the present effort of database development for the existing botanical collection of the institute repository. This effort is expected to be considered a part of various global initiatives aimed at an effective biodiversity information facility. This will enable access to plant biodiversity data that are fit for use by scientists and decision makers working on biodiversity conservation and sustainable development in the region and in iso-climatic situations of the world.Keywords: biodiversity informatics, climate change, digitization, herbarium, laminar characters, web accessible interface
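Taxonomic diversity indices like those worked out above can be computed directly from specimen counts per taxon; a minimal sketch of the Shannon and Simpson indices, with hypothetical counts per family:

    import math

    def shannon(counts):
        """Shannon diversity index H' = -sum(p_i * ln p_i)."""
        n = sum(counts)
        return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

    def simpson(counts):
        """Simpson diversity index 1 - sum(p_i^2)."""
        n = sum(counts)
        return 1 - sum((c / n) ** 2 for c in counts)

    # Hypothetical specimen counts per family (e.g., Acanthaceae, Aizoaceae, ...)
    counts = [120, 95, 80, 60, 45, 30, 25, 20, 15, 10]
    print(f"Shannon H' = {shannon(counts):.3f}, Simpson 1-D = {simpson(counts):.3f}")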
Procedia PDF Downloads 22962 Official Game Account Analysis: Factors Influence Users' Judgments in Limited-Word Posts
Authors: Shanhua Hu
Abstract:
Social media, as a critical propagandizing form for films, video games, and digital products, has received substantial research attention, but several critical barriers exist, such as: (1) few studies exploring the internal and external connections of a product as part of the multimodal context that gives rise to readability and commercial return; (2) the lack of multimodal analysis of game publishers' official accounts and their impact on users' behaviors, including purchase intention, social media engagement, and playing time; (3) no standardized, ecologically valid, game-type-varying data that can be used to study the complexity of official accounts' postings within a time period. This proposed research helps to tackle these limitations in order to develop a model of readability study that is more ecologically valid, robust, and thorough. To accomplish this objective, this paper provides a diverse dataset comprising different visual elements and messages collected from the official Twitter accounts of the top 20 best-selling games of 2021. Video game companies target potential users through social media; a popular approach is to set up an official account to maintain exposure. Typically, major game publishers create an official account on Twitter months before the game's release date to post updates on the game's development, announce collaborations, and reveal spoilers. Analyses of tweets from those official Twitter accounts would assist publishers and marketers in identifying how to efficiently and precisely deploy advertising to increase game sales. The purpose of this research is to determine how official game accounts use Twitter to attract new customers, specifically which types of messages are most effective at increasing sales. The dataset includes the number of days between each Twitter post and the actual release date, the readability of the post (Flesch Reading Ease Score, FRES), the number of emojis used, the number of hashtags, the number of followers of the mentioned users, the categorization of the posts (i.e., spoilers, collaborations, promotions), and the number of video views. The timeline of Twitter postings from official accounts will be compared to the history of pre-orders and sales figures to determine the potential impact of social media posts. This study aims to determine how the above-mentioned characteristics of official accounts' Twitter postings influence the sales of the game and to examine the possible causes of this influence. The outcome will provide researchers with a list of potential aspects that could influence people's judgments in limited-word posts. With increased average online time, users adapt more quickly than before in online information exchange and reading, reflected in features such as word choice, sentence length, and the use of emojis or hashtags. The study of the promotion of official game accounts will not only enable publishers to create more effective promotion techniques in the future but also provide ideas for future research on the influence of social media posts with a limited number of words on consumers' purchasing decisions. Future research can focus on more specific linguistic aspects, such as precise word choice in advertising.Keywords: engagement, official account, promotion, twitter, video game
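The Flesch Reading Ease Score used in the dataset above is a fixed formula: 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). A minimal sketch follows, with a deliberately naive vowel-group syllable counter; production studies would normally use a dictionary-based counter.

    import re

    def count_syllables(word):
        # Crude approximation: count groups of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) \
                       - 84.6 * (syllables / len(words))

    tweet = "Pre-order now. The new trailer drops Friday!"
    print(round(flesch_reading_ease(tweet), 1))  # higher score = easier to read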
Procedia PDF Downloads 7661 Differential Expression Profile Analysis of DNA Repair Genes in Mycobacterium Leprae by qPCR
Authors: Mukul Sharma, Madhusmita Das, Sundeep Chaitanya Vedithi
Abstract:
Leprosy is a chronic human disease caused by Mycobacterium leprae, which cannot be cultured in vitro. Though the disease is treatable with multidrug therapy (MDT), the bacterium has recently been reported to be resistant to multiple antibiotics. Targeting DNA replication and repair pathways can serve as the foundation for developing new anti-leprosy drugs. Due to the absence of an axenic culture medium for the propagation of M. leprae, studying cellular processes, especially those belonging to DNA repair pathways, is challenging. The genome of M. leprae harbors several protein-coding genes with no previously assigned function, known as 'hypothetical proteins'. Here, we report the identification and expression of known and hypothetical DNA repair genes from a human skin biopsy and mouse footpads that are involved in base excision repair (BER), direct reversal repair (DR), and the SOS response. Initially, a bioinformatics approach was employed, based on sequence similarity and identification of known protein domains, to screen the hypothetical proteins in the genome of M. leprae that are potentially related to DNA repair mechanisms. Before testing on clinical samples, pure stocks of bacterial reference DNA of M. leprae (NHDP63 strain) were used to construct standard curves to validate the qPCR experiments and identify their lower detection limit. Primers were designed to amplify the respective transcripts, and PCR products of the predicted size were obtained. Later, excisional skin biopsies from newly diagnosed untreated, treated, and drug-resistant leprosy cases from SIHR & LC hospital, Vellore, India, were taken for the extraction of RNA. To determine the presence of the predicted transcripts, cDNA was generated from M. leprae mRNA isolated from clinically confirmed leprosy skin biopsy specimens across all the study groups. Melting curve analysis was performed to determine the integrity of the amplification and to rule out primer-dimer formation. The Ct values obtained from qPCR were fitted to the standard curve to determine transcript copy number. The same procedure was applied to M. leprae extracted after processing footpads of nude mice carrying drug-sensitive and drug-resistant strains. 16S rRNA was used as a positive control. For the 16 genes involved in BER, DR, and SOS, a differential expression pattern was observed in terms of Ct values in the mouse samples when compared to the human samples; this was because of the different host and its immune response. However, no drastic variation in gene expression levels was observed in human samples except for the nth gene. The higher expression of the nth gene could be due to mutations that may be associated with sequence diversity and drug resistance, which suggests an important role in the repair mechanism that remains to be explored. In both human and mouse samples, the SOS system genes lexA and recA, and the BER genes alkB and ogt, were expressed efficiently to deal with possible DNA damage. Together, the results of the present study suggest that DNA repair genes are constitutively expressed and may provide a reference for molecular diagnosis, therapeutic target selection, determination of treatment, and prognostic judgment in M. leprae pathogenesis.Keywords: DNA repair, human biopsy, hypothetical proteins, mouse footpads, Mycobacterium leprae, qPCR
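Fitting Ct values to a standard curve, as described above, amounts to a linear regression of Ct against log10 copy number of the reference DNA dilutions, after which unknowns are back-calculated from their Ct; a minimal sketch with hypothetical values (not the study's data):

    import numpy as np

    # Hypothetical standard dilution series of reference DNA (copies) and measured Ct
    std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
    std_ct     = np.array([15.1, 18.5, 21.9, 25.3, 28.7])

    slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
    efficiency = 10 ** (-1 / slope) - 1        # ~100% means perfect doubling per cycle

    def copies_from_ct(ct):
        return 10 ** ((ct - intercept) / slope)

    print(f"slope={slope:.2f}, PCR efficiency={efficiency:.1%}")
    print(f"Sample at Ct 23.0 ~ {copies_from_ct(23.0):,.0f} copies")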
Procedia PDF Downloads 10360 A Randomized, Controlled Trial To Test Behavior Change Techniques (BCTS) To Improve Low Intensity Physical Activity In Older Adults
Authors: Ciaran Friel, Jerry Suls, Patrick Robles, Frank Vicari, Joan Duer-Hefele, Karina W. Davidson
Abstract:
Physical activity guidelines focus on increasing moderate-intensity activity for older adults, but adherence to recommendations remains low. This is despite the fact that scientific evidence supports that any increase in physical activity is positively correlated with health benefits. Behavior change techniques (BCTs) have demonstrated effectiveness in reducing sedentary behavior and promoting physical activity. This pilot study uses a Personalized Trials (N-of-1) design to evaluate the efficacy of using four BCTs to promote an increase in low-intensity physical activity (2,000 steps of walking per day) in adults aged 45-75 years. The four BCTs tested were goal setting, action planning, feedback, and self-monitoring. BCTs were tested in random order and delivered by text message prompts requiring a participant response. The study recruited health system employees in the target age range, without mobility restrictions, who demonstrated interest in increasing their daily activity by a minimum of 2,000 steps per day for a minimum of five days per week. Participants were sent a Fitbit Charge 4 fitness tracker with an established study account and password. Participants were recommended to wear the Fitbit device 24/7 but were required to wear it for a minimum of ten hours per day. Baseline physical activity was measured by the Fitbit for two weeks. Participants then engaged with a clinical research coordinator to review comprehension of the text message content and the required actions for each of the BCTs to be tested. Participants then selected a consistent daily time at which they would receive their text message prompt. In the eight-week intervention phase of the study, participants received each of the four BCTs, in random order, for a two-week period. Text message prompts were delivered daily at the time selected by the participant. All prompts required an interactive response from participants and might have included recording their detailed plan for walking or their daily step goal (action planning, goal setting). Additionally, participants might have been directed to a study dashboard to view their step counts or compare themselves with peers (self-monitoring, feedback). At the end of each two-week testing interval, participants were asked to complete the Self-Efficacy for Walking Scale (SEW_Dur), a validated measure that assesses the participant's confidence in walking incremental distances, and a survey measuring their satisfaction with the individual BCT they had tested. At the end of their trial, participants received a personalized summary of their step data in response to each individual BCT. Analysis will examine the novel individual-level heterogeneity of treatment effect made possible by the N-of-1 design, and pool results across participants to efficiently estimate the overall efficacy of the selected behavior change techniques in increasing low-intensity walking by 2,000 steps, five days per week. Self-efficacy will be explored as the likely mechanism of action prompting behavior change. This study will inform providers and demonstrate the feasibility of the N-of-1 study design to effectively promote physical activity as a component of healthy aging.Keywords: aging, exercise, habit, walking
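In a Personalized (N-of-1) Trial of this kind, each participant receives the conditions in an individually randomized order and serves as their own control; a minimal sketch, on entirely synthetic step counts, of randomizing the BCT order and estimating each technique's effect against baseline:

    import random
    import statistics

    BCTS = ["goal setting", "action planning", "feedback", "self-monitoring"]

    def randomize_order(participant_id):
        rng = random.Random(participant_id)  # reproducible order per participant
        order = BCTS[:]
        rng.shuffle(order)
        return order

    # Synthetic daily step counts: 14 baseline days, then 14 days per BCT period
    random.seed(7)
    baseline = [4200, 3900, 4500, 4100, 4800, 4300, 4000] * 2
    periods = {bct: [random.gauss(6000, 800) for _ in range(14)] for bct in BCTS}

    base_mean = statistics.mean(baseline)
    for bct in randomize_order(participant_id=1):
        effect = statistics.mean(periods[bct]) - base_mean
        print(f"{bct}: {effect:+.0f} steps/day vs baseline")

Pooling these per-participant effects across the sample then gives the overall efficacy estimate the abstract describes.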
Procedia PDF Downloads 12959 An Analysis of Economical Drivers and Technical Challenges for Large-Scale Biohydrogen Deployment
Authors: Rouzbeh Jafari, Joe Nava
Abstract:
This study includes learnings from engineering practice normally performed on large-scale biohydrogen processes. If scale-up is done properly, biohydrogen can be a reliable pathway for biowaste valorization. Most studies on biohydrogen process development have used model feedstock to investigate process key performance indicators (KPIs). This study does not intend to compare different technologies with model feedstock. However, it reports economic drivers and technical challenges, which help in developing a road map for expanding biohydrogen economy deployment in Canada. BBA is a consulting firm responsible for the design of hydrogen production projects. Through executing these projects, work has been performed to identify, register, and mitigate technical drawbacks of large-scale hydrogen production. In this study, those learnings have been applied to the biohydrogen process. Based on data collected through a comprehensive literature review, a base case was considered as a reference, and several case studies were performed. Critical parameters of the process were identified, and through common engineering practice (process design, simulation, cost estimation, and life cycle assessment), the impact of these parameters on the commercialization risk matrix and class 5 cost estimates was reported. The process considered in this study is dark fermentation of food waste and woody biomass. To propose a reliable road map for developing a sustainable biohydrogen production process, the impact of critical parameters on the end-to-end process was studied. These parameters were 1) feedstock composition, 2) feedstock pre-treatment, 3) unit operation selection, and 4) the multi-product concept. A couple of emerging technologies were also assessed, such as photo-fermentation, integrated dark fermentation, and the use of ultrasound and microwaves to break down the feedstock's complex matrix and increase overall hydrogen yield. To properly report the impact of each parameter, KPIs were identified as 1) hydrogen yield, 2) energy consumption, 3) secondary waste generated, 4) CO2 footprint, 5) product profile, 6) $/kg-H2, and 7) environmental impact. The feedstock is the main parameter defining the economic viability of biohydrogen production. Through parametric studies, it was found that biohydrogen production favors feedstock with higher carbohydrate content. The feedstock composition was varied by increasing one critical element (such as carbohydrate) and monitoring the evolution of the KPIs. Different cases were studied with diverse feedstocks, such as energy crops, wastewater sludge, and lignocellulosic waste. The base case process was applied to obtain reference KPI values, and modifications such as pre-treatment and feedstock mix-and-match were implemented to investigate KPI changes. The complexity of the feedstock is the main bottleneck in the successful commercial deployment of the biohydrogen process as a reliable pathway for waste valorization. Hydrogen yield, reaction kinetics, and the performance of key unit operations are highly impacted as feedstock composition fluctuates during the lifetime of the process or from one case to another. In this case, the multi-product concept becomes more reliable. In this concept, the process is not designed to produce only one target product, such as biohydrogen, but will have two or more products (biohydrogen and biomethane or biochemicals).
This new approach is being investigated by the BBA team and the results will be shared in another scientific contribution.Keywords: biohydrogen, process scale-up, economic evaluation, commercialization uncertainties, hydrogen economy
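Of the KPIs listed above, $/kg-H2 can be approximated at a class 5 level by spreading annualized capital cost plus operating cost over annual hydrogen production; a minimal sketch with entirely hypothetical inputs:

    def levelized_h2_cost(capex, opex_per_year, h2_kg_per_year,
                          discount_rate=0.08, lifetime_years=20):
        """Levelized cost of hydrogen, $/kg (simplified: no degradation or taxes)."""
        # Capital recovery factor annualizes the upfront investment
        crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / \
              ((1 + discount_rate) ** lifetime_years - 1)
        return (capex * crf + opex_per_year) / h2_kg_per_year

    # Hypothetical dark-fermentation plant: $25M capex, $2M/yr opex, 500 t H2/yr
    print(f"${levelized_h2_cost(25e6, 2e6, 500_000):.2f}/kg H2")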
Procedia PDF Downloads 11058 Targeting Violent Extremist Narratives: Applying Network Targeting Techniques to the Communication Functions of Terrorist Groups
Authors: John Hardy
Abstract:
Over the last decade, the increasing utility of extremist narratives to the operational effectiveness of terrorist organizations has been evidenced by the proliferation of inspired or affiliated attacks across the world. Famous examples such as regional al-Qaeda affiliates and the self-styled "Islamic State" demonstrate the effectiveness of leveraging communication technologies to disseminate propaganda, recruit members, and orchestrate attacks. Terrorist organizations with the capacity to harness the communicative power offered by digital communication technologies and effective political narratives have held an advantage over their targets in recent years. Terrorists have leveraged the perceived legitimacy of grass-roots actors to appeal to a global audience of potential supporters and enemies alike, and have wielded a proficiency in profile-raising which remains unmatched by counter terrorism narratives around the world. In contrast, many attempts at propagating official counter-narratives have been received by target audiences as illegitimate, top-down, and impersonally bureaucratic. However, the benefits provided by widespread communication and extremist narratives have come at an operational cost. Terrorist organizations now face a significant challenge in protecting their access to communications technologies and their authority over the content they create and endorse. The dissemination of effective narratives has emerged as a core function of terrorist organizations with international reach via inspired or affiliated attacks. As such, it has become a critical function which can be targeted by intelligence and security forces. This study applies network targeting principles, which have been used by coalition forces against a range of non-state actors in the Middle East and South Asia, to the communicative function of terrorist organizations. This illustrates both a conceptual link between functional targeting and operational disruption in the abstract and a tangible impact on the operational effectiveness of terrorists through degraded communicative ability and legitimacy. Two case studies highlight the utility of applying functional targeting against terrorist organizations. The first case is the targeted killing of Anwar al-Awlaki, an al-Qaeda propagandist who crafted a permissive narrative and effective propaganda videos to attract recruits who committed inspired terrorist attacks in the US and overseas. The second is a series of operations against Islamic State propagandists in Syria, including the capture or death of a cadre of high-profile Islamic State members, including Junaid Hussain, Abu Mohammad al-Adnani, Neil Prakash, and Rachid Kassim. This group of Islamic State propagandists was linked to a significant rise in affiliated and enabled terrorist attacks and was subsequently targeted by law enforcement and military agencies. In both cases, the disruption of communication between the terrorist organization and recruits degraded both communicative and operational functions. The effect of functional targeting on member recruitment and operational tempo suggests that narratives are a critical function which can be leveraged against terrorist organizations. Further application of network targeting methods to terrorist narratives may enhance the efficacy of a range of counter terrorism techniques employed by security and intelligence agencies.Keywords: countering violent extremism, counter terrorism, intelligence, terrorism, violent extremism
Procedia PDF Downloads 29157 SWOT Analysis on the Prospects of Carob Use in Human Nutrition: Crete, Greece
Authors: Georgios A. Fragkiadakis, Antonia Psaroudaki, Theodora Mouratidou, Eirini Sfakianaki
Abstract:
Research: Within the project "Actions for the optimal utilization of the potential of carob in the Region of Crete", which is financed and supervised by the Region in collaboration with Crete University and the Hellenic Mediterranean University, a SWOT (strengths, weaknesses, opportunities, threats) survey was carried out to evaluate the prospects of carob in human nutrition in Crete. Results and conclusions: 1) Strengths: There exists a local production of carob for human consumption, based on international and local-product reports. The data on products in the market (over 100 brands of carob food) indicate a sufficiency of carob materials offered in Crete. The variety of carob food products retailed in Crete indicates a strong demand-production-consumption trend. There is a stable core of businesses that invest significantly (Creta carob, Cretan mills, etc.). The great majority of the relevant food stores (bakery, confectionery, etc.) do offer carob products. The presence of carob products produced in Crete is strong on the internet (over 20 main professionally designed websites). The promotion of carob food products is based on their variety and on a few historical elements connected with the Cretan diet. 2) Weaknesses: International prices for carob seed affect the sector; the seed had an international price of €20 per kg in 2021-22 and fell to €8 in 2022, causing losses to carob traders. Local producers do not sort the carobs they deliver for processing, causing 30-40% losses of the product in the industry. Occasional high prices trigger the collection of degraded raw material; large losses may emerge due to the action of insects. There are many carob trees whose fruits are not collected, e.g., in Apokoronas, Chania. The nutritional and commercial value of wild carob fruits is very low. Carob tree production is recorded by Greek statistical services under "other cultures", in combination with prickly pear, creating difficulties in retrieving data. The percentage of carob used for human nutrition, as opposed to animal feeding, is not known. The exact imports of carob are not closely monitored. We have no data on the recycling of carob by-products in Crete. 3) Opportunities: The development of a culture of respect for the carob trade may improve professional relations in the sector. Monitoring the carob market and connecting production with retail-industry needs may allow better market stability. Raw material evaluation procedures may be implemented to maintain the carob value chain. The state agricultural services may be further involved in carob health protection. The education of farmers on carob cultivation and management can improve the quality of the product. The selection of local productive varieties may improve the sustainability of the culture. Connecting the consumption of carob with health-food products may create added value in the sector. The presence and extent of wild carob trees in Crete potentially represent a target for grafting. 4) Threats: The annual fluctuation of carob yield challenges the programming of local food industry activities. Carob is also a forest species; there is a danger of wrong classification of crops as forest areas where land ownership is not clear.Keywords: human nutrition, carob food, SWOT analysis, crete, greece
Procedia PDF Downloads 9256 Fueling Efficient Reporting And Decision-Making In Public Health With Large Data Automation In Remote Areas, Neno Malawi
Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Julia Huggins, Fabien Munyaneza
Abstract:
Background: Partners In Health – Malawi introduced an operational research project called the Primary Health Care (PHC) Surveys in 2020, which seeks to assess the progress of care delivery in the district. The study consists of 5 long surveys, namely Facility Assessment, General Patient, Provider, Sick Child, and Antenatal Care (ANC), primarily conducted in 4 health facilities in Neno district. These facilities include Neno district hospital, Dambe health centre, Chifunga, and Matope. Usually, these annual surveys are conducted from January, and the target is to present the final report by June. Once data is collected and analyzed, a series of reviews takes place before reaching the final report. Initially, the manual process took over 9 months to present the final report, and initial findings reported that only about 76.9% of the data added up when cross-checked with paper-based sources. Purpose: The aim of this approach is to move away from manually pulling the data, doing fresh analysis, and reporting, a process often associated not only with delays in reporting and inconsistencies but also with poor quality of data if not done carefully. This automation approach was meant to utilize features of new technologies to create visualizations, reports, and dashboards in Power BI that are fed directly from the data source, CommCare, and hence only require a single click of a 'refresh' button to populate updated information in visualizations, reports, and dashboards at once. Methodology: We transformed paper-based questionnaires into electronic forms using the CommCare mobile application. We further connected CommCare directly to Power BI using an Application Programming Interface (API) connection as the data pipeline. This provided the chance to create visualizations, reports, and dashboards in Power BI. In contrast to the process of manually collecting data in paper-based questionnaires, entering them into ordinary spreadsheets, and conducting analysis every time a report is prepared, the team utilized CommCare and Microsoft Power BI technologies. We utilized validations and logic in CommCare to capture data with fewer errors. We utilized Power BI features to host the reports online by publishing them through a cloud-computing process. We switched from sharing ordinary report files to sharing a link with potential recipients, giving them the freedom to dig deeper into extra findings within the Power BI dashboards and to export to any format of their choice. Results: This data automation approach reduced research timelines from the initial 9 months' duration to 5 months. It also improved the quality of the data findings from the original 76.9% to 98.9%. This brought confidence to draw conclusions from the findings that help in decision-making and gave opportunities for further research. Conclusion: These results suggest that automating the research data process has the potential to reduce the overall amount of time spent and improve the quality of the data. On this basis, the concept of data automation should be taken into serious consideration when conducting operational research, for both efficiency and decision-making.Keywords: reporting, decision-making, power BI, commcare, data automation, visualizations, dashboards
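A minimal sketch of the kind of API pull described above; the domain, API version, and credentials are placeholders, and the exact CommCare HQ endpoints and response fields should be confirmed against the current API documentation rather than taken from this sketch.

    import requests
    import pandas as pd

    # Placeholders - real domain, user, and API key come from a CommCare HQ project
    DOMAIN = "example-project"
    HQ = "https://www.commcarehq.org"
    FORMS = f"{HQ}/a/{DOMAIN}/api/v0.5/form/"           # endpoint version is assumed
    HEADERS = {"Authorization": "ApiKey user@example.org:0123456789abcdef"}

    def fetch_forms(limit=100):
        """Page through submitted forms and collect them into one table."""
        rows, url, params = [], FORMS, {"limit": limit}
        while url:
            resp = requests.get(url, headers=HEADERS, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            rows.extend(payload.get("objects", []))
            nxt = payload.get("meta", {}).get("next")   # relative path or None
            url = f"{HQ}{nxt}" if nxt else None
            params = None                               # paging info is baked into nxt
        return pd.DataFrame(rows)

    df = fetch_forms()
    print(df.shape)  # Power BI can run this same pull as a Python data source

Pointing Power BI at such a script (or at the API directly) is what makes a single 'refresh' click repopulate every visualization from the live CommCare data.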
Procedia PDF Downloads 11655 Industrial Production of the Saudi Future Dwelling: A Saudi Volumetric Solution for Single Family Homes, Leveraging Industry 4.0 with Scalable Automation, Hybrid Structural Insulated Panels Technology and Local Materials
Authors: Bandar Alkahlan
Abstract:
The King Abdulaziz City for Science and Technology (KACST) created the Saudi Future Dwelling (SFD) initiative to identify, localize, and commercialize a scalable home manufacturing technology suited to deployment across the Kingdom of Saudi Arabia (KSA). This paper outlines the journey, the creation of the international project delivery team, the product design, the selection of the process technologies, and the outcomes. A target was set to remove 85% of the construction and finishing processes from the building site, as these activities can be completed more efficiently in a factory environment. Therefore, integral to the SFD initiative is the successful industrialization of the home building process using appropriate technologies, automation, robotics, and manufacturing logistics. The technologies proposed for the SFD housing system are designed to be energy efficient, economical, and fit for purpose from a Saudi cultural perspective, and to minimize the use of concrete, relying mainly on locally available Saudi natural materials derived from the local resource industries. To this end, the building structure comprises a hybrid system of structural insulated panels (SIP) combined with a light-gauge steel framework manufactured in a large-format panel system. The paper traces the investigative process and the steps completed by the project team during the selection process. As part of the SFD project, a pathway was mapped out to include a proof-of-concept prototype housing module and the set-up and commissioning of a lab-factory complete with all production machinery and equipment necessary to simulate a full-scale production environment. The prototype housing module was used to validate and inform current and future product design as well as manufacturing process decisions. A description of the prototype design and manufacture is outlined, along with valuable learning derived from the build and how these results were used to enhance the SFD project. The industrial engineering concepts and the lab-factory detailed design and layout are described in the paper, along with the shop floor I.T. management strategy. Special attention was paid to showcasing all technologies within the lab-factory as part of the engagement strategy with private investors to leverage the SFD project with large-scale factories throughout the Kingdom. A detailed analysis is included of the process surrounding the design, specification, and procurement of the manufacturing machinery, equipment, and logistical manipulators required to produce the SFD housing modules. The manufacturing machinery comprised a combination of standardized and bespoke equipment from a wide range of international suppliers. The paper describes the selection process, pre-ordering trials and studies, and, in some cases, the requirement for additional research and development by the equipment suppliers in order to achieve the SFD objectives. A set of conclusions is drawn describing the results achieved thus far, along with a list of recommended ongoing operational tests, enhancements, research, and development aimed at achieving full-scale engagement with private sector investment and roll-out of the SFD project across the Kingdom.Keywords: automation, dwelling, manufacturing, product design
Procedia PDF Downloads 12154 Antibacterial Nanofibrous Film Encapsulated with 4-terpineol/β-cyclodextrin Inclusion Complexes: Relative Humidity-Triggered Release and Shrimp Preservation Application
Authors: Chuanxiang Cheng, Tiantian Min, Jin Yue
Abstract:
Antimicrobial active packaging enables extensive biological effects to improve food safety. However, the efficacy of antimicrobial packaging hinges on factors including the diffusion rate of the active agent toward the food surface, the initial content in the antimicrobial agent, and the targeted food shelf life. Among the possibilities of antimicrobial packaging design, an interesting approach involves the incorporation of volatile antimicrobial agents into the packaging material. In this case, the necessity for direct contact between the active packaging material and the food surface is mitigated, as the antimicrobial agent exerts its action through the packaging headspace atmosphere towards the food surface. However, it still remains difficult to achieve controlled and precise release of bioactive compounds to the specific target location with required quantity in food packaging applications. Remarkably, the development of stimuli-responsive materials for electrospinning has introduced the possibility of achieving controlled release of active agents under specific conditions, thereby yielding enduring biological effects. Relative humidity (RH) for the storage of food categories such as meat and aquatic products typically exceeds 90%. Consequently, high RH can be used as an abiotic trigger for the release of active agents to prevent microbial growth. Hence, a novel RH - responsive polyvinyl alcohol/chitosan (PVA/CS) composite nanofibrous film incorporated with 4-terpineol/β-cyclodextrin inclusion complexes (4-TA@β-CD ICs) was engineered by electrospinning that can be deposited as a functional packaging materials. The characterization results showed the thermal stability of the films was enhanced after the incorporation due to the hydrogen bonds between ICs and polymers. Remarkably, the 4 wt% 4-TA@β-CD ICs/PVA/CS film exhibited enhanced crystallinity, moderate hydrophilic (Water contact angle of 81.53°), light barrier property (Transparency of 1.96%) and water resistance (Water vapor permeability of 3.17 g mm/m2 h kPa). Moreover, this film also showed optimized mechanical performance with a Young’s modulus of 11.33 MPa, a tensile strength of 19.99 MPa and an elongation at break of 4.44 %. Notably, the antioxidant and antibacterial properties of this packaging material were significantly improved. The film demonstrated the half-inhibitory concentrations (IC50) values of 87.74% and 85.11% for scavenging 2,2-diphenyl-1-picrylhydrazyl (DPPH) and 2, 2′-azinobis (3-ethylbenzothiazoline-6-sulfonic) (ABTS) free radicals, respectively, in addition to an inhibition efficiency of 65% against Shewanella putrefaciens, the characteristic bacteria in aquatic products. Most importantly, the film achieved controlled release of 4-TA under high 98% RH by inducing the plasticization of polymers caused by water molecules, swelling of polymer chains, and destruction of hydrogen bonds within the cyclodextrin inclusion complex. Consequently, low relative humidity is suitable for the preservation of nanofibrous film, while high humidity conditions typical in fresh food packaging environments effectively stimulated the release of active compounds in the film. This film with a long-term antimicrobial effect successfully extended the shelf life of Litopenaeus vannamei shrimp to 7 days at 4 °C. This attractive design could pave the way for the development of new food packaging materials.Keywords: controlled release, electrospinning, nanofibrous film, relative humidity–responsive, shrimp preservation
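Water vapor permeability figures such as the 3.17 g mm/m² h kPa reported above come from gravimetric cup tests: steady-state weight change over time, normalized by film thickness, exposed area, and the vapor-pressure difference across the film; a minimal sketch with hypothetical measurements:

    def water_vapor_permeability(weight_gain_g, hours, thickness_mm,
                                 area_m2, delta_p_kpa):
        """WVP = (dm/dt) * L / (A * dP), in g mm / (m^2 h kPa)."""
        wvtr = weight_gain_g / hours / area_m2   # transmission rate, g/(m^2 h)
        return wvtr * thickness_mm / delta_p_kpa

    # Hypothetical cup test: 1.2 g gained over 24 h, 0.15 mm film,
    # 0.003 m^2 exposed area, 2.34 kPa vapor-pressure gradient
    print(f"{water_vapor_permeability(1.2, 24, 0.15, 0.003, 2.34):.2f} g mm/m^2 h kPa")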
Procedia PDF Downloads 7053 The Underground Ecosystem of Credit Card Frauds
Authors: Abhinav Singh
Abstract:
Point of Sale (POS) malware has been stealing the limelight this year. It has been the elemental factor in some of the biggest breaches uncovered in the past couple of years. Some of them include • Target: a retail giant reported close to 40 million credit card records stolen • Home Depot: a home products retailer reported a breach of close to 50 million credit records • Kmart: a US retailer recently announced a breach of 800 thousand credit card details. In 2014 alone, there were reports of over 15 major breaches of payment systems around the globe. Memory scraping malware infecting point of sale devices has been the lethal weapon used in these attacks. This malware is capable of reading payment information from the payment device memory before it is encrypted, and then sends the stolen details to its parent server. It records all the critical payment information, such as the card number, security number, owner, etc., and delivers it in raw format. This talk will cover what happens after these details have been sent to the malware authors. The entire ecosystem of credit card fraud can be broadly classified into three steps: • Purchase of raw details and dumps • Converting them to plastic cash/cards • Shop! Shop! Shop! The focus of this talk will be on the above-mentioned points and how they form an organized network of cyber-crime. The first step involves buying and selling the stolen details. The key points to emphasize are: • How this raw information is sold in the underground market • The buyer and seller anatomy • Building your shopping cart and preferences • The importance of reputation and vouches • Customer support and replacements/refunds. These are some of the key points that will be discussed. But the story doesn't end here. As of now, the buyer only has the raw card information. How will this raw information be converted to plastic cash? Here comes the second part of this underground economy, wherein these raw details are converted into actual cards. There are well-organized services running underground that can help in converting these details into plastic cards. We will discuss this technique in detail. Finally, the last step involves shopping with the stolen cards. The cards generated with the stolen details can easily be used to swipe-and-pay for purchased goods at different retail shops. Usually these purchases are of expensive items that have good resale value. Apart from using the cards at stores, there are underground services that let you deliver online orders to their dummy addresses. Once the package is received, it is delivered to the original buyer. These services charge based on the value of the item being delivered. The overall underground ecosystem of credit card fraud works in a bulletproof way and involves people working in close groups and making heavy profits. This is a brief summary of what I plan to present at the talk. I have done extensive research and have collected a good deal of material to present as samples. Some of them include: • A list of underground forums • Credit card dumps • IRC chats among these groups • Personal chats with big card sellers • An inside view of these forum owners. The talk will be concluded by throwing light on how these breaches are tracked during investigation.
How are credit card breaches tracked down, and what steps can financial institutions take to build an incident response around them?Keywords: POS malware, credit card frauds, enterprise security, underground ecosystem
Procedia PDF Downloads 43952 The Role of Cholesterol Oxidase of Mycobacterium tuberculosis in the Down-Regulation of TLR2-Signaling Pathway in Human Macrophages during Infection Process
Authors: Michal Kielbik, Izabela Szulc-Kielbik, Anna Brzostek, Jaroslaw Dziadek, Magdalena Klink
Abstract:
The goal of many research groups in the world is to find new components that are important for the survival of mycobacteria in host cells. Mycobacterium tuberculosis (Mtb) possesses a number of cholesterol-degrading enzymes that are considered to be an important factor for its survival and persistence in host macrophages. One of them, cholesterol oxidase (ChoD), although not essential for cholesterol degradation, is discussed as a virulence compound; however, its involvement in the macrophage response to Mtb is still not sufficiently determined. The recognition of tubercle bacilli antigens by pattern recognition receptors is crucial for the initiation of the host innate immune response. An important receptor that has been implicated in the recognition and/or uptake of Mtb is Toll-like receptor type 2 (TLR2). Engagement of TLR2 results in the activation and phosphorylation of intracellular signaling proteins, including IRAK-1 and -4 and TRAF-6, which in turn leads to the activation of target kinases and transcription factors responsible for the bactericidal and pro-inflammatory response of macrophages. The aim of these studies was a detailed clarification of the role of Mtb cholesterol oxidase as a virulence factor affecting the TLR2 signaling pathway in human macrophages. Differentiated THP-1 cells were used as the human macrophage model. The virulent wild-type Mtb strain (H37Rv), its mutant lacking a functional copy of the gene encoding cholesterol oxidase (∆choD), as well as the complemented strain (∆choD–choD), were used. We tested the impact of the Mtb strains on the expression of TLR2-dependent signaling proteins (mRNA level, cytosolic level, and phosphorylation status). The cytokine and bactericidal response of THP-1-derived macrophages infected with the Mtb strains, in relation to its dependence on the TLR2 signaling pathway, was also determined. We found that during 24 hours of infection, the wild-type and complemented Mtb significantly reduced the cytosolic level and phosphorylation status of the IRAK-4 and TRAF-6 proteins in macrophages, which was not observed in the case of the ΔchoD mutant. The decrease in TLR2-dependent signaling proteins induced by the wild-type Mtb did not depend on the activity of the proteasome. Blocking TLR2 expression before infection effectively prevented the wild-type strain-induced reduction of the cytosolic level and phosphorylation of IRAK-4. None of the strains affected the surface expression of TLR2. The mRNA levels of the IRAK-4 and TRAF-6 genes were significantly increased in macrophages 24 hours post-infection with either of the tested strains. However, the impact of the wild-type Mtb strain on both examined genes was significantly stronger than that of its ΔchoD mutant. We also found that the wild-type strain stimulated macrophages to release a high amount of immunosuppressive IL-10, accompanied by low amounts of pro-inflammatory IL-8 and bactericidal nitric oxide, in comparison to the mutant lacking cholesterol oxidase. The influence of the wild-type Mtb on this type of macrophage response strongly depended on fully active IRAK-1 and IRAK-4 signaling proteins. In conclusion, Mtb, using cholesterol oxidase, causes over-activation of TLR2 signaling proteins, leading to the reduction of their cytosolic level and activity, resulting in modulation of the macrophage response to allow its intracellular survival. Supported by grant 2014/15/B/NZ6/01565, National Science Center, Poland.Keywords: Mycobacterium tuberculosis, cholesterol oxidase, macrophages, TLR2-dependent signaling pathway
Procedia PDF Downloads 419