Search results for: malware labeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 196


16 Measuring Biobased Content of Building Materials Using Carbon-14 Testing

Authors: Haley Gershon

Abstract:

The transition from fossil fuel-based building materials to eco-friendly, biobased building materials plays a key role in sustainable building. Growing global demand for biobased materials in the building and construction industries heightens the importance of carbon-14 testing, an analytical method used to determine the percentage of a material’s ingredients that is biobased. This presentation focuses on the use of carbon-14 analysis within the building materials sector. Carbon-14, also known as radiocarbon, is a weakly radioactive isotope present in all living organisms. Fossil material older than 50,000 years contains no carbon-14. The radiocarbon method is thus used to determine the amount of carbon-14 present in a given sample. Carbon-14 testing is performed according to ASTM D6866, a standard test method developed specifically for determining the biobased content of materials in solid, liquid, or gaseous form by means of radiocarbon dating. Samples are combusted, converted into solid graphite, pressed onto a metal disc, and mounted onto the wheel of an accelerator mass spectrometer (AMS), which counts the amount of carbon-14 present. By submitting samples for carbon-14 analysis, manufacturers of building materials can confirm the biobased content of the ingredients used. Biobased testing through carbon-14 analysis reports results as percent biobased content, indicating the percentage of ingredients derived from biomass-sourced carbon versus fossil carbon. The analysis is performed according to standardized methods such as ASTM D6866, ISO 16620, and EN 16640. Products sourced entirely from plant, animal, or microbiological material are therefore 100% biobased, while products sourced only from fossil fuel material are 0% biobased.
Any result between 0% and 100% biobased indicates a mixture of biomass-derived and fossil fuel-derived sources. Furthermore, biobased testing for building materials allows manufacturers to submit eligible materials for certification and eco-label programs such as the United States Department of Agriculture (USDA) BioPreferred Program. This program includes a voluntary labeling initiative for biobased products, through which companies may apply to receive and display the USDA Certified Biobased Product label, which indicates third-party verification and displays a product’s percentage of biobased content. The USDA program includes a specific category for building materials. To qualify for biobased certification under this product category, examples of criteria that must be met include a minimum of 62% biobased content for wall coverings, 25% for lumber, and 91% for floor coverings (non-carpet). As a result, consumers can easily identify plant-based products in the marketplace.
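The percent-biobased result described above amounts to comparing a sample's measured carbon-14 level against a modern biomass reference. A minimal sketch, assuming a simple linear relation between percent modern carbon (pMC) and biobased content (the actual ASTM D6866 procedure applies atmospheric correction factors; the function and the reference value here are illustrative assumptions):

```python
def percent_biobased(sample_pmc, reference_pmc=100.0):
    """Estimate percent biobased content from percent modern carbon (pMC).

    A fully biomass-derived sample matches the modern atmospheric
    reference, while fossil carbon contains no carbon-14 and reads 0 pMC.
    The default reference_pmc of 100.0 is an assumed simplification.
    """
    pct = 100.0 * sample_pmc / reference_pmc
    return max(0.0, min(100.0, pct))  # clamp to the 0-100% range

# A wall covering reading 62 pMC against a 100 pMC reference:
print(percent_biobased(62.0))  # 62.0 -> meets the 62% USDA threshold
```

Results between the two extremes would indicate the mixed biomass/fossil sourcing described above.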

Keywords: carbon-14 testing, biobased, biobased content, radiocarbon dating, accelerator mass spectrometry, AMS, materials

Procedia PDF Downloads 133
15 Intracommunity Attitudes Toward the Gatekeeping of Asexuality in the LGBTQ+ Community on Tumblr

Authors: A.D. Fredline, Beverly Stiles

Abstract:

This qualitative investigation examines the social media site Tumblr with the goal of analyzing the controversy over the inclusion of asexuality in the LGBTQ+ community. As platforms such as Tumblr permit the development of communities for marginalized groups, social media serves as a core site of exclusionary practices and boundary negotiations over community membership. This research is important because there is a paucity of research on the topic and a significant gap in the literature with regard to intracommunity gatekeeping, even though discourse on the topic is readily apparent on social media platforms. The objective is to begin bridging this gap by examining attitudes toward the inclusion of asexuality within the LGBTQ+ community. To analyze these attitudes, eight publicly available blogs on Tumblr.com were selected from both the “inclusionist” and “exclusionist” perspectives. Blogs were found through a basic search for “inclusionist” and “exclusionist” on the Tumblr website. Of the first twenty blogs listed for each set of results, those centrally focused on asexuality discourse were selected. For each blog, the fifty most recent postings were collected. Analysis of the collected postings exposed three central themes from the exclusionist perspective and three from the inclusionist perspective. Findings indicate that, from the inclusionist perspective, asexuality belongs to the LGBTQ+ community. One primary argument from this perspective is that asexual individuals face opposition for their identity just as other identities included in the community do. This opposition is said to take a variety of forms, such as verbal shaming, assumption of illness, and corrective rape.
Another argument is that the LGBTQ+ community and asexuals face a common opponent in cisheterosexism, as asexuals struggle with assumed and expected sexualization. A final central theme is that denying asexual inclusion leads to the assumption of heteronormativity. Findings also indicate that, from the exclusionist perspective, asexuality does not belong to the LGBTQ+ community. One central theme from this perspective is the equating of cisgender heteroromantic asexuals with cisgender heterosexuals. As straight individuals are not allowed in the community, exclusionists argue that asexuals in opposite-gender partnerships should not be included. Another argument is that including asexuality in the community sexualizes all other identities by assuming sexual orientation is inherently sexual rather than romantic. Finally, exclusionists also argue that asexuality encourages childhood labeling and forces sexual identities onto children, something not promoted by the LGBTQ+ community. The conclusion drawn from analyzing both perspectives is that integration may be a possibility, but complexities add another layer of discourse. For example, both inclusionists and exclusionists agree that privileged identities do not belong to the LGBTQ+ community; the focus of discourse is whether or not asexuals are privileged. Clearly, both sides of the debate share the same vision of what binds the community together. The question that remains is who belongs to that community.

Keywords: asexuality, exclusionists, inclusionists, Tumblr

Procedia PDF Downloads 158
14 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem to support safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term to denote a number of variants commonly named in the literature, including hate, abuse, offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies have analysed the influence of conversational context in human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: The contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear, racist, etc. 
words), and thus context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contains labeling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases) or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
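The hierarchical idea can be sketched in a toy form: an utterance-level encoder feeds a running conversation state, and the target segment is scored against that state rather than in isolation. Everything here (the hashed bag-of-words encoder, the random untrained weights, the single-layer recurrence) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

def embed(utterance, dim=16):
    """Toy sentence encoder: average of hashed per-token vectors
    (a stand-in for a real pretrained encoder)."""
    vecs = [np.random.default_rng(abs(hash(tok)) % 2**32).standard_normal(dim)
            for tok in utterance.lower().split()]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def contextual_score(context, target, W, w_out):
    """Hierarchical scoring: fold the thread's previous utterances into
    a running state, then score the target utterance against it."""
    state = np.zeros(W.shape[0])
    for utt in context:                       # utterance-level hierarchy
        state = np.tanh(W @ np.concatenate([state, embed(utt)]))
    features = np.concatenate([state, embed(target)])
    return float(1.0 / (1.0 + np.exp(-w_out @ features)))  # toxicity prob.

# Randomly initialized weights (untrained, for illustration only).
rng = np.random.default_rng(0)
hidden, dim = 8, 16
W = 0.1 * rng.standard_normal((hidden, hidden + dim))
w_out = 0.1 * rng.standard_normal(hidden + dim)
p = contextual_score(["you played terribly", "at least I tried"],
                     "you always say that", W, w_out)
```

A conversation-agnostic baseline would drop the loop over `context` entirely, which is the contrast the experiments above draw.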

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 138
13 Enhancing Food Quality and Safety Management in Ethiopia's Food Processing Industry: Challenges, Causes, and Solutions

Authors: Tuji Jemal Ahmed

Abstract:

Food quality and safety challenges are prevalent in Ethiopia's food processing industry and can have adverse effects on consumers' health and wellbeing. The country is known for its diverse range of agricultural products, which are essential to its economy. However, poor food quality and safety policies and management systems in the food processing industry have led to health problems, foodborne illnesses, and economic losses. This paper aims to highlight the causes and effects of food safety and quality issues in the food processing industry of Ethiopia and to discuss potential solutions. One of the main causes of poor food quality and safety in Ethiopia's food processing industry is the lack of adequate regulations and enforcement mechanisms. The absence of comprehensive food safety and quality policies and guidelines has led to substandard practices in the food manufacturing process. Moreover, the lack of monitoring and enforcement of existing regulations has created a conducive environment for unscrupulous businesses to engage in unsafe practices that endanger the public's health. The effects of poor food quality and safety are significant, including the loss of human lives, increased healthcare costs, and the loss of consumer confidence in the food processing industry. Foodborne illnesses, such as diarrhea, typhoid fever, and cholera, are prevalent in Ethiopia, and poor food quality and safety practices contribute significantly to their prevalence. Additionally, food recalls due to contamination or mislabeling often result in significant economic losses for businesses in the food processing industry. To address these challenges, the Ethiopian government has begun to take steps to improve food quality and safety in the food processing industry.
One of the most notable initiatives is the Ethiopian Food and Drug Administration (EFDA), which was established in 2010 to regulate and monitor the quality and safety of food and drug products in the country. The EFDA has implemented several measures to enhance food safety, such as conducting routine inspections, monitoring the importation of food products, and enforcing strict labeling requirements. Another potential solution to improve food quality and safety in Ethiopia's food processing industry is the implementation of food safety management systems (FSMS). An FSMS is a set of procedures and policies designed to identify, assess, and control food safety hazards throughout the food manufacturing process. Implementing an FSMS can help businesses in the food processing industry identify and address potential hazards before they cause harm to consumers. Additionally, the implementation of an FSMS can help businesses comply with existing food safety regulations and guidelines. In conclusion, improving food quality and safety policies and management systems in Ethiopia's food processing industry is critical to protecting public health and enhancing the country's economy. Addressing the root causes of poor food quality and safety and implementing effective solutions, such as the establishment of regulatory agencies and the implementation of food safety management systems, can help to improve the overall safety and quality of the country's food supply.

Keywords: food quality, food safety, policy, management system, food processing industry

Procedia PDF Downloads 51
12 Walking in a Web of Animality: An Animality Informed Ethnography for an Inclusive Coexistence With (Other) Animals

Authors: Francesco De Giorgio

Abstract:

As different groups of wild animals move from natural to more anthropic environments, the need to overcome the human-animal gap for ethical coexistence becomes a public concern. Ethnology and ethnography play fundamental roles in understanding the dynamics, perspectives, and movement in our interactions with (other) animals. In this effort, the Animality perspective provides an essential ethical lens and quality guidance for ethnography. It deconstructs the human/animal distinction and creates an inclusive approach to society. It further transgresses the rigid lines of normalizing images in human cultures, in which individuals are easily marginalized as ‘different’. Just as with labeling an animal with species-specific behavior, judging and categorizing humans according to culture-specific expectations is easier than recognizing subjectivity. A fusion of anti-speciesist ethnology and the ethnography of natural and social sciences can redress the shortcomings of current practices of multispecies ethnography, which largely remain within an exclusively normalized human perspective. Empirically, the paper is based on current research on wild urban animals and human movement in Genoa (Italy), collecting data from systematic field observations of wild boars and ethnographic data collected over a period of 18 months, during which the humans involved are educated in a changing perspective of coexistence. An “animality-ethnography” starts from observing our animal movement: how much and when we move, how we intersect our movement with that of other animals cohabiting with us, how we can observe and know others by moving, and our ways of walking. The research will show how (interspecies) socio-cognition implies motion and movement and animal journeys between nature and the city, but also within the cities themselves, where a web of motion becomes the basic cultural matrix for cohabiting spaces, places, and systems.
Here, the term "cognition" does not refer just to the brain, mind, or intelligence. Indeed, cognition has a lot to do with movement, space, motion, proprioception, and the body: the ability to be informed not only through what you see but also through what you learn from being in tune with the motion of a shared dynamic; to be an informative presence instead of an active stimulus or a passive expectation, where the latter leaves too much space for projections and interpretations. What is proposed here is an understanding of our own animal movement linked to our own animal cognition. The result of breaking down one's own culturally prescribed way of moving in ethnographic research is breaking the barrier of limited options for observation and comprehension of the Other. Walking in the same way results in seeing others in the same way, studying them through only one channel of perception, producing a one-dimensional life instead of a multidimensional web. Returning to an understanding of our Animality, our animal movement, and being in tune in order to improve a socio-cognitive context of cohabitation, with both domestic and wild animals, in a forest or in a metropolis, represents the challenge of the coming years and the evolution of the next centuries: to both preserve and share cultures beyond the boundaries of species.

Keywords: antispeciesist ethology, interspecies coexistence, socio-cognition, intersectionality, animality

Procedia PDF Downloads 39
11 A Foucauldian Analysis of Child Play: Case Study of a Preschool in the United States

Authors: Meng Wang

Abstract:

Historically, young members of society (children) have been oppressed by adults through direct violent acts. Direct violence was evident in rampant child labor and child maltreatment cases. Since the United Nations' recognition of the rights of the child, it has been widely believed that children are protected against direct physical violence. Nevertheless, this paper argues from Foucauldian and disability studies standpoints that, much as in earlier times, children are oppressed objects in the context of child play, which adults construct as a substitute for direct violence in regulating children. In particular, this paper suggests that, on the one hand, preschool play is a new way for adults to oppress preschoolers and regulate society as a whole; on the other hand, preschoolers are taught how to play as an acquired skill and master self-regulation through play. There is a line of contemporary research that centers on child play from a social constructivist perspective. Yet current teaching practices pertaining to child play, including guided play and free play, in fact serve the interests of adults and society at large. By acknowledging and deconstructing the prevalence of 'evidence-based best practice' in the early childhood education field within Western society, the child-adult power relation could be reconstructed and alternative truths could be found in early childhood education. To support this argument, an ongoing observational case study is being conducted in a preschool setting in the United States. The age range of the children is 2.5 to 4 years. Approximately 10 children (5 boys) are participating in the case study. Observation is conducted throughout the weekdays as children follow the classroom routine with a lead and an assistant teacher. The classroom teachers are interviewed about their classroom management strategies.
Preliminary findings of this case study suggest that preschool teachers tend to utilize scenarios from preschoolers’ dramatic play to impart core cultural values to young children, values pre-determined by adults. In addition, if young children fail to follow teachers' guidance on playing in the correct way, they run the risk of being excluded from the play scenario by peers and adults. Furthermore, this study tends to indicate that, through play, preschoolers are obliged to develop an internal violence system, that is, a self-regulation skill to regulate their own behavior; and if this internal system is judged unestablished by adults' various assessments, there may be consequences of negative labeling and disabling of young children by adults. In conclusion, this paper applies a Foucauldian analysis to the context of child play. At present, within the preschool, child play is not as free as it seems. Young children are expected to perform cultural tasks through play activities designed by adults, who utilize child play as a technology of governmentality to further predict and regulate the future society at large.

Keywords: child play, developmentally appropriate practice, DAP, poststructuralism, technologies of governmentality

Procedia PDF Downloads 129
10 Agri-Food Transparency and Traceability: A Marketing Tool to Satisfy Consumer Awareness Needs

Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli

Abstract:

The link between man and food plays a central role in the social and economic system, where cultural and multidisciplinary aspects intertwine: food is not only nutrition but also communication, culture, politics, environment, science, ethics, and fashion. This multi-dimensionality has many implications for the food economy. In recent years, consumers have become more conscious of their food choices, leading to a consistent change in consumption models. This change concerns several aspects: awareness of food system issues, socially and environmentally conscious decision-making, and food choices based on characteristics other than nutritional ones, i.e., the origin of food, how it is produced, and who is producing it. In this frame, 'consumption choices' and the 'interests of the citizen' become intertwined. The figure of the 'citizen consumer' is born: an individual responsible and ethically motivated enough to change his or her lifestyle in pursuit of sustainable consumption. Simultaneously, branding, once a guarantee of product quality, is now questioned. In order to meet these needs, agri-food companies are developing specific product lines that follow two main philosophies: 'back to basics' and 'less is more'. However, the demand for ethical behavior does not yet seem to find an adequate offer on the market, most likely due to a lack of attention to the communication strategy used, which is very often based on market logic and rarely on ethical logic. The label in its classic concept of 'clean labeling' can no longer be the only instrument through which to convey product information; its evolution toward a concept of 'clear label' is necessary to embrace ethical and transparent concepts and to advance the democratization of the food system.
The implementation of a voluntary traceability path, relying on the technological models of the Internet of Things and Industry 4.0, would enable the agri-food supply chain to collect data that, if properly treated, could satisfy consumers' information needs. A change of approach is therefore proposed: agri-food traceability should no longer be intended as a tool used to respond to the legislator, but rather as a promotional tool useful for telling the company's story in a transparent manner and thereby reaching the market segment of food citizens. The use of mobile technology can further facilitate this information transfer. However, to guarantee maximum efficiency, an appropriate communication model based on ethical communication principles should be used, one that aims to overcome the pipeline communication model and to offer the listener a new way of telling the story of the food product, based on real data collected through traceability processes. The citizen consumer is thereby placed at the center of the new communication model, in which he or she can choose what to know and how. The new label creates a virtual access point capable of telling the product's story from different points of view, following personal interests and offering several content modalities to support different situations and usability.

Keywords: agri food traceability, agri-food transparency, clear label, food system, internet of things

Procedia PDF Downloads 129
9 Vehicle Timing Motion Detection Based on Multi-Dimensional Dynamic Detection Network

Authors: Jia Li, Xing Wei, Yuchen Hong, Yang Lu

Abstract:

Detecting vehicle behavior has always been a focus of intelligent transportation, but with the explosive growth in the number of vehicles and the complexity of the road environment, the vehicle behavior videos captured by traditional surveillance can no longer satisfy the study of vehicle behavior. The traditional method of manually labeling vehicle behavior is too time-consuming and labor-intensive, while existing object detection and tracking algorithms have poor practicability and a low behavioral-location detection rate. This paper proposes a vehicle behavior detection algorithm based on a dual-stream convolutional network and a multi-dimensional video dynamic detection network. In the videos, straight-line driving is treated as background behavior, while changing lanes, turning, and turning around are set as target behaviors. The purpose of this model is to automatically mark the target behavior of the vehicle in untrimmed videos. First, target behavior proposals in the long video are extracted through the dual-stream convolutional network: the model generates a one-dimensional action score waveform and then extracts segments with scores above a given threshold M as preliminary vehicle behavior proposals. Second, the preliminary proposals are pruned and identified using the multi-dimensional video dynamic detection network. Drawing on hierarchical reinforcement learning, the multi-dimensional network includes a Timer module and a Spacer module, where the Timer module mines temporal information in the video stream and the Spacer module extracts spatial information in the video frames. The Timer and Spacer modules are implemented with Long Short-Term Memory (LSTM) networks and start from an all-zero hidden state. The Timer module uses the Transformer mechanism to extract timing information from the video stream and extracts features by linear mapping and other methods.
Finally, the model fuses the temporal and spatial information and obtains the location and category of the behavior through a softmax layer. This paper uses recall and precision to measure the performance of the model. Extensive experiments show that, on the dataset of this paper, the proposed model has obvious advantages over existing state-of-the-art behavior detection algorithms. When the Time Intersection over Union (TIoU) threshold is 0.5, the average precision reaches 36.3% (versus 21.5% for the baselines). In summary, this paper proposes a vehicle behavior detection model based on a multi-dimensional dynamic detection network, introducing spatial and temporal information to extract vehicle behaviors in long videos. Experiments show that the proposed algorithm is advanced and accurate in vehicle timing behavior detection. In the future, the focus will be on simultaneously detecting the timing behavior of multiple vehicles in complex traffic scenes (such as a busy street) while maintaining accuracy.
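The proposal-extraction step, taking segments where the one-dimensional action score waveform stays above the threshold M, can be sketched as follows. The scores and threshold value are illustrative, not taken from the paper:

```python
def extract_proposals(scores, threshold):
    """Turn a per-frame action score waveform into [start, end) segments
    where the score stays at or above the threshold (a simplified version
    of the proposal step described above)."""
    proposals, start = [], None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i                      # a segment opens here
        elif s < threshold and start is not None:
            proposals.append((start, i))   # the segment closes
            start = None
    if start is not None:                  # waveform ends inside a segment
        proposals.append((start, len(scores)))
    return proposals

scores = [0.1, 0.7, 0.8, 0.2, 0.9, 0.6, 0.1]
print(extract_proposals(scores, 0.5))  # [(1, 3), (4, 6)]
```

Each returned segment would then be pruned and classified by the second-stage network described above.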

Keywords: vehicle behavior detection, convolutional neural network, long short-term memory, deep learning

Procedia PDF Downloads 95
8 Investigation of Linezolid, 127I-Linezolid and 131I-Linezolid Effects on Slime Layer of Staphylococcus with Nuclear Methods

Authors: Hasan Demiroğlu, Uğur Avcıbaşı, Serhan Sakarya, Perihan Ünak

Abstract:

Implanted devices are progressively practiced in innovative medicine to relieve pain or improve a compromised function. Implant-associated infections represent an emerging complication, caused by organisms that adhere to the implant surface and grow embedded in a protective extracellular polymeric matrix known as a biofilm. In addition, the microorganisms within biofilms enter a stationary growth phase and become phenotypically resistant to most antimicrobials, frequently causing treatment failure. In such cases, surgical removal of the implant is often required, causing high morbidity and substantial healthcare costs. Staphylococcus aureus is the most common pathogen causing implant-associated infections. Successful treatment of these infections includes early surgical intervention and antimicrobial treatment with bactericidal drugs that also act on the surface-adhering microorganisms. Linezolid is a promising antimicrobial with anti-staphylococcal activity, used for the treatment of MRSA infections. Linezolid is a synthetic antimicrobial and a member of the oxazolidinone group, with a dose-dependent bacteriostatic or bactericidal mechanism against gram-positive bacteria. Intensive use of antibiotics has led to the emergence of multi-resistant organisms over the years, and major problems have arisen in the treatment of the infections they cause. While new drugs have been developed worldwide, infections caused by microorganisms that have gained resistance to these drugs have also been reported, and the scale of the problem increases gradually. Scientific studies on bacterial biofilm production have increased in recent years. Against this background, we investigated the activity of Lin, Lin radiolabeled with 131I (131I-Lin), and cold-iodinated Lin (127I-Lin) against a clinical strain of Staphylococcus aureus DSM 4910 in biofilm. In the first stage, radiolabeling and cold-labeling studies were performed.
Quality-control studies of Lin and the iodinated (radioactive and cold) Lin derivatives were carried out using TLC (thin layer chromatography) and HPLC (high pressure liquid chromatography). In this context, the binding yield was found to be about 86±2% for 131I-Lin. The minimal inhibitory concentration (MIC) of Lin, 127I-Lin, and 131I-Lin for the Staphylococcus aureus DSM 4910 strain was found to be 1 µg/mL. In time-kill studies, Lin, 127I-Lin, and 131I-Lin produced ≥ 3 log10 decreases in viable counts (cfu/mL) within 6 h at 2- and 4-fold the MIC, respectively. No viable bacteria were observed within 24 h of the experiments. Biofilm eradication of S. aureus started at 64 µg/mL of Lin, 127I-Lin, and 131I-Lin, with OD630 values of 0.507±0.092, 0.589±0.058, and 0.266±0.047, respectively. The media control of biofilm-producing Staphylococcus was 1.675±0.01 (OD630). 131I and 127I alone had no effect on biofilms. Lin and 127I-Lin were found to be less effective than 131I-Lin at killing cells in biofilm and at biofilm eradication. Our results demonstrate that 131I-Lin has potent anti-biofilm activity against S. aureus compared to Lin, 127I-Lin, and the media control, suggesting that 131I may have a damaging effect on the biofilm structure.

Keywords: iodine-131, linezolid, radiolabeling, slime layer, Staphylococcus

Procedia PDF Downloads 536
7 Predictive Maintenance: Machine Condition Real-Time Monitoring and Failure Prediction

Authors: Yan Zhang

Abstract:

Predictive maintenance is a technique to predict when an in-service machine will fail so that maintenance can be planned in advance. Analytics-driven predictive maintenance is gaining increasing attention in many industries such as manufacturing, utilities, and aerospace, along with the emerging demand for Internet of Things (IoT) applications and the maturity of technologies that support Big Data storage and processing. This study aims to build an end-to-end analytics solution that includes both real-time machine condition monitoring and machine learning-based predictive analytics capabilities. The goal is to showcase a general predictive maintenance solution architecture, which suggests how the data generated by field machines can be collected, transmitted, stored, and analyzed. We use a publicly available aircraft engine run-to-failure dataset to illustrate the streaming analytics component and the batch failure prediction component. We outline the contributions of this study in four aspects. First, we compare predictive maintenance problems from the view of the traditional reliability-centered maintenance field and from the view of IoT applications. In the IoT era, predictive maintenance has shifted its focus from ensuring reliable machine operations to improving production/maintenance efficiency via any maintenance-related task. It covers a variety of topics, including but not limited to: failure prediction, fault forecasting, failure detection and diagnosis, and recommendation of maintenance actions after failure. Second, we review the state-of-the-art technologies that enable a machine/device to transmit data all the way to the Cloud for storage and advanced analytics. These technologies vary drastically, mainly based on the power source and functionality of the devices.
For example, a consumer machine such as an elevator uses completely different data transmission protocols compared to the sensor units in an environmental sensor network. The former may transfer data into the Cloud via WiFi directly. The latter usually uses radio communication inherent to the network, and the data is stored in a staging data node before it can be transmitted into the Cloud when necessary. Third, we illustrate how to formulate a machine learning problem to predict machine faults/failures. By showing a step-by-step process of data labeling, feature engineering, model construction, and evaluation, we share the following experiences: (1) which specific data quality issues have a crucial impact on predictive maintenance use cases; (2) how to train and evaluate a model when the training data contains inter-dependent records. Fourth, we review the tools available to build such a data pipeline that digests the data and produces insights. We show the tools we use for data ingestion, streaming data processing, and machine learning model training, as well as the tool that coordinates/schedules different jobs. In addition, we show the visualization tool that creates rich data visualizations for both real-time insights and prediction results. To conclude, there are two key takeaways from this study. (1) It summarizes the landscape and challenges of predictive maintenance applications. (2) It takes an example in aerospace with publicly available data to illustrate each component in the proposed data pipeline and showcases how the solution can be deployed as a live demo.
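The labeling and evaluation issues described above (deriving failure labels from run-to-failure records and evaluating a model when records within a unit are inter-dependent) can be sketched roughly as follows. This is a minimal illustration on synthetic data; the column names (`unit`, `cycle`, `sensor`) and the 10-cycle failure horizon are assumptions for the sketch, not details from the study.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Toy run-to-failure data: each unit runs until it fails at its last cycle.
records = []
for unit, life in [(1, 30), (2, 25), (3, 35)]:
    for cycle in range(1, life + 1):
        records.append({"unit": unit, "cycle": cycle,
                        "sensor": 100 - cycle * 2 + unit * 0.1})
df = pd.DataFrame(records)

# Label: remaining useful life (RUL) = cycles left until this unit's failure;
# binary target = "will fail within the next 10 cycles".
df["rul"] = df.groupby("unit")["cycle"].transform("max") - df["cycle"]
df["fail_soon"] = (df["rul"] <= 10).astype(int)

# Rolling-window feature computed per unit, so it never mixes units.
df["sensor_mean5"] = (df.groupby("unit")["sensor"]
                        .transform(lambda s: s.rolling(5, min_periods=1).mean()))

# Split by unit, not by row: rows within a unit are correlated, so a random
# row-level split would leak information from training into evaluation.
train = df[df["unit"].isin([1, 2])]
test = df[df["unit"] == 3]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(train[["cycle", "sensor", "sensor_mean5"]], train["fail_soon"])
pred = clf.predict(test[["cycle", "sensor", "sensor_mean5"]])
print(f"held-out unit accuracy: {accuracy_score(test['fail_soon'], pred):.2f}")
```

The unit-level split is the key point: evaluating on a machine the model has never seen is a closer proxy for deployment than a random row split.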

Keywords: Internet of Things, machine learning, predictive maintenance, streaming data

Procedia PDF Downloads 358
6 Investigating the Role of Autophagy in Cisplatin-Induced Stemness and Chemoresistance in Oral Squamous Cell Carcinoma

Authors: Prajna Paramita Naik, Sujit Kumar Bhutia

Abstract:

Background: Despite the development of multimodal treatment strategies, oral squamous cell carcinoma (OSCC) is often associated with a high rate of recurrence, metastasis, and chemo- and radioresistance. The present study inspected the relevance of CD44, ABCB1, and ADAM17 expression as a putative stem cell compartment in oral squamous cell carcinoma (OSCC) and deciphered the role of autophagy in regulating the expression of the aforementioned proteins, stemness, and chemoresistance. Methods: A retrospective analysis of CD44, ABCB1, and ADAM17 expression with respect to the various clinicopathological factors of sixty OSCC patients was performed via immunohistochemistry. The correlation among CD44, ABCB1, and ADAM17 expression was established. Sphere formation assay, flow cytometry, and fluorescence microscopy were conducted to elucidate the stemness and chemoresistant nature of established cisplatin-resistant oral cancer cells (FaDu). The pattern of expression of CD44, ABCB1, and ADAM17 in parental (FaDu-P) and resistant (FaDu-CDDP-R) FaDu cells was investigated through fluorescence microscopy. Western blot analysis of autophagy marker proteins was performed to compare the status of autophagy in parental and resistant FaDu cells. To investigate the role of autophagy in chemoresistance and stemness, sphere formation assay, immunofluorescence, and Western blot analysis were performed post transfection with siATG14, and the expression levels of autophagic proteins, mitochondrial proteins, and stemness-associated proteins were analyzed. The statistical analysis was performed with GraphPad Prism 4.0 software. Significance was defined as follows: not significant (n.s.): p > 0.05; *: p ≤ 0.05; **: p ≤ 0.01; ***: p ≤ 0.001; ****: p ≤ 0.0001. Results: In OSCC, high CD44, ABCB1, and ADAM17 expression was significantly correlated with higher tumor grades and poor differentiation.
However, the expression of these proteins was not related to the age and sex of OSCC patients. Moreover, the expression of CD44, ABCB1, and ADAM17 was positively correlated with each other. In vitro and OSCC tissue double-labeling experiment data showed that CD44+ cells were highly associated with ABCB1 and ADAM17 expression. Further, FaDu-CDDP-R cells showed higher sphere-forming capacity along with an increased fraction of the CD44+ population and higher β-catenin expression. FaDu-CDDP-R cells also showed elevated expression of CD44, ABCB1, and ADAM17. A comparatively higher autophagic flux was observed in FaDu-CDDP-R against FaDu-P cells. The expression of mitochondrial proteins was noticeably reduced in resistant cells as compared to parental cells, indicating the occurrence of autophagy-mediated mitochondrial degradation in oral cancer. Moreover, inhibition of autophagy was coupled with decreased formation of orospheres, suggesting autophagy-mediated stemness in oral cancer. Blockade of autophagy was also found to induce the restoration of mitochondrial proteins in FaDu-CDDP-R cells, indicating the involvement of mitophagy in chemoresistance. Furthermore, a reduced expression of CD44, ABCB1, and ADAM17 was also observed in ATG14-deficient FaDu-P and FaDu-CDDP-R cells. Conclusion: The CD44+/ABCB1+/ADAM17+ expression in OSCC might be associated with chemoresistance and a putative CSC compartment. Further, the present study highlights the contribution of mitophagy in chemoresistance and confirms the potential involvement of autophagic regulation in the acquisition of stem-like characteristics in OSCC.
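The significance annotation scheme stated in the Methods above can be written out as a small helper; this is an illustrative sketch of the stated thresholds, not code from the study.

```python
def significance_stars(p: float) -> str:
    """Map a p-value to the annotation scheme used in the study:
    n.s. for p > 0.05, then one to four stars at 0.05, 0.01, 0.001, 0.0001."""
    if p > 0.05:
        return "n.s."
    if p <= 0.0001:
        return "****"
    if p <= 0.001:
        return "***"
    if p <= 0.01:
        return "**"
    return "*"

# Example: annotate a few p-values.
for p in (0.2, 0.03, 0.005, 0.0005):
    print(p, significance_stars(p))
```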

Keywords: ABCB1, ADAM17, autophagy, CD44, chemoresistance, mitophagy, OSCC, stemness

Procedia PDF Downloads 173
5 Single-Cell Omics Experiments as a Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools

Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri

Abstract:

The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation. Therefore, the establishment of benchmarking datasets is necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Generating dedicated benchmarking datasets for single-cell experiments is expensive, so datasets used for benchmarking are typically sourced from publicly available experiments, which often lack comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools that depict the heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer tumor cell lines that display resistant clones upon treatment of EGFR-mutated tumors and are characterized by driver genes, namely ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using the 10XGenomics platform with cellplex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the Biolegend TotalSeq™-B Human Universal Cocktail (CITEseq).
This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from the aforementioned experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as an output. The tool provides the cell line derivation for each cell, along with cell annotations for the pseudo-microenvironment assigned by an experienced immunologist based on the CITEseq data. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in Matrigel. These tissues were analyzed using the 10XGenomics (FFPE samples) and Curio Bioscience (fresh frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.

Keywords: single cell omics, benchmark, spatial transcriptomics, CITEseq

Procedia PDF Downloads 73
4 Machine Learning Approach for Automating Electronic Component Error Classification and Detection

Authors: Monica Racha, Siva Chandrasekaran, Alex Stojcevski

Abstract:

Engineering programs focus on promoting students' personal and professional development by ensuring that students acquire technical and professional competencies during their four-year studies. The traditional engineering laboratory provides an opportunity for students to "practice by doing," and laboratory facilities aid them in obtaining insight into and understanding of their discipline. Due to rapid technological advancements and the COVID-19 outbreak, traditional labs have been transforming into virtual learning environments. Aim: To address the limitations of the physical laboratory, this research study uses a Machine Learning (ML) algorithm that interfaces with the HoloLens augmented reality headset and analyzes captured images to classify and detect electronic components. The automated electronic component error classification and detection system detects and classifies the position of all components on a breadboard using the ML algorithm. This research will assist first-year undergraduate engineering students in conducting laboratory practices without any supervision. With the help of the HoloLens and the ML algorithm, students will reduce component placement errors on a breadboard and increase the efficiency of simple laboratory practices conducted virtually. Method: Images of breadboards, resistors, capacitors, transistors, and other electrical components will be collected using the HoloLens 2 and stored in a database. The collected image dataset will then be used for training a machine learning model. The raw images will be cleaned, processed, and labeled to facilitate further analysis for component error classification and detection. For instance, when students conduct laboratory experiments, the HoloLens captures images of students placing different components on a breadboard. The images are forwarded to the server for detection in the background.
A hybrid Convolutional Neural Network (CNN) and Support Vector Machine (SVM) algorithm will be used to train the dataset for object recognition and classification. The convolution layers extract image features, which are then classified using the Support Vector Machine (SVM). With adequately labeled training data, the model will predict and categorize component placements and assess whether students have placed components correctly. The data acquired through the HoloLens thus includes images of students assembling electronic components. The system constantly checks whether students have positioned components appropriately on the breadboard and connected them so that the circuit functions. When students misplace any component, the HoloLens predicts the error before the user fixes the component in the incorrect position and prompts students to correct their mistakes. This hybrid CNN-SVM approach to automating electronic component error classification and detection eliminates component connection problems and minimizes the risk of component damage. Conclusion: These augmented reality smart glasses powered by machine learning provide a wide range of benefits to supervisors, professionals, and students. They help customize the learning experience, which is particularly beneficial in large classes with limited time. The study determines the accuracy with which machine learning algorithms can forecast whether students are making the correct decisions and completing their laboratory tasks.
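A hybrid CNN-plus-SVM pipeline of the kind described can be sketched as follows: a small convolutional network produces feature vectors, and an SVM classifies them. This is a minimal illustration on synthetic images; the network architecture, image size, and the three component classes are assumptions for the sketch, not details from the study.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

# Tiny convolutional feature extractor standing in for the CNN stage.
class ConvFeatures(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (N, 16, 1, 1)
        )

    def forward(self, x):
        return self.net(x).flatten(1)  # (N, 16) feature vectors

torch.manual_seed(0)
extractor = ConvFeatures().eval()

# Synthetic 32x32 stand-ins for component images over 3 illustrative
# classes (e.g., resistor / capacitor / transistor).
images = torch.randn(60, 3, 32, 32)
labels = np.repeat([0, 1, 2], 20)

with torch.no_grad():
    feats = extractor(images).numpy()

# SVM classifies the CNN-extracted features: the hybrid CNN-SVM idea.
svm = SVC(kernel="rbf").fit(feats[:45], labels[:45])
preds = svm.predict(feats[45:])
print("predicted classes for held-out images:", preds[:5])
```

In practice the CNN would be pretrained or trained on the labeled breadboard images before its features are handed to the SVM; the split here is only to show the two-stage structure.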

Keywords: augmented reality, machine learning, object recognition, virtual laboratories

Procedia PDF Downloads 111
3 A Regulator's Assessment of Consumer Risk When Evaluating a User Test for an Umbrella Brand Name in an Over-the-Counter Medicine

Authors: A. Bhatt, C. Bassi, H. Farragher, J. Musk

Abstract:

Background: All medicines placed on the EU market are legally required to be accompanied by labeling and a package leaflet, which provide comprehensive information enabling safe and appropriate use. Mock-ups, with the results of assessments using a target patient group, must be submitted for a marketing authorisation application. Consumers need confidence in non-prescription, OTC medicines in order to manage their minor ailments, and umbrella brands assist purchasing decisions by enabling easy identification within a particular therapeutic area. A number of regulatory agencies have risk management tools and guidelines to assist in developing umbrella brands for OTC medicines; however, assessment and decision making are subjective and inconsistent. This study presents an evaluation in the UK following the US FDA warning concerning methaemoglobinaemia, based on 21 reported cases (11 in children under 2 years) caused by OTC oral analgesics containing benzocaine. Methods: A standard face-to-face, structured, task-based user interview testing methodology, using a standard questionnaire and rating scale with 25 consumers aged 15-91 years, was conducted independently between June and October 2015 in the consumers' homes. The study evaluated whether individuals could discriminate between the labelling, safety information, and warnings on the cartons and PILs of 3 different OTC medicine packs sharing the same umbrella name. Each pack was presented with a differing information hierarchy, using different coloured cartons, and contained one of 3 different active ingredients: benzocaine (oromucosal spray) and two lozenges containing 2,4-dichlorobenzyl alcohol with amylmetacresol and hexylresorcinol, respectively (for the symptomatic relief of sore throat pain).
The test was designed to determine whether the warnings on the carton and leaflet were prominent and accessible enough to alert users that one product contained benzocaine, carrying a risk of methaemoglobinaemia, and to refer them to the leaflet for the signs of the condition and what to do should it occur. Results: Two consumers did not locate the warnings on the side of the pack, eventually finding them on the back, and two made suggestions to further improve the accessibility of the methaemoglobinaemia warning. Using a gold pack design for the oromucosal spray, all consumers could differentiate between the 3 drugs, minimum age particulars, pharmaceutical form, and the risk of methaemoglobinaemia. The warnings for benzocaine were deemed to be clear or very clear; the appearance of the 3 packs was either very well differentiated or quite well differentiated. The PIL test passed on all criteria. All consumers could use the product correctly and identify risk factors, ensuring the critical information necessary for safe use was legible and easily accessible so that confusion and errors were minimised. Conclusion: Patients with known methaemoglobinaemia are likely to be vigilant in checking for benzocaine-containing products, despite similar umbrella brand names across a range of active ingredients. Despite these findings, the package design and spray format were not deemed sufficient to mitigate potential safety risks associated with differences in target populations and contraindications when submitted to the Regulatory Agency. Although risk management tools are increasingly being used by agencies to provide objective assurance of package safety, further transparency, reduced subjectivity, and proportionate risk assessment should be demonstrated.

Keywords: labelling, OTC, risk, user testing

Procedia PDF Downloads 274
2 “MaxSALIVA-II”: Advancing a Nano-Sized Dual-Drug Delivery System for Salivary Gland Radioprotection, Regeneration and Repair in a Head and Neck Cancer Pre-Clinical Murine Model

Authors: Ziyad S. Haidar

Abstract:

Background: Saliva plays a major role in maintaining oral, dental, and general health and well-being; it normally bathes the oral cavity, acting as a clearing agent. This becomes more apparent when the amount and quality of saliva are significantly reduced due to medications, salivary gland neoplasms, disorders such as Sjögren’s syndrome, and especially ionizing radiation therapy for tumors of the head and neck, the 5th most common malignancy worldwide, during which the salivary glands are included within the radiation field/zone. Clinically, patients affected by salivary gland dysfunction often opt to terminate their radiotherapy course prematurely as they become malnourished and experience a significant decrease in their QoL. Accordingly, the formulation of a radio-protection/-prevention modality and the development of an alternative treatment to restore damaged salivary gland tissue are eagerly awaited and highly desirable. Objectives: To assess the pre-clinical radio-protective effect and reparative/regenerative potential of layer-by-layer self-assembled lipid-polymer-based core-shell nanocapsules designed and fine-tuned for the sequential (ordered) release of dual cytokines, following a single local administration (direct injection) into a murine sub-mandibular salivary gland model of irradiation. Methods: The formulated core-shell nanocapsules were characterized physically, chemically, and mechanically, pre-/post-loading with the drugs, followed by optimization of the pharmaco-kinetic profile. Then, nanosuspensions were administered directly into the salivary glands 24 hrs pre-irradiation (PBS, un-loaded nanocapsules, and individual and combined vehicle-free cytokines were injected into the control glands for an in-depth comparative analysis). The head-and-neck region of C57BL/6 mice was exposed to external irradiation at an elevated dose of 18 Gy.
Salivary flow rate (un-stimulated) and salivary protein content/excretion were regularly assessed using an enzyme-linked immunosorbent assay (over a 3-month period). Histological and histomorphometric evaluation and apoptosis/proliferation analysis, followed by local versus systemic bio-distribution and immuno-histochemical assays, were then performed on all harvested major organs (at the distinct experimental end-points). Results: The result was monodisperse, stable, and cytocompatible nanocapsules capable of maintaining the bioactivity of the encapsulant within the different compartments of the core and shell and with controlled/customizable pharmaco-kinetics, as illustrated in the graphical abstract (Figure) below. The experimental animals demonstrated a significant increase in salivary flow rates when compared to the controls. Herein, salivary protein content was comparable to the pre-irradiation (baseline) level. Histomorphometry further confirmed the biocompatibility and localization of the nanocapsules, in vivo, at the site of injection. Acinar cells showed fewer vacuoles and less nuclear aberration in the experimental group, while the amount of mucin was higher in controls. Overall, fewer apoptotic activities were detected by a Terminal deoxynucleotidyl Transferase (TdT) dUTP Nick-End Labeling (TUNEL) assay, and proliferative rates were similar to the controls, suggesting an interesting reparative and regenerative potential for irradiation-damaged/-dysfunctional salivary glands. The Figure below exemplifies some of these findings. Conclusions: A biocompatible, reproducible, and customizable self-assembling layer-by-layer core-shell delivery system is formulated and presented.
Our findings suggest that localized sequential bioactive delivery of dual cytokines (in specific dose and order) can prevent irradiation-induced damage via reducing apoptosis and also has the potential to promote in situ proliferation of salivary gland cells; maxSALIVA is scalable (Good Manufacturing Practice or GMP production for human clinical trials) and patent-pending.

Keywords: cancer, head and neck, oncology, drug development, drug delivery systems, nanotechnology, nanoncology

Procedia PDF Downloads 48
1 “MaxSALIVA”: A Nano-Sized Dual-Drug Delivery System for Salivary Gland Radioprotection and Repair in Head and Neck Cancer

Authors: Ziyad S. Haidar

Abstract:

Background: Saliva plays a major role in maintaining oral and dental health (and, consequently, general health and well-being), as it normally bathes the oral cavity and acts as a clearing agent. This becomes more apparent when the amount and quality of saliva are significantly reduced due to medications, salivary gland neoplasms, disorders such as Sjögren’s syndrome, and especially ionizing radiation therapy for tumors of the head and neck, the fifth most common malignancy worldwide, during which the salivary glands are included within the radiation field or zone. Clinically, patients affected by salivary gland dysfunction often opt to terminate their radiotherapy course prematurely because they become malnourished and experience a significant decrease in their quality of life. Accordingly, the development of an alternative treatment to restore or regenerate damaged salivary gland tissue is eagerly awaited. Likewise, the formulation of a radioprotection modality and early damage prevention strategy is also highly desirable. Objectives: To assess the pre-clinical radio-protective effect as well as the reparative/regenerative potential of layer-by-layer self-assembled lipid-polymer-based core-shell nanocapsules designed and fine-tuned in this experimental work for the sequential (ordered) release of dual cytokines, following a single local administration (direct injection) into a murine sub-mandibular salivary gland model of irradiation. Methods: The formulated core-shell nanocapsules were characterized physically, chemically, and mechanically, pre-/post-loading with the drugs (in solution and powder formats), followed by optimization of the pharmaco-kinetic profile. Then, nanosuspensions were administered directly into the salivary glands 24 hrs pre-irradiation (PBS, un-loaded nanocapsules, and individual and combined vehicle-free cytokines were injected into the control glands for an in-depth comparative analysis).
The head-and-neck region of C57BL/6 mice was exposed to external irradiation at an elevated dose of 18 Gy (revised from our previous 15 Gy model). Salivary flow rate (un-stimulated) and salivary protein content/excretion were regularly assessed using an enzyme-linked immunosorbent assay (over a 3-month period). Histological and histomorphometric evaluation and apoptosis/proliferation analysis, followed by local versus systemic bio-distribution and immuno-histochemical assays, were then performed on all harvested major organs (at the distinct experimental end-points). Results: The result was monodisperse, stable, and cytocompatible nanocapsules capable of maintaining the bioactivity of the encapsulant within the different compartments of the core and shell and with controlled/customizable pharmaco-kinetics, as illustrated in the graphical abstract (Figure) below. The experimental animals demonstrated a significant increase in salivary flow rates when compared to the controls. Herein, salivary protein content was comparable to the pre-irradiation (baseline) level. Histomorphometry further confirmed the biocompatibility and localization of the nanocapsules, in vivo, at the site of injection. Acinar cells showed fewer vacuoles and less nuclear aberration in the experimental group, while the amount of mucin was higher in controls. Overall, fewer apoptotic activities were detected by a Terminal deoxynucleotidyl Transferase (TdT) dUTP Nick-End Labeling (TUNEL) assay, and proliferative rates were similar to the controls, suggesting an interesting reparative and regenerative potential for irradiation-damaged/-dysfunctional salivary glands. The Figure below exemplifies some of these findings. Conclusions: A biocompatible, reproducible, and customizable self-assembling layer-by-layer core-shell delivery system is formulated and presented.
Our findings suggest that localized sequential bioactive delivery of dual cytokines (in specific dose and order) can prevent irradiation-induced damage via reducing apoptosis and also has the potential to promote in situ proliferation of salivary gland cells; maxSALIVA is scalable (Good Manufacturing Practice or GMP production for human clinical trials) and patent-pending.

Keywords: saliva, head and neck cancer, nanotechnology, controlled drug delivery, xerostomia, mucositis, biopolymers, innovation

Procedia PDF Downloads 56