Search results for: socioeconomic features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4421

611 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment

Authors: Ella Sèdé Maforikan

Abstract:

Accurate land cover mapping is essential for effective environmental monitoring and natural resource management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. A supervised classification employing the Random Forest (RF) algorithm on Google Earth Engine (GEE) categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen for its high-performance computing capabilities, which mitigate the computational burdens associated with traditional land cover classification methods. By eliminating the need to download individual satellite images and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the power of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy.
As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
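As a rough illustration of the accuracy metrics reported above, overall accuracy (OA) and Cohen's kappa can be derived from a classification confusion matrix. The sketch below is a minimal pure-Python version; the matrix values are invented for illustration, not the study's data:

```python
def accuracy_metrics(cm):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.

    cm[i][j] counts samples of true class i predicted as class j.
    """
    n = sum(sum(row) for row in cm)
    # Observed agreement: fraction of samples on the diagonal.
    p_o = sum(cm[i][i] for i in range(len(cm))) / n
    # Expected agreement by chance, from row and column marginals.
    p_e = sum(
        sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))
    ) / n**2
    kappa = (p_o - p_e) / (1 - p_e)
    return p_o, kappa
```

Kappa discounts chance agreement, which is why it sits below OA (e.g. OA = 0.91 vs. Kappa = 0.88 for the Sentinel-2 result).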

Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment

Procedia PDF Downloads 63
610 An Alternative Credit Scoring System in China’s Consumer Lending Market: A System Based on Digital Footprint Data

Authors: Minjuan Sun

Abstract:

Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, among which non-bank lending has grown faster than bank lending owing to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders during credit evaluation and risk control; for example, an individual’s bank credit records are not visible to online lenders, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower’s creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumption loan market. Two different types of digital footprint data are utilized and matched with banks’ loan default records. Each separately captures distinct dimensions of a person’s characteristics, such as shopping patterns and certain aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate either acceptable or excellent prediction results, and that different types of data tend to complement each other for better performance.
Typically, the traditional types of data that banks use, such as income, occupation, and credit history, update over longer cycles and hence cannot reflect more immediate changes, like a deterioration in financial status caused by a business crisis; digital footprints, by contrast, can update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower’s credit capabilities and risks. From this empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment, because of their near-universal data coverage, and because they can by and large resolve the "thin-file" issue, since digital footprints come in much larger volume and at higher frequency.

Keywords: credit score, digital footprint, Fintech, machine learning

Procedia PDF Downloads 165
609 Influence of Surface Wettability on Imbibition Dynamics of Protein Solution in Microwells

Authors: Himani Sharma, Amit Agrawal

Abstract:

The stability of the Cassie and Wenzel wetting states depends on the intrinsic contact angle and the geometric features of a surface, which has been exploited to capture biofluids in microwells. However, the mechanism of imbibition of biofluids into microwells is not well understood in terms of the wettability of the substrate. In this work, we experimentally demonstrate the filling dynamics of protein solutions in hydrophilic and hydrophobic microwells. Towards this, we utilized a lotus leaf as a mold to fabricate microwells on a polydimethylsiloxane (PDMS) surface. The lotus leaf's micrometer-sized, blunt-conical pillars, with heights of 8-15 µm and diameters of 3-8 µm, were transferred onto the PDMS. The PDMS surface was then treated with oxygen plasma to render it hydrophilic. 10 µL droplets containing fluorescein isothiocyanate (FITC)-labelled bovine serum albumin (BSA) were rested on both hydrophobic (θa = 108°, where θa is the apparent contact angle) and hydrophilic (θa = 60°) PDMS surfaces. Time-dependent fluorescence microscopy was conducted on these modified PDMS surfaces by recording the fluorescence intensity over a 5-minute period. It was observed that, initially (at t = 1 min), FITC-BSA accumulated on the periphery of both hydrophilic and hydrophobic microwells due to incomplete penetration of the liquid-gas meniscus. This deposition of FITC-BSA on the periphery did not change with time on the hydrophobic surfaces, whereas complete filling occurred in the hydrophilic microwells (at t = 5 min). This is attributed to a gradual movement of the three-phase contact line along the vertical surface of the hydrophilic microwells, as compared to stable pinning in the hydrophobic microwells, as confirmed by Surface Evolver simulations. In addition, if cavities are present on a hydrophobic surface, air bubbles become trapped inside the cavities once an aqueous solution is placed over the surface, resulting in the Cassie-Baxter wetting state.
This condition hinders the trapping of proteins inside the microwells. Thus, it is necessary to impart hydrophilicity to the microwell surfaces so as to induce the Wenzel state, such that the solution is fully in contact with the walls of the microwells. Imbibition of the microwells by protein solutions was analyzed in terms of fluorescence intensity versus time. The present work underlines the importance of the geometry of microwells and the surface wettability of the substrate in wetting and in the effective capture of solid sub-phases in biofluids.
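For reference, the two wetting states discussed above are conventionally described by the Wenzel and Cassie-Baxter relations, where θ_Y is the intrinsic (Young) contact angle, r ≥ 1 the roughness ratio, and f_s the solid fraction of the contact area. These equations are standard textbook background, not results from this abstract:

```latex
% Wenzel state: the liquid fully wets the texture.
\cos\theta_{\mathrm{W}} = r \, \cos\theta_{\mathrm{Y}}

% Cassie-Baxter state: air is trapped beneath the drop.
\cos\theta_{\mathrm{CB}} = f_{\mathrm{s}} \left( \cos\theta_{\mathrm{Y}} + 1 \right) - 1
```

Roughness amplifies the intrinsic wettability in the Wenzel state (hydrophilic surfaces become more wetting), which is consistent with the complete filling observed on the plasma-treated microwells.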

Keywords: BSA, microwells, surface evolver, wettability

Procedia PDF Downloads 200
608 Human Identification Using Local Roughness Patterns in Heartbeat Signal

Authors: Md. Khayrul Bashar, Md. Saiful Islam, Kimiko Yamashita, Yano Midori

Abstract:

Despite some progress in human authentication, conventional biometrics (e.g., facial features, fingerprints, retinal scans, gait, voice patterns) are not robust against falsification because they are neither confidential nor secret to an individual. As a non-invasive tool, the electrocardiogram (ECG) has recently shown great potential for human recognition due to its unique rhythms, which characterize the variability of human heart structures (chest geometry, sizes, and positions). Moreover, ECG has a real-time vitality characteristic that signifies live signs, ensuring that only a legitimate, living individual is identified. However, the detection accuracy of current ECG-based methods is not sufficient due to the high variability of an individual’s heartbeats at different instances of time. These variations may occur due to muscle flexure, changes in mental or emotional state, and changes in sensor position or long-term baseline shift during the recording of the ECG signal. In this study, a new method is proposed for human identification, based on the extraction of the local roughness of ECG heartbeat signals. First, the ECG signal is preprocessed using a second-order band-pass Butterworth filter with cut-off frequencies of 0.00025 and 0.04. A number of local binary patterns are then extracted by applying a moving neighborhood window along the ECG signal. At each instant of the ECG signal, the pattern is formed by comparing the ECG intensities at neighboring time points with the central intensity in the moving window. Then, binary weights are multiplied with the pattern to arrive at the local roughness description of the signal. Finally, histograms are constructed that describe the heartbeat signals of individual subjects in the database. One advantage of the proposed feature is that it does not depend on the accuracy of detecting the QRS complex, unlike conventional methods.
Supervised recognition methods are then designed, using minimum-distance-to-mean and Bayesian classifiers, to identify authentic human subjects. An experiment with sixty (60) ECG signals from sixty adult subjects in the PTB database of the National Metrology Institute of Germany showed that the proposed method is promising compared to a conventional interval- and amplitude-feature-based method.
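The local roughness feature described above can be sketched in a few lines of pure Python. This is an illustrative reconstruction from the abstract: the window radius and the power-of-two weighting are assumptions, not the authors' exact code:

```python
def roughness_histogram(signal, radius=4):
    """Local-binary-pattern style roughness descriptor for a 1-D signal.

    At each sample, the 2*radius neighbours in the moving window are
    compared with the centre value; the resulting bits, weighted by
    powers of two, form a pattern code. The histogram of codes over the
    whole signal is the roughness descriptor.
    """
    n_bits = 2 * radius
    hist = [0] * (2 ** n_bits)
    for t in range(radius, len(signal) - radius):
        centre = signal[t]
        code, bit = 0, 0
        for dt in range(-radius, radius + 1):
            if dt == 0:
                continue  # skip the centre sample itself
            if signal[t + dt] >= centre:
                code |= 1 << bit
            bit += 1
        hist[code] += 1
    return hist
```

Because the code depends only on local intensity comparisons, it needs no QRS detection, matching the advantage claimed in the abstract.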

Keywords: human identification, ECG biometrics, local roughness patterns, supervised classification

Procedia PDF Downloads 405
607 The Grammar of the Content Plane as a Style Marker in Forensic Authorship Attribution

Authors: Dayane de Almeida

Abstract:

This work presents a study demonstrating the usability of categories of analysis from Discourse Semiotics (also known as Greimassian Semiotics) in authorship cases in forensic contexts. It is necessary to know whether the categories examined in semiotic analysis (the ‘grammar’ of the content plane) can distinguish authors. Thus, a study with 4 sets of texts from a corpus of ‘not on demand’ written samples (texts that differ in degree of formality, purpose, addressees, themes, etc.) was performed. Each author contributed 20 texts, separated into 2 groups of 10 (Author1A, Author1B, and so on). The hypothesis was that texts from a single author are semiotically more similar to each other than texts from different authors. The assumptions and issues that led to this idea are as follows:
- The features analyzed in authorship studies mostly relate to the expression plane: they are manifested on the ‘surface’ of texts. If language is both expression and content, content would also have to be considered for more accurate results. Style is present in both planes.
- Semiotics postulates that the content plane is structured in a ‘grammar’ that underlies expression and presents different levels of abstraction. This ‘grammar’ would be a style marker.
- Sociolinguistics demonstrates intra-speaker variation: an individual employs different linguistic uses in different situations. How, then, can one determine whether someone is the author of several texts, distinct in nature (as is the case in most forensic sets), when intra-speaker variation is known to depend on so many factors?
- The idea is that the more abstract the level in the content plane, the lower the intra-speaker variation, because there will be a greater chance for the author to choose the same thing. If two authors recurrently choose the same options, differently from one another, it means each one’s option has discriminatory power.
- Size is another issue for various attribution methods. Since most texts in real forensic settings are short, methods relying only on the expression plane tend to fail. The analysis of the content plane as proposed by Greimassian Semiotics would be less size-dependent.
The semiotic analysis was performed using the software Corpus Tool, generating tags to allow counting of the data. Then, similarities and differences were measured quantitatively through the application of the Jaccard coefficient (a statistical measure that compares the similarities and differences between samples). The results confirmed the hypothesis; hence, the grammatical categories of the content plane may successfully be used in questioned-authorship scenarios.
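The quantitative step rests on the Jaccard coefficient, which can be sketched over two sets of semiotic tags (the tag names below are invented placeholders, not the study's actual tagset):

```python
def jaccard(a, b):
    """Jaccard similarity between two tag sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty samples are trivially identical
    return len(a & b) / len(a | b)

# Hypothetical tag sets for two text samples by the same author:
sample_1a = {"euphoria", "conjunction", "actorialization"}
sample_1b = {"euphoria", "conjunction", "temporalization"}
```

`jaccard(sample_1a, sample_1b)` returns 0.5 here: two shared tags out of four distinct ones. Under the study's hypothesis, same-author pairs should score higher than different-author pairs.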

Keywords: authorship attribution, content plane, forensic linguistics, greimassian semiotics, intraspeaker variation, style

Procedia PDF Downloads 243
606 Biflavonoids from Selaginellaceae as Epidermal Growth Factor Receptor Inhibitors and Their Anticancer Properties

Authors: Adebisi Adunola Demehin, Wanlaya Thamnarak, Jaruwan Chatwichien, Chatchakorn Eurtivong, Kiattawee Choowongkomon, Somsak Ruchirawat, Nopporn Thasana

Abstract:

The epidermal growth factor receptor (EGFR) is a transmembrane glycoprotein involved in cellular signalling processes, and its aberrant activity is crucial in the development of many cancers, such as lung cancer. Selaginellaceae are fern allies that have long been used in Chinese traditional medicine to treat various cancer types, especially lung cancer. Biflavonoids, the major secondary metabolites in Selaginellaceae, have numerous pharmacological activities, including anti-cancer and anti-inflammatory effects. For instance, amentoflavone induces a cytotoxic effect in a human non-small-cell lung cancer (NSCLC) cell line via the inhibition of PARP-1. However, to the best of our knowledge, there are no studies on biflavonoids as EGFR inhibitors. Thus, this study aims to investigate the EGFR inhibitory activities of biflavonoids isolated from Selaginella siamensis and Selaginella bryopteris. Amentoflavone, tetrahydroamentoflavone, sciadopitysin, robustaflavone, robustaflavone-4-methylether, delicaflavone, and chrysocauloflavone were isolated from the ethyl acetate extract of the whole plants. The structures were determined using NMR spectroscopy and mass spectrometry. An in vitro study was conducted to evaluate their cytotoxicity against the A549, HEPG2, and T47D human cancer cell lines using the MTT assay. In addition, a target-based assay was performed to investigate their EGFR inhibitory activity using a kinase inhibition assay. Finally, a molecular docking study was conducted to predict the binding modes of the compounds. Robustaflavone-4-methylether and delicaflavone showed the best cytotoxic activity on all the cell lines, with IC50 (µM) values of 18.9 ± 2.1 and 22.7 ± 3.3 on A549, respectively. Of these biflavonoids, delicaflavone showed the most potent EGFR inhibitory activity, with 84% relative inhibition at 0.02 nM, using erlotinib as a positive control. Robustaflavone-4-methylether showed 78% inhibition at 0.15 nM.
The docking scores obtained from the molecular docking study correlated with the kinase inhibition assay. Robustaflavone-4-methylether and delicaflavone had a docking score of 72.0 and 86.5, respectively. The inhibitory activity of delicaflavone seemed to be linked with the C2”=C3” and 3-O-4”’ linkage pattern. Thus, this study suggests that the structural features of these compounds could serve as a basis for developing new EGFR-TK inhibitors.

Keywords: anticancer, biflavonoids, EGFR, molecular docking, Selaginellaceae

Procedia PDF Downloads 198
605 Assessment of the Impact of Atmospheric Air, Drinking Water and Socio-Economic Indicators on the Primary Incidence of Children in Altai Krai

Authors: A. P. Pashkov

Abstract:

The number of environmental factors that adversely affect children's health grows every year, and their combination differs from territory to territory. The contribution of socio-economic factors to the health status of the younger generation is also increasing. The child’s body is the most sensitive to changes in environmental conditions, responding to them with a deterioration in health. Over the past years, scientists have studied the influence of environmental factors on childhood morbidity, and there is currently a trend toward studying the regional characteristics of how combinations of environmental factors interact with the child's body. The aim of this work was to identify trends in primary non-infectious morbidity among children of the Altai Territory, a unique region combining territories with different levels of environmental quality, and to assess the effect of atmospheric air, drinking water, and socio-economic indicators on childhood morbidity in the region. An unfavorable tendency was revealed in the region for the incidence of such nosological groups as neoplasms, including malignant ones; diseases of the endocrine system, including obesity and thyroid disease; diseases of the circulatory system; digestive diseases; diseases of the genitourinary system; congenital anomalies; and respiratory diseases. Mapping revealed a pattern of geographical distribution for some groups of diseases, together with significant correlations. Some nosologies are related to an integrated assessment of socio-economic indicators: diseases of the circulatory system and respiratory diseases (direct relationship), and diseases of the endocrine system, eating disorders, and metabolic disorders (inverse relationship).
The analysis of associations between childhood morbidity and the average annual concentrations of substances polluting the air and drinking water showed reliable correlations in areas with a critical or tense degree of environmental quality. This fact confirms that a population living in contaminated areas is subject to the negative influence of environmental factors, which directly affects the health status of children. The results indicate the need for a detailed assessment of the influence of environmental factors on childhood morbidity at the regional level, the formation of a database, and the development of automated programs that can predict morbidity in each specific territory. This will increase the effectiveness, including the economic efficiency, of preventive measures.

Keywords: incidence of children, regional features, socio-economic factors, environmental factors

Procedia PDF Downloads 115
604 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from the fragmented DNA, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction.
Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
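Step (i) of the pipeline, building a k-mer vocabulary from raw reads, can be sketched in pure Python. The value of k and the toy reads are illustrative assumptions; the paper's embedding and classification steps would build on top of this tokenization:

```python
def kmer_tokens(read, k=6):
    """Split a DNA read into its overlapping k-mers (the 'words')."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_vocabulary(reads, k=6):
    """Map each distinct k-mer seen across the reads to an integer id,
    in order of first appearance. Embeddings are then learned per id."""
    vocab = {}
    for read in reads:
        for kmer in kmer_tokens(read, k):
            vocab.setdefault(kmer, len(vocab))
    return vocab
```

Treating k-mers as words and reads as sentences is what lets word-embedding machinery produce the numerical read representations fed to the downstream multiple instance learning classifier.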

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 126
603 A Virtual Set-Up to Evaluate Augmented Reality Effect on Simulated Driving

Authors: Alicia Yanadira Nava Fuentes, Ilse Cervantes Camacho, Amadeo José Argüelles Cruz, Ana María Balboa Verduzco

Abstract:

Augmented reality promises to be part of future driving: its immersive technology can show directions and maps, indicating important places with graphic elements when the driver requires the information. On the other hand, driving is considered a multitasking activity and, for some people, a complex one in which situations commonly arise that require the driver's immediate attention and decisions to avoid accidents. Therefore, the main aim of the project is the instrumentation of a platform with biometric sensors that allows evaluating driving performance under the influence of augmented reality devices and detecting the driver's level of attention, since it is important to know the effect these devices produce. In this study, the physiological sensors EPOC X (EEG), ECG06 PRO, and EMG Myoware are integrated into a driving test platform with a Logitech G29 steering wheel and the simulation software City Car Driving, in which the level of traffic and the number of pedestrians within the simulation can be controlled, giving the driver a realistic interaction; data acquisition for storage is achieved through an MSP430 microcontroller. The sensors produce continuous analog signals that require signal conditioning. At this point a signal amplifier is incorporated, because the acquired signals have a sensitive range of 1.25 mm/mV, along with filtering, which eliminates unwanted frequency bands so that the signal is interpretable and noise-free; the analog signal is then converted into a digital signal so that the drivers' physiological signals can be analyzed, and these values are stored in a database.
Based on this compilation, we work on the extraction of signal features and implement K-NN (k-nearest neighbor) and decision tree classification methods (both supervised learning) that enable the study of the data, the identification of patterns, and the determination, via classification, of the different effects of augmented reality on drivers. The expected results of this project include a test platform instrumented with biometric sensors for data acquisition during driving and a database with the variables required to determine the effect caused by augmented reality on people in simulated driving.
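The K-NN classification step can be illustrated with a minimal sketch. The feature vectors and attention-state labels below are invented placeholders; the actual study would use features extracted from the EEG/ECG/EMG recordings:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest labelled
    feature vectors; `train` is a list of (vector, label) pairs."""
    neighbours = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D physiological features labelled by attention state:
train = [((0.0, 0.0), "attentive"), ((0.0, 1.0), "attentive"),
         ((5.0, 5.0), "distracted"), ((5.0, 6.0), "distracted")]
```

`knn_predict(train, (0.2, 0.5))` would vote "attentive" here, since the two nearest neighbours carry that label. Odd values of k avoid ties in two-class problems.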

Keywords: augmented reality, driving, physiological signals, test platform

Procedia PDF Downloads 142
602 A Corpus-Based Contrastive Analysis of Directive Speech Act Verbs in English and Chinese Legal Texts

Authors: Wujian Han

Abstract:

In the process of human interaction and communication, speech act verbs are considered the most active component and the main means of information transmission, and they are also taken as an indication of the structure of linguistic behavior. The theoretical value and practical significance of such everyday built-in metalanguage have long been recognized. This paper, which is part of a larger study, aims to provide useful insights for a more precise and systematic approach to translating speech act verbs between English and Chinese, especially with regard to the degree to which generic integrity is maintained in the practice of translating legal documents. In this study, the corpus, i.e. Chinese legal texts and their English translations, English legal texts, ordinary Chinese texts, and ordinary English texts, serves as a testing ground for contrastively examining the usage of English and Chinese directive speech act verbs in the legal genre. The scope of this paper is relatively wide and essentially covers all directive speech act verbs used in ordinary English and Chinese, such as order, command, request, prohibit, threaten, advise, warn, and permit. By combining corpus methodology with a contrastive perspective, the researcher explored a range of characteristics of English and Chinese directive speech act verbs, including their semantic, syntactic, and pragmatic features, and then contrasted them in a structured way. It was found that there are similarities between English and Chinese directive speech act verbs in the legal genre, such as similar semantic components between English speech act verbs and their translation equivalents in Chinese, and formally accurate usage of English and Chinese directive speech act verbs in legal contexts. But notable differences have been identified between their usage in the original Chinese and English legal texts, such as in valency patterns and frequency of occurrence.
For example, the subjects of some directive speech act verbs are very frequently omitted in Chinese legal texts, but this is not the case in English legal texts. One practicable method of achieving adequacy and conciseness when translating speech act verbs from Chinese into English in the legal genre is to repeat the subjects or the message where there is such a discrepancy, and vice versa. In addition, translation effects such as the overuse and underuse of certain directive speech act verbs were found in the translated English texts compared to the original English texts. Legal texts constitute particularly valuable material for the study of speech act verbs. Building up such a contrastive picture of Chinese and English speech act verbs in legal language would yield results of value and interest to legal translators and students of language for legal purposes, with practical application to legal translation between English and Chinese.

Keywords: contrastive analysis, corpus-based, directive speech act verbs, legal texts, translation between English and Chinese

Procedia PDF Downloads 501
601 From Abraham to Average Man: Game Theoretic Analysis of Divine Social Relationships

Authors: Elizabeth Latham

Abstract:

Billions of people worldwide profess some feeling of psychological or spiritual connection with the divine. The majority of them attribute this personal connection to the God of the Christian Bible. The objective of this research was to discover what could be known about the exact social nature of these relationships and to see whether they mimic the interactions recounted in the Bible; if a worldwide majority believes that the Christian Bible is a true account of God’s interactions with mankind, it is reasonable to assume that the interactions between God and the aforementioned people would be similar to the ones in the Bible. This analysis required an unusual method of biblical analysis: game theory. Because the research focused on documented social interaction between God and man in scripture, it was important to go beyond text-analysis methods. We used stories from the New Revised Standard Version of the Bible to set up “games” using economics-style matrices featuring each player’s motivations and possible courses of action, modeled after interactions in the Old and New Testaments between the Judeo-Christian God and some mortal person. We examined all relevant interactions for the objectives held by each party and their strategies for obtaining them. These findings were then compared to similar “games” created based on interviews with people subscribing to different levels of Christianity, ranging from the barely practicing to clergymen. The range was broad so as to look for a correlation between scriptural knowledge and game-similarity to the Bible. Each interview described a personal experience someone believed they had with God, and matrices were developed to describe each one as a social interaction: a “game” to be analyzed quantitatively. The data showed that in most cases, the social features of God-man interactions in the modern lives of people were like those present in the “games” between God and man in the Bible.
This similarity was referred to in the study as “biblical faith,” and it alone was a fascinating finding with many implications. The even more notable finding, however, was that the amount of game-similarity present did not correlate with the amount of scriptural knowledge. Each participant was also surveyed on family background, political stances, general education, and scriptural knowledge, and those who had biblical faith were not necessarily the ones who knew the Bible best. Instead, there was a high degree of correlation between biblical faith and family religious observance. It seems that to have a biblical psychological relationship with God, it is more important to have a religious family than to have studied scripture: a surprising insight with massive implications for the practice and preservation of religion.

Keywords: bible, Christianity, game theory, social psychology

Procedia PDF Downloads 157
600 A Method to Evaluate and Compare Web Information Extractors

Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman

Abstract:

Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data that is gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when it comes from server logs. Unfortunately, there are cases in which the data to be mined is the data that is displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components that are typically configured by means of rules that are tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires to evaluate and compare the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4 200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which leads to difficulties to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents. 
b) It provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather a survey that collects many features to take into account as well as related work. c) We provide a novel method to compute the performance measures for unsupervised proposals, which would otherwise require the intervention of a user to compute them by using the annotations on the evaluation sets and the information extracted. Our contributions will definitely help researchers in this area make sure that they have advanced the state of the art not only conceptually, but also from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas so that we can spread them to help improve the evaluation of information extraction proposals and gather valuable feedback from other researchers.
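The performance measures at the heart of such an evaluation typically reduce to precision, recall, and F1 over the extracted versus annotated records; a minimal sketch of that arithmetic follows (the (slot, value) record format and the sample data are illustrative, not the paper's actual representation):

```python
# Hedged sketch: scoring one extractor on one annotated document.
# The (slot, value) record representation and sample data are
# illustrative only, not the evaluation format used in the paper.

def score(extracted, annotated):
    """Return precision, recall, and F1 for sets of (slot, value) pairs."""
    extracted, annotated = set(extracted), set(annotated)
    tp = len(extracted & annotated)  # correctly extracted records
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(annotated) if annotated else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

annotated = [("title", "Book A"), ("price", "10"), ("author", "X")]
extracted = [("title", "Book A"), ("price", "12")]
print(score(extracted, annotated))  # precision 0.5, recall ~0.33, F1 ~0.4
```

Per-document scores of this kind are what statistically sound tests (e.g., rank-based tests across a document corpus) would then be run over.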

Keywords: web information extractors, information extraction evaluation method, Google Scholar, web

Procedia PDF Downloads 248
599 Humanistic Psychology Workshop to Increase Psychological Well-Being

Authors: Nidia Thalia Alva Rangel, Ferran Padros Blazquez, Ma. Ines Gomez Del Campo Del Paso

Abstract:

Happiness has been a concept of interest around the world since antiquity. Positive psychology is the science that began to study happiness in a more precise and controlled way, producing a wide body of research that can be applied. One of the central constructs of positive psychology is Carol Ryff’s model of psychological well-being as eudaimonic happiness, which comprises six dimensions: autonomy, environmental mastery, personal growth, positive relations with others, purpose in life, and self-acceptance. Humanistic psychology is a clear precedent of positive psychology; it has studied human development topics and features a great variety of intervention techniques, but it has little evidence from controlled research. Therefore, the present research had the aim of evaluating the efficacy of a humanistic intervention program to increase psychological well-being in healthy adults through a mixed methods study. Before and after the intervention, Carol Ryff’s Psychological Well-Being Scale (PWBS) and the Symptom Checklist-90 were applied as pretest and posttest. In addition, a questionnaire of five open questions was applied after each session. The intervention program was designed in an experiential workshop format, based on the foundational attitudes defined by Carl Rogers: congruence, unconditional positive regard, and empathy, integrating humanistic intervention strategies from gestalt therapy, psychodrama, logotherapy, and psychological body therapy, with the aim of strengthening skills in the six dimensions of the psychological well-being model. The workshop was applied to six volunteer adults in 12 sessions of 2 hours each. 
Finally, quantitative data were analyzed with the Wilcoxon signed-rank test through the SPSS program, yielding statistically significant differences in pathology symptoms between pretest and posttest; levels of the dimensions of psychological well-being also increased. For the qualitative strand, the open questionnaires showed how the participants experienced the techniques and changed through the sessions. Thus, the humanistic psychology program was effective in increasing psychological well-being. Working to promote well-being appears to be an effective way to reduce pathological symptoms as a secondary gain. Experiential workshops are a useful tool for small groups. More controlled research on humanistic psychology interventions in different contexts is needed to build the evidence base and promote the application of positive psychology knowledge.
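An analysis of this kind (paired pre/post scores from a small sample, compared with the Wilcoxon signed-rank test) can be reproduced in any statistics package; a minimal sketch with SciPy, using invented scores rather than the study's data, might look like:

```python
# Hedged sketch: paired pre/post comparison with the Wilcoxon
# signed-rank test, as in the study. The scores below are invented
# stand-ins for symptom scores, not the study's data.
from scipy.stats import wilcoxon

pretest  = [62, 55, 70, 66, 58, 61]   # symptom scores before the workshop
posttest = [48, 50, 60, 53, 49, 55]   # ... and after the 12 sessions

stat, p_value = wilcoxon(pretest, posttest)
print(f"W = {stat}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Statistically significant pre/post difference")
```

With only six participants, the exact (rather than asymptotic) distribution of the test statistic is used, which is why a rank-based test suits such small samples.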

Keywords: happiness, humanistic psychology, positive psychology, psychological well-being, workshop

Procedia PDF Downloads 416
598 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches

Authors: Mariam Matiashvili

Abstract:

Argumentation is an integral part of our daily communications - formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalizing opinions requires the use of particular syntactic-pragmatic structures - arguments that add credibility to the statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative. Knowing what elements make up an argumentative text in a particular language helps the users of that language improve their skills. Natural language processing (NLP) has also become especially relevant recently. In this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches - particularly the linguistic structure, characteristics, and functions of the parts of the argumentative text: claims, support, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental/controversial character and help to identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates. Consequently, the texts are of a dialogical nature, representing a discussion between two or more people (most often between a journalist and a politician). 
The research uses the following approaches to identify and analyze the argumentative structures: Lexical Classification and Analysis - identifying lexical items that are relevant in the process of creating argumentative texts and building a lexicon of argumentation (groups of words gathered from a semantic point of view); Grammatical Analysis and Classification - grammatical analysis of the words and phrases identified on the basis of the argumentation lexicon; Argumentation Schemes - describing and identifying the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between the above-mentioned components. For example, if an identified argumentation scheme is “Argument from Analogy”, the identified lexical items semantically express analogy as well, and they are most likely adverbs in Georgian. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. The linguistic analysis has shown that verbs play a crucial role in creating argumentative structures.
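The lexicon-lookup step described above can be sketched as a simple cue-word scan over sentences; English cue words stand in here for the Georgian lexicon entries, which the abstract does not list:

```python
# Hedged sketch of the lexicon-lookup step: flag sentences that contain
# entries from an argumentation lexicon, grouped semantically. The
# English cue words below are stand-ins for the Georgian entries.
ARG_LEXICON = {
    "analogy":   {"like", "similarly", "just as"},
    "causality": {"because", "therefore", "consequently"},
    "contrast":  {"however", "although", "whereas"},
}

def find_cues(sentence):
    """Return the semantic groups whose cue words occur in the sentence."""
    text = sentence.lower()
    return {group for group, cues in ARG_LEXICON.items()
            if any(cue in text for cue in cues)}

print(find_cues("We must act now, because delay is costly."))   # {'causality'}
print(find_cues("This reform, like the 2004 one, will fail."))  # {'analogy'}
```

A real pipeline would follow this with the grammatical analysis and scheme-matching stages the abstract describes; plain substring matching is only the first, coarsest filter.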

Keywords: Georgian, argumentation schemas, argumentation structures, argumentation lexicon

Procedia PDF Downloads 74
597 Experimental Research of High Pressure Jet Interaction with Supersonic Crossflow

Authors: Bartosz Olszanski, Zbigniew Nosal, Jacek Rokicki

Abstract:

An experimental study of a cold-jet (nitrogen) reaction control system has been carried out to investigate the flow control efficiency for low to moderate jet pressure ratios (total jet pressure p0jet over free stream static pressure in the wind tunnel p∞) and different angles of attack at a free-stream Mach number equal to 2. The jet influence was investigated on a flat plate geometry placed in the test section of the intermittent supersonic wind tunnel of the Department of Aerodynamics, WUT. Various convergent jet nozzle geometries were tested on the same test model geometry to obtain different jet momentum ratios. Surface static pressure measurements, Schlieren flow visualizations (using continuous and photoflash light sources), and load cell measurements gave insight into the supersonic crossflow interaction for different jet pressure and jet momentum ratios and into their influence on the efficiency of side jet control as described by the amplification factor (the ratio of the actual to the theoretical net force generated by the control nozzle). Moreover, quasi-steady numerical simulations of the flow through the same wind tunnel geometry (convergent-divergent nozzle plus test section) were performed using ANSYS Fluent with a Reynolds-Averaged Navier-Stokes (RANS) solver and the k-ω Shear Stress Transport (SST) turbulence model to assess the possible spurious influence of the test section walls on the jet exit near-field area of interest. The strong bow shock, barrel shock, and Mach disk, as well as the lambda separation region in front of the nozzle, were observed in images taken by a high-speed camera examining the interaction of the jet and the free stream. In addition, the development of large-scale vortex structures (a counter-rotating vortex pair) was detected. The history of the complex static pressure pattern on the plate was recorded and compared to the force measurement data as well as the numerical simulation data. 
The analysis of the obtained results, especially in the wake of the jet, revealed important features of the interaction mechanisms between the lateral jet and the flow field.
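The amplification factor defined above is the ratio of the measured side force to the theoretical thrust of the control nozzle; its arithmetic can be sketched as follows, using the standard momentum-plus-pressure ideal thrust expression and entirely invented numbers (not the experiment's data):

```python
# Hedged sketch: amplification factor K = measured side force over the
# theoretical (ideal) thrust of the control nozzle. All numbers below
# are invented; the thrust model is the standard momentum + pressure
# expression, not the authors' specific reduction procedure.

def ideal_thrust(m_dot, v_exit, p_exit, p_ambient, a_exit):
    """Theoretical thrust [N]: momentum flux plus pressure imbalance."""
    return m_dot * v_exit + (p_exit - p_ambient) * a_exit

m_dot = 0.05      # jet mass flow rate [kg/s]
v_exit = 310.0    # exit velocity [m/s] (sonic for a choked convergent nozzle)
p_exit = 60e3     # nozzle exit static pressure [Pa]
p_amb = 7e3       # test-section static pressure [Pa]
a_exit = 2.0e-5   # nozzle exit area [m^2]

f_theory = ideal_thrust(m_dot, v_exit, p_exit, p_amb, a_exit)
f_measured = 18.1  # net side force from the load cell [N] (invented)
print(f"K = {f_measured / f_theory:.2f}")  # K > 1: crossflow amplifies the jet
```

A measured K above unity indicates that the jet-crossflow interaction (the high-pressure region upstream of the jet) adds force beyond the nozzle's own thrust.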

Keywords: flow visualization techniques, pressure measurements, reaction control jet, supersonic cross flow

Procedia PDF Downloads 300
596 Switching of Series-Parallel Connected Modules in an Array for Partially Shaded Conditions in a Pollution Intensive Area Using High Powered MOSFETs

Authors: Osamede Asowata, Christo Pienaar, Johan Bekker

Abstract:

Photovoltaic (PV) modules may become a trend for future PV systems because of their greater flexibility in distributed system expansion, easier installation due to their nature, and higher system-level energy harnessing capabilities under shaded or PV manufacturing mismatch conditions, as compared to single or multi-string inverters. Residential-scale PV arrays are commonly connected to the grid by a single DC-AC inverter connected to a series, parallel, or series-parallel string of PV panels, or by many small DC-AC inverters which connect one or two panels directly to the AC grid. With increasing worldwide interest in sustainable energy production and use, there is renewed focus on the power electronic converter interface for DC energy sources. Three specific examples of such DC energy sources that will have a role in distributed generation and sustainable energy systems are the photovoltaic (PV) panel, the fuel cell stack, and batteries of various chemistries. A high-efficiency inverter using Metal Oxide Semiconductor Field-Effect Transistors (MOSFETs) for all active switches is presented for non-isolated photovoltaic and AC-module applications. The proposed configuration features high efficiency over a wide load range, low ground leakage current, and low output AC-current distortion with no need for split capacitors. The detailed power stage operating principles, pulse width modulation scheme, multilevel bootstrap power supply, and integrated gate drivers for the proposed inverter are described. Experimental results from a hardware prototype show not only that the MOSFETs are efficient in the system but also that the ground leakage current issues are alleviated in the proposed inverter, and a maximum driver-circuit efficiency of 98% is achieved. This, in turn, motivates a possible photovoltaic panel switching technique, which will help to reduce the effect of cloud movements as well as improve the overall efficiency of the system.

Keywords: grid connected photovoltaic (PV), Matlab efficiency simulation, maximum power point tracking (MPPT), module integrated converters (MICs), multilevel converter, series connected converter

Procedia PDF Downloads 127
595 Micro-Milling Process Development of Advanced Materials

Authors: M. A. Hafiz, P. T. Matevenga

Abstract:

Micro-level machining of metals is a developing field which has proven to be a prospective approach to produce features on parts in the range of a few to a few hundred microns with acceptable machining quality. It is known that the mechanics (i.e., the material removal mechanism) of micro-machining and conventional machining differ significantly due to the scaling effects associated with tool geometry, tool material, and workpiece material characteristics. Shape memory alloys (SMAs) are metal alloys which display two exceptional properties: pseudoelasticity and the shape memory effect (SME). Nickel-titanium (NiTi) alloys are one such unique family of metal alloys. NiTi alloys are known to be difficult-to-cut materials, specifically by conventional machining techniques, due to their distinctive properties. Their high ductility, high degree of strain hardening, and unusual stress–strain behaviour are the main properties accountable for their poor machinability in terms of tool wear and workpiece quality. The motivation of this research work was to address the challenges and issues of micro-machining combined with those of machining NiTi alloy, which can affect the desired performance level of machining outputs. To explore the significance of a range of cutting conditions on surface roughness and tool wear, machining tests were conducted on NiTi. The influence of different cutting conditions and cutting tools on surface and sub-surface deformation in the workpiece was investigated. A design-of-experiments strategy (L9 array) was applied to determine the key process variables. The dominant cutting parameters were determined by analysis of variance. These findings showed that feed rate was the dominant factor for surface roughness, whereas depth of cut was found to be the dominant factor as far as tool wear was concerned. 
The lowest surface roughness was achieved at a feed rate equal to the cutting edge radius, whereas the lowest flank wear was observed at the lowest depth of cut. Repeated machining trials have yet to be carried out in order to observe tool life, sub-surface deformation, and strain-induced hardening, which are also expected to be amongst the critical issues in micro-machining of NiTi. The machining performance using different cutting fluids and strategies has yet to be studied.
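The dominant-factor analysis described above can be illustrated with a main-effects (range) calculation on an L9-style orthogonal layout; the factor levels and roughness values below are invented to show only the arithmetic, not the study's data:

```python
# Hedged sketch: main-effects (range) analysis on an L9-style design.
# Columns: feed rate, depth of cut, cutting speed (levels 1-3); the
# response is surface roughness Ra [um]. All values are invented.
import statistics

runs = [  # (feed, depth, speed, Ra)
    (1, 1, 1, 0.21), (1, 2, 2, 0.24), (1, 3, 3, 0.23),
    (2, 1, 2, 0.35), (2, 2, 3, 0.38), (2, 3, 1, 0.36),
    (3, 1, 3, 0.52), (3, 2, 1, 0.55), (3, 3, 2, 0.54),
]

def main_effect(factor):  # factor index: 0=feed, 1=depth, 2=speed
    means = [statistics.mean(r[3] for r in runs if r[factor] == lvl)
             for lvl in (1, 2, 3)]
    return max(means) - min(means)  # larger range => more dominant factor

effects = {name: main_effect(i)
           for i, name in enumerate(["feed", "depth", "speed"])}
dominant = max(effects, key=effects.get)
print(effects, "->", dominant)  # feed rate dominates surface roughness
```

A formal analysis of variance, as used in the study, partitions the response variance by factor; the range analysis above is the quick graphical-method counterpart that usually identifies the same dominant factor.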

Keywords: nickel titanium, micro-machining, surface roughness, machinability

Procedia PDF Downloads 340
594 Library Support for the Intellectually Disabled: Book Clubs and Universal Design

Authors: Matthew Conner, Leah Plocharczyk

Abstract:

This study examines the role of academic libraries in support of the intellectually disabled (ID) in post-secondary education. With the growing public awareness of the ID, there has been recognition of their need for post-secondary educational opportunities. This was an unforeseen result for a population that has been associated with elementary levels of education, yet the reasons are compelling. After aging out of the school system, the ID need and deserve educational and social support as much as anyone. Moreover, the commitment to diversity in higher education rings hollow if this group is excluded. Yet, challenges remain in integrating the ID into a college curriculum. This presentation focuses on the role of academic libraries. Neglecting this vital resource for the support of the ID is unthinkable, yet the library’s contribution is not clear. Library collections presume reading ability, and libraries already struggle to meet their traditional goals with the resources available. This presentation examines how academic libraries can support post-secondary ID students. For context, the presentation first examines the state of post-secondary education for the ID with an analysis of data on the United States compiled by the ThinkCollege! Project. Geographic Information Systems (GIS) and statistical analysis will show regional and methodological trends in post-secondary support of the ID, which currently lacks any significant involvement by college libraries. Then, the presentation analyzes a case study of a book club at the Florida Atlantic University (FAU) libraries which has run for several years. Issues such as the selection of books, effective pedagogies, and evaluation procedures will be examined. The study has found that the instruction pedagogies used by libraries can be extended through concepts of Universal Learning Design (ULD) to effectively engage the ID. 
In particular, student-centered, participatory methodologies that accommodate different learning styles have proven to be especially useful. The choice of text is complex and determined not only by reading ability but also by familiarity with the subject and features of the ID students’ developmental trajectory. The selection of text is not only a necessity but also promises to give insight into the ID. Assessment remains a complex and unresolved subject, but the voluntary, sustained, and enthusiastic attendance of the ID students is an undeniable indicator. The study finds that, through the traditional library vehicle of the book club, academic libraries can support ID students through training in both reading and socialization, two major goals of their post-secondary education.

Keywords: academic libraries, intellectual disability, literacy, post-secondary education

Procedia PDF Downloads 164
593 Interventions for Children with Autism Using Interactive Technologies

Authors: Maria Hopkins, Sarah Koch, Fred Biasini

Abstract:

Autism is a lifelong disorder that affects one out of every 110 Americans. The deficits that accompany Autism Spectrum Disorders (ASD), such as abnormal behaviors and social incompetence, often make it extremely difficult for these individuals to gain functional independence from caregivers. These long-term implications necessitate an immediate effort to improve social skills among children with an ASD. Any technology that could teach individuals with ASD necessary social skills would not only be invaluable for the individuals affected but could also effect massive savings to society in treatment programs. The overall purpose of the first study was to develop, implement, and evaluate an avatar tutor for social skills training in children with ASD. FaceSay was developed as a colorful computer program that contains several different activities designed to teach children specific social skills, such as eye gaze, joint attention, and facial recognition. The children with ASD were asked to attend to FaceSay or a control painting computer game for six weeks. Children with ASD who received the training had an increase in emotion recognition, F(1, 48) = 23.04, p < 0.001 (adjusted Ms: 8.70 and 6.79, respectively), compared to the control group. In addition, children who received the FaceSay training had higher post-test scores in facial recognition, F(1, 48) = 5.09, p < 0.05 (adjusted Ms: 38.11 and 33.37, respectively), compared to controls. The findings provide information about the benefits of computer-based training for children with ASD. Recent research suggests the value of also using socially assistive robots with children who have an ASD. Researchers investigating robots as tools for therapy in ASD have reported increased engagement, increased levels of attention, and novel social behaviors when robots are part of the social interaction. 
The overall goal of the second study was to develop a social robot designed to teach children specific social skills such as emotion recognition. The robot is approachable, with both an animal-like appearance and features of a human face (i.e., eyes, eyebrows, mouth). The feasibility of the robot is being investigated with children ages 7-12 to explore whether the social robot is capable of forming different facial expressions to accurately display emotions similar to those observed in the human face. The findings of this study will be used to create a potentially effective and cost-efficient therapy for improving the cognitive-emotional skills of children with autism. Implications and study findings using the robot as an intervention tool will be discussed.

Keywords: autism, intervention, technology, emotions

Procedia PDF Downloads 382
592 The Geometrical Cosmology: The Projective Cast of the Collective Subjectivity of the Chinese Traditional Architectural Drawings

Authors: Lina Sun

Abstract:

Chinese traditional drawings related to buildings and construction apply a unique geometry differing from western Euclidean geometry and embrace a collection of special terminologies under the category of tu (the Chinese character for drawing). This paper will, on one side, etymologically analyze the terminologies of Chinese traditional architectural drawing and, on the other side, geometrically deconstruct the composition of tu and locate the visual narrative language of tu in the pictorial tradition. The geometrical analysis will center on a selected series of Yang-shi-lei tu of the construction of emperors’ mausoleums in the Qing Dynasty (1636-1912), and will also draw on earlier architectural drawings and architectural paintings, such as jiehua and paintings on religious and tomb frescoes, as comparisons. By doing this, the research will reveal that the terminologies corresponding to different geometrical forms indicate associations between architectural drawing and the philosophy of Chinese cosmology, and that the arrangement of the geometrical forms in the visual picture plane facilitates expressions of the concepts of space and position in the geometrical cosmology. These associations and expressions are the collective intentions of architectural drawing, evolving in a thousands-of-years tradition without breakage and irrelevant to individual authorship. Moreover, the architectural tu itself, as an entity, not only functions as the representation of buildings but also expresses intentions and strengthens them by using the Chinese unique geometrical language flexibly and intentionally. These collective cosmological spatial intentions and the corresponding geometrical words and languages reveal that Chinese traditional architectural drawing functions as a unique architectural site with subjectivity, which exists parallel with buildings and expresses intentions and meanings by itself. 
The methodology and the findings of this research will, therefore, challenge previous research that treats architectural drawings just as representations of buildings and understands them as no more than evidence for reconstructing information about buildings. Furthermore, this research will situate architectural drawing between the study of Chinese technological tu and that of artistic painting, bridging two academic areas which have usually treated the partial features of architectural drawing separately. Beyond this research, the collective subjectivity of Chinese traditional drawings will facilitate the revealing of the transitional experience from tradition to drawing modernity, where the individual subjective identities and intentions of architects arise. This research will support the understanding of both the ambivalence and the affinity of drawing modernity encountering the traditions.

Keywords: Chinese traditional architectural drawing (tu), etymology of tu, collective subjectivity of tu, geometrical cosmology in tu, geometry and composition of tu, Yang-shi-lei tu

Procedia PDF Downloads 123
591 CRISPR-Mediated Genome Editing for Yield Enhancement in Tomato

Authors: Aswini M. S.

Abstract:

Tomato (Solanum lycopersicum L.) is one of the most significant vegetable crops in terms of its economic benefits. Both fresh and processed tomatoes are consumed. Tomatoes have a limited genetic base, which makes breeding extremely challenging. Plant breeding has become much simpler and more effective with the genome editing tools of CRISPR and the CRISPR-associated protein 9 (CRISPR/Cas9), which address the problems with traditional breeding, chemical/physical mutagenesis, and transgenics. With the use of CRISPR/Cas9, a number of tomato traits have been functionally distinguished and edited. These traits include plant architecture and flower characters (leaf, flower, male sterility, and parthenocarpy), fruit ripening, quality and nutrition (lycopene, carotenoid, GABA, TSS, and shelf-life), disease resistance (late blight, TYLCV, and powdery mildew), tolerance to abiotic stress (heat, drought, and salinity), and resistance to herbicides. This study explores the potential of CRISPR/Cas9 genome editing for enhancing yield in tomato plants. The study utilized CRISPR/Cas9 genome editing technology to functionally edit various traits in tomatoes. The de novo domestication of elite features from wild relatives into cultivated tomatoes, and vice versa, has been demonstrated through CRISPR/Cas9. Cas9-mediated editing of the CycB (lycopene beta-cyclase) gene increased the lycopene content in tomato. Also, Cas9-mediated editing of the AGL6 (Agamous-like 6) gene resulted in parthenocarpic fruit development under heat-stress conditions. The advent of CRISPR/Cas has made it possible to use digital resources for single guide RNA design and multiplexing, cloning (such as Golden Gate cloning, GoldenBraid, etc.), creating robust CRISPR/Cas constructs, and implementing effective transformation protocols like Agrobacterium-mediated transformation and the DNA-free protoplast method for Cas9-gRNA ribonucleoprotein (RNP) complexes. 
Additionally, homologous recombination (HR)-based gene knock-in (HKI) via geminivirus replicons and base/prime editing (Target-AID technology) are also possible. Hence, CRISPR/Cas facilitates fast and efficient breeding for the improvement of tomato.
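The single guide RNA design mentioned above reduces, at its simplest, to scanning a target sequence for 20-nt protospacers immediately upstream of an NGG PAM (the SpCas9 recognition motif); a minimal forward-strand sketch, not any particular design tool, follows:

```python
# Hedged sketch: enumerate candidate SpCas9 single-guide RNA targets as
# the 20-nt sequence immediately 5' of an NGG PAM. Forward strand only;
# real design tools also scan the reverse strand, score off-targets,
# and check GC content. The example sequence is invented.
import re

def find_sgrnas(seq):
    """Yield (position, protospacer, PAM) for every NGG PAM site in seq."""
    seq = seq.upper()
    # lookahead so overlapping PAM sites are all reported
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        yield m.start(1), m.group(1), m.group(2)

dna = "ATGCTTACGGATCCGTTAGCAATGGCTAGCTAGGACTT"
for pos, guide, pam in find_sgrnas(dna):
    print(pos, guide, pam)
```

Candidate enumeration of this kind is the first step the digital design resources cited in the abstract automate, before multiplexing and construct assembly.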

Keywords: CRISPR-Cas, biotic and abiotic stress, flower and fruit traits, genome editing, polygenic trait, tomato and trait introgression

Procedia PDF Downloads 71
590 Advancing Microstructure Evolution in Tungsten Through Rolling in Laser Powder Bed Fusion

Authors: Narges Shayesteh Moghaddam

Abstract:

Tungsten (W), a refractory metal known for its remarkably high melting temperature, offers tremendous potential for use in challenging environments prevalent in sectors such as space exploration, defense, and nuclear industries. Additive manufacturing, especially the Laser Powder-Bed Fusion (LPBF) technique, emerges as a beneficial method for fabricating tungsten parts. This technique enables the production of intricate components while simultaneously reducing production lead times and associated costs. However, the inherent brittleness of tungsten and its tendency to crack under high-temperature conditions pose significant challenges to the manufacturing process. Our research primarily focuses on the process of rolling tungsten parts in a layer-by-layer manner in LPBF and the subsequent changes in microstructure. Our objective is not only to identify the alterations in the microstructure but also to assess their implications for the physical properties and performance of the fabricated tungsten parts. To examine these aspects, we conducted an extensive series of experiments that included the fabrication of tungsten samples through LPBF and subsequent characterization using advanced materials analysis techniques. These investigations allowed us to scrutinize shifts in various microstructural features during the rolling process, including, but not limited to, grain size and grain boundaries. The results of our study provide crucial insights into how specific factors, such as plastic deformation occurring during the rolling process, influence the microstructural characteristics of the fabricated parts. This information is vital as it provides a foundation for understanding how the parameters of the layer-by-layer rolling process affect the final tungsten parts. Our research significantly broadens the current understanding of microstructural evolution in tungsten parts produced via the layer-by-layer rolling process in LPBF. 
The insights obtained will play a pivotal role in refining and optimizing manufacturing parameters, thus improving the mechanical properties of tungsten parts and, therefore, enhancing their performance. Furthermore, these findings will contribute to the advancement of manufacturing techniques, facilitating the wider application of tungsten parts in various high-demand sectors. Through these advancements, this research represents a significant step towards harnessing the full potential of tungsten in high-temperature and high-stress applications.

Keywords: additive manufacturing, rolling, tungsten, refractory materials

Procedia PDF Downloads 99
589 An Advanced Numerical Tool for the Design of Through-Thickness Reinforced Composites for Electrical Applications

Authors: Bing Zhang, Jingyi Zhang, Mudan Chen

Abstract:

Fibre-reinforced polymer (FRP) composites have been extensively utilised in various industries, e.g., aerospace, renewable energy, automotive, and marine, due to their high specific strength. However, they have relatively low electrical conductivity compared to metals, especially in the out-of-plane direction. Conductive metal strips or meshes are typically employed to protect composites when designing lightweight structures that may be subjected to lightning strikes, such as composite wings. Unfortunately, this approach undermines the lightweight advantages of FRP composites, thereby limiting their potential applications. Extensive studies have been undertaken to improve the electrical conductivity of FRP composites. The authors are amongst the pioneers who use through-thickness reinforcement (TTR) to tailor the electrical conductivity of composites. Compared to conventional approaches using conductive fillers, the through-thickness reinforcement approach has been proven to offer a much larger improvement in the through-thickness conductivity of composites. In this study, an advanced high-fidelity numerical modelling strategy is presented to investigate the effects of through-thickness reinforcement on both the in-plane and out-of-plane electrical conductivities of FRP composites. The critical micro-structural features of through-thickness reinforced composites incorporated in the modelling framework are 1) the fibre waviness formed due to TTR insertion; 2) the resin-rich pockets formed due to resin flow in the curing process following TTR insertion; 3) the fibre crimp, i.e., fibre distortion in the thickness direction of composites caused by TTR insertion forces. In addition, each interlaminar interface is described separately. An IMA/M21 composite laminate with a quasi-isotropic stacking sequence is employed to calibrate and verify the modelling framework. 
The modelling results agree well with experimental measurements for both the in-plane and out-of-plane conductivities. It has been found that the presence of conductive TTR can increase the out-of-plane conductivity by around one order of magnitude, but there is less improvement in the in-plane conductivity, even at a TTR areal density of 0.1%. This numerical tool provides valuable reference as a design tool for through-thickness reinforced composites when exploring their electrical applications. Parametric studies are undertaken using the numerical tool to investigate critical parameters that affect the electrical conductivities of composites, including TTR material, TTR areal density, stacking sequence, and interlaminar conductivity. Suggestions regarding the design of electrical through-thickness reinforced composites are derived from the numerical modelling campaign.
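The order-of-magnitude gain from a small TTR fraction can be illustrated with a back-of-the-envelope parallel-conduction (rule-of-mixtures) estimate; this is emphatically not the paper's high-fidelity model, and every number below is invented:

```python
# Hedged sketch: parallel rule-of-mixtures estimate of out-of-plane
# conductivity with conductive through-thickness reinforcement (TTR).
# A back-of-envelope bound, not the paper's finite-element model;
# all values are invented.

def through_thickness_sigma(sigma_composite, sigma_ttr, v_ttr):
    """Effective out-of-plane conductivity [S/m] for TTR fraction v_ttr."""
    return (1 - v_ttr) * sigma_composite + v_ttr * sigma_ttr

sigma_z = 1.0      # baseline out-of-plane conductivity [S/m] (invented)
sigma_ttr = 1e4    # conductivity of the TTR material [S/m] (invented)
v = 0.001          # 0.1% fraction, matching the areal density in the abstract

sigma_eff = through_thickness_sigma(sigma_z, sigma_ttr, v)
print(f"{sigma_eff:.1f} S/m, ~{sigma_eff / sigma_z:.0f}x baseline")
```

Because the TTR pins conduct in parallel with the poorly conducting thickness direction, even a 0.1% fraction of a highly conductive material dominates the effective conductivity, which is consistent with the roughly one-order improvement the study reports.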

Keywords: composite structures, design, electrical conductivity, numerical modelling, through-thickness reinforcement

Procedia PDF Downloads 89
588 Corporate Social Responsibility in the Libyan Commercial Banks: Reality and Issues

Authors: Khalid Alshaikh

Abstract:

Corporate Social Responsibility (CSR) in Libya has recently gained momentum, especially with the rise of the social issues that ensued from the recent war. CSR is a new organisational culture designing its features and route within the Libyan financial institutions. Now, both the public and private banks invest in this construct, trusting that it is capable of improving the economic, social, and environmental problems the conflict has created. On the other hand, the Libyan commercial banks recognise the benefits of utilising CSR to entice investors and ensure their continuation in the national and international markets. Nevertheless, as a new concept, CSR necessitates an in-depth exploration and analysis to help its transition from the margins of religion to the mainstream of society and businesses. This can assist in structuring its activities to bring about change nationwide. Therefore, this paper intends to explore the current definitions attached to this term by tracing back its historical beginnings. Then, it investigates its trends in both the public and private banks to identify where its sustainable development materialises. Lastly, it seeks to understand the key challenges that obscure its success in the Libyan environment. The research used both public and private banks as case studies and employed qualitative interviews with ten members of Boards of Directors (BoDs) and eleven Chief Executive Managers (CEOs) to discover how CSR is defined, the core CSR activities practiced by the Libyan Commercial Banks (LCBs), and the key constraints that CSR faces and that make it unsuccessful. The findings suggest that CSR is still influenced by the power of religion. Nevertheless, the Islamic perspective is more consistent with the social contract concept of CSR. The LCBs do not solely focus on the economic side of maximizing profits, but also concentrate on its morality. 
The issue is that CSR activities are not enough to achieve meaningful public charity and need strategies to address major social issues. Moreover, shareholders do not support CSR activities; their argument is that the only social responsibility of businesses is to maximize profits, while the government should deal with the existing social issues. Finally, although the LCBs endeavour to embed CSR in their organisational culture, different stakeholders still need to do much more to entrench this construct through their core functions. The Central Bank of Libya also needs to boost its standing to be more influential and ensure that the right discussions about CSR happen with the right stakeholders involved.

Keywords: corporate social responsibility, private banks, public banks, stakeholders

Procedia PDF Downloads 189
587 N-Glycosylation in the Green Microalgae Chlamydomonas reinhardtii

Authors: Pierre-Louis Lucas, Corinne Loutelier-Bourhis, Narimane Mati-Baouche, Philippe Chan Tchi-Song, Patrice Lerouge, Elodie Mathieu-Rivet, Muriel Bardor

Abstract:

N-glycosylation is a post-translational modification taking place in the Endoplasmic Reticulum and the Golgi apparatus, where defined glycan features are added onto proteins at a very specific sequence, Asn-X-Thr/Ser/Cys, where X can be any amino acid except proline. Because it is well established that these N-glycans play a critical role in protein biological activity and half-life, and that a different N-glycan structure may induce an immune response, they are very important for biopharmaceuticals, which are mainly glycoproteins bearing N-glycans. To date, most biopharmaceuticals are produced in mammalian cells such as Chinese Hamster Ovary (CHO) cells, whose N-glycosylation is similar to that of humans, but due to the high production costs, several other species are being investigated as possible alternative systems. For this purpose, the green microalga Chlamydomonas reinhardtii was investigated as a potential production system for biopharmaceuticals. This choice was influenced by the fact that C. reinhardtii is a well-studied, fast-growing microalga with many molecular biology tools available, and that it N-glycosylates its endogenous proteins. However, analysis of the N-glycan structures of this microalga has revealed some differences compared to humans. Unlike in humans, where the glycans are processed by the key enzymes N-acetylglucosaminyltransferase I and II (GnTI and GnTII), which add GlcNAc residues to form a GlcNAc₂Man₃GlcNAc₂ core N-glycan, C. reinhardtii lacks these two enzymes and possesses a GnTI-independent glycosylation pathway. Moreover, some enzymes not present in humans, such as xylosyltransferases and methyltransferases, are thought to act on the glycans of C. reinhardtii. Furthermore, a recent structural study by mass spectrometry showed that the N-glycosylation precursor, supposed to be conserved in almost all eukaryotic cells, results in a linear Man₅GlcNAc₂ in C. reinhardtii rather than a branched one.
In this work, we will discuss the newly released MS information on the C. reinhardtii N-glycan structure and its impact on our attempts to modify the glycans in a human manner. Two strategies will be discussed. The first consists of studying xylosyltransferase insertional mutants from the CLiP library in order to remove xyloses from the N-glycans. The second goes further in the humanization by transforming the microalga with exogenous genes from Toxoplasma gondii encoding activities similar to GnTI and GnTII, with the aim of synthesizing GlcNAc₂Man₃GlcNAc₂ in C. reinhardtii.
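The sequon rule stated in the abstract (Asn-X-Thr/Ser/Cys, where X is any amino acid except proline) is straightforward to check computationally; the sketch below scans a protein sequence for candidate N-glycosylation sites. The function name and the example sequence are illustrative, not taken from the study:

```python
import re

# Sequon pattern: Asn (N), then any residue except Pro, then Thr/Ser/Cys.
# A lookahead is used so that overlapping sequons are all reported.
SEQUON = re.compile(r"(?=(N[^P][STC]))")

def find_sequons(protein: str) -> list[tuple[int, str]]:
    """Return (1-based position, triplet) for every candidate sequon."""
    return [(m.start() + 1, m.group(1)) for m in SEQUON.finditer(protein)]

# Made-up example sequence: NAT and NVT are sequons, NPS is not (X = Pro).
seq = "MKNATLLPNPSWQNGSVHNVTK"
for pos, triplet in find_sequons(seq):
    print(pos, triplet)
```

Note that a matching sequon only marks a potential site; whether it is actually glycosylated depends on the cellular machinery discussed above.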

Keywords: Chlamydomonas reinhardtii, N-glycosylation, glycosyltransferase, mass spectrometry, humanization

Procedia PDF Downloads 178
586 Smart Irrigation Systems and Website: Based Platform for Farmer Welfare

Authors: Anusha Jain, Santosh Vishwanathan, Praveen K. Gupta, Shwetha S., Kavitha S. N.

Abstract:

Agriculture has a major impact on the Indian economy, with the highest employment ratio of any sector in the country. Currently, most traditional agricultural practices and farming methods are manual, so farmers often do not realize their maximum productivity: labour costs are rising, and inefficient use of water sources leads to water wastage and inadequate soil moisture content, subsequently contributing to food insecurity in the country. This research paper aims to solve this problem by developing a full-fledged web-application-based platform that can associate itself with a microcontroller-based automated irrigation system. The system schedules the irrigation of crops based on real-time soil moisture content, employing soil moisture sensors matched to the crop’s requirements and using WSN (wireless sensor network) and M2M (machine-to-machine communication) concepts, thus optimizing the use of the available limited water resource and maximizing the crop yield. This robust automated irrigation system provides end-to-end automation of crop irrigation under any circumstances, such as droughts, irregular rainfall patterns, and extreme weather conditions. The platform is also intended to build a nationwide united farming community, ensure the welfare of farmers, and equip them with prerequisite knowledge of technology and the latest farming practices. To achieve this, the MailChimp mailing service is used: the email IDs of interested farmers and individuals are recorded, and curated articles on innovations in the world of agriculture are sent to the farmers via e-mail. The proposed system also provides a service through which nearby crop vendors can enter their pickup locations, accepted prices and other relevant information, enabling farmers to choose their vendors wisely.
Along with this, we have created a blogging service that will enable farmers and agricultural enthusiasts to share experiences, helpful knowledge, hardships, etc., with the entire farming community. These are some of the many features that the platform has to offer.
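The scheduling idea described above can be sketched as a simple threshold rule with hysteresis: irrigate when the soil is drier than a crop-specific minimum, and keep irrigating until a target level is reached. The sketch below is a minimal illustration of that logic; the class, function, and threshold values are assumptions, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class CropProfile:
    name: str
    min_moisture: float     # % moisture below which irrigation starts
    target_moisture: float  # % moisture at which irrigation stops

def irrigation_command(reading: float, crop: CropProfile, valve_open: bool) -> bool:
    """Decide the desired valve state from the latest sensor reading."""
    if reading < crop.min_moisture:
        return True   # soil too dry: start (or keep) irrigating
    if valve_open and reading < crop.target_moisture:
        return True   # keep irrigating until the target level is reached
    return False      # moist enough: valve off

# Hypothetical crop profile and a dry-soil reading from one sensor node
maize = CropProfile("maize", min_moisture=20.0, target_moisture=32.0)
print(irrigation_command(15.0, maize, valve_open=False))  # True
```

In a real WSN/M2M deployment the reading would arrive from a field node and the returned state would be sent back to the microcontroller driving the valve.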

Keywords: WSN (wireless sensor networks), M2M (machine-to-machine communication), automation, irrigation system, sustainability, SAAS (software as a service), soil moisture sensor

Procedia PDF Downloads 131
585 Anatomical Investigation of Superficial Fascia Relationships with the Skin and Underlying Tissue in the Greyhound Rump, Thigh, and Crus

Authors: Oday A. Al-Juhaishi, Sa`ad M. Ismail, Hung-Hsun Yen, Christina M. Murray, Helen M. S. Davies

Abstract:

The functional anatomy of the fascia in the greyhound is still poorly understood and incompletely described. The basic knowledge of fascia stems mainly from anatomical, histological and ultrastructural analyses. In this study, twelve hindlimb specimens from six fresh greyhound cadavers (3 male, 3 female) were used to examine the topographical relationships of the superficial fascia with the skin and underlying tissue. The first incision was made along the dorsal midline from the level of the thoracolumbar junction caudally to the level of the mid sacrum. The second incision began at the level of the first incision and extended along the midline of the lateral aspect of the hindlimb distally, to just proximal to the tarsus, and the skin margins were carefully separated to observe connective tissue links between the skin and superficial fascia, attachment points of the fascia, and the relationships of the fascia with the blood vessels that supply the skin. A digital camera was used to record the anatomical features as they were revealed. The dissections identified fibrous septa connecting the skin with the superficial fascia and deep fascia in specific areas. Adipose tissue was found to be very rare within the superficial fascia in these specimens. On the extensor aspects of some joints, a fusion between the superficial fascia and deep fascia was observed. This fusion created a subcutaneous bursa in the following areas: a prepatellar bursa of the stifle, a tarsal bursa caudal to the calcaneus bone, and an ischiatic bursa caudal to the ischiatic tuberosity. The evaluation of blood vessels showed that the perforating vessels passed through the fibrous septa perpendicularly to supply the skin, with the largest branch noted in the gluteal area. The attachment points between the superficial fascia and skin were mainly found in the region of the flexor aspect of the joints, such as caudal to the stifle joint.
The numerous fibrous septa between the superficial fascia and skin that have been identified in some areas may support the blood vessels that penetrate the fascia into the skin, while allowing for movement between the tissue planes. The subcutaneous bursae between the skin and the superficial fascia, where it is fused with the deep fascia, may serve to decrease friction between moving areas. The adhesion points may be related to the integrity and loading of the skin. The attachment points fix the skin and appear to divide the hindlimb into anatomical compartments.

Keywords: attachment points, fibrous septa, greyhound, subcutaneous bursa, superficial fascia

Procedia PDF Downloads 359
584 Comparison of Different Methods of Microorganism's Identification from a Copper Mining in Pará, Brazil

Authors: Louise H. Gracioso, Marcela P.G. Baltazar, Ingrid R. Avanzi, Bruno Karolski, Luciana J. Gimenes, Claudio O. Nascimento, Elen A. Perpetuo

Abstract:

Introduction: Higher copper concentrations exert a selection pressure on organisms such as plants, fungi and bacteria, allowing only the organisms resistant to the contaminated site to survive. This selective pressure keeps only the organisms most resistant to a specific condition and subsequently increases their bioremediation potential. Despite the importance of bacteria for biosphere maintenance, it is estimated that only a small fraction of living microbial species has been described and characterized. Owing to developments in molecular biology, tools based on the analysis of 16S ribosomal RNA or other specific genes are creating a new scenario for the characterization and identification of microorganisms in the environment. New methods of identifying microorganisms have also emerged, such as the Biotyper (MALDI-TOF); this mass spectrometry method relies on the recognition of spectral patterns of conserved, characteristic proteins of different microbial species. In view of this, this study aimed to isolate copper-resistant bacteria present in a copper processing area (Sossego Mine, Canaan, PA) and to identify them by two different methods: a recent one (mass spectrometry) and a conventional one (16S gene sequencing), with a view to using them for future bioremediation of this mining area. Material and Methods: Samples were collected at fifteen different sites over five time periods. Microorganisms were isolated from mining wastes by a culture enrichment technique; this procedure was repeated 4 times. The isolates were inoculated into MJS medium containing different concentrations of copper chloride (1 mM, 2.5 mM, 5 mM, 7.5 mM and 10 mM) and incubated on plates for 72 h at 28 ºC. These isolates were then subjected to both identification methods (Biotyper MALDI-TOF and 16S gene sequencing).
Results: A total of 105 strains were isolated from this area. Bacterial identification by the mass spectrometry method (MALDI-TOF) achieved 74% agreement with the conventional identification method (16S); 31% were unsuccessful in MALDI-TOF, and 2% did not yield an identification by 16S sequencing. These results show that the Biotyper can be a very useful tool in the identification of bacteria isolated from environmental samples, since it offers better value for money (sample preparation is cheap and simple, and MALDI plates are reusable). Furthermore, this technique is more cost-effective because it saves time and has high throughput (the mass spectra are compared to the database, and each sample takes less than 2 minutes).
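The agreement figure reported above is simply the share of strains for which both pipelines returned the same species label. The sketch below illustrates that computation; the strain IDs and species names are made up for illustration and are not from the study's dataset:

```python
def percent_agreement(ids_a: dict[str, str], ids_b: dict[str, str]) -> float:
    """Percentage of shared strains where both methods gave the same label."""
    shared = [s for s in ids_a if s in ids_b]
    if not shared:
        return 0.0
    matches = sum(1 for s in shared if ids_a[s] == ids_b[s])
    return 100.0 * matches / len(shared)

# Hypothetical identifications: two of three strains agree
maldi = {"S1": "Bacillus sp.", "S2": "Pseudomonas sp.", "S3": "Bacillus sp."}
seq16s = {"S1": "Bacillus sp.", "S2": "Pseudomonas sp.", "S3": "Cupriavidus sp."}
print(percent_agreement(maldi, seq16s))
```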

Keywords: copper mining area, bioremediation, microorganisms, identification, MALDI-TOF, 16S rRNA

Procedia PDF Downloads 378
583 CFD Modeling of Stripper Ash Cooler of Circulating Fluidized Bed

Authors: Ravi Inder Singh

Abstract:

Owing to their high heat transfer rates, high carbon utilization efficiency, fuel flexibility and other advantages, numerous circulating fluidized bed boilers have been built in India in the last decade. Many companies, such as BHEL, ISGEC, Thermax, Cethar Limited and Enmas GB Power Systems Projects Limited, are making CFBC boilers and installing units throughout India. Owing to their complexity, many problems exist in CFBC units, and only a few have been reported. Agglomeration, i.e. clinker formation in the riser, loop seal leg and stripper ash coolers, is one problem the industry is facing, and proper documentation is rarely found in the literature. Circulating fluidized bed (CFB) boiler bottom ash contains large amounts of physical heat. When the boiler combusts low-calorie fuel, the ash content is normally more than 40%, and the physical heat loss is approximately 3% if the bottom ash is discharged without cooling. In addition, the red-hot bottom ash is bad for mechanized handling and transportation, as the upper temperature limit of the ash handling machinery is 200 °C. Therefore, a bottom ash cooler (BAC) is often used to treat the high-temperature bottom ash, to reclaim heat, and to have the ash easily handled and transported. As a key auxiliary device of CFB boilers, the BAC has a direct influence on the secure and economic operation of the boiler. Many kinds of BACs have been fitted to large-scale CFB boilers with the continuous development and improvement of the CFB boiler; these include the water-cooled ash cooling screw, the rolling-cylinder ash cooler (RAC) and the fluidized bed ash cooler (FBAC). In this study, a prototype of a novel stripper ash cooler is studied. The circulating fluidized bed ash cooler (CFBAC) combines the major technical features of the spouted bed and the bubbling bed, and can achieve selective discharge of the bottom ash. The novel stripper ash cooler is a bubbling bed, studied here as a visible cold test rig.
Cold testing was chosen because high temperatures are difficult to create and maintain at laboratory scale. The aim of the study is to determine the flow pattern inside the stripper ash cooler. The cold rig prototype is similar to the stripper ash cooler used in industry and was made by scaling down some parameters. The performance of the fluidized bed ash cooler is studied using this cold experimental bench. The air flow rate, the particle size of the solids and the air distributor type are considered the key operating parameters of a fluidized bed ash cooler (FBAC) and are studied in this work.
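Since air flow rate and particle size are named as the key operating parameters, a useful reference point for the cold rig is the minimum fluidization velocity. The sketch below estimates it with the standard Wen & Yu (1966) textbook correlation; this is a generic correlation, not the author's method, and the air and particle properties are assumed ambient values, not data from the study:

```python
import math

def u_mf_wen_yu(d_p: float, rho_p: float, rho_g: float = 1.2,
                mu: float = 1.8e-5, g: float = 9.81) -> float:
    """Minimum fluidization velocity (m/s) via the Wen & Yu correlation.

    d_p: particle diameter (m), rho_p: particle density (kg/m^3),
    rho_g: gas density (kg/m^3), mu: gas viscosity (Pa·s).
    """
    ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu**2   # Archimedes number
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7     # Wen & Yu (1966)
    return re_mf * mu / (rho_g * d_p)

# Assumed 500 µm sand-like ash particles fluidized by ambient air
print(round(u_mf_wen_yu(500e-6, 2600.0), 3))
```

Superficial air velocities in the rig would then be set as multiples of this value to map the bubbling regime.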

Keywords: CFD, Eulerian-Eulerian, Eulerian-Lagrangian model, parallel simulations

Procedia PDF Downloads 512
582 Use of a Novel Intermittent Compression Shoe in Reducing Lower Limb Venous Stasis

Authors: Hansraj Riteesh Bookun, Cassandra Monique Hidajat

Abstract:

This pilot study investigated the efficacy of a newly designed shoe which acts as an intermittent pneumatic compression device to augment venous flow in the lower limb. The aim was to assess the degree to which a wearable intermittent compression device can increase venous flow in the popliteal vein. Background: Deep venous thrombosis and chronic venous insufficiency are relatively common problems with significant morbidity and mortality. While mechanical and chemical thromboprophylaxis measures are in place in hospital environments (in the form of TED stockings, intermittent pneumatic compression devices, analgesia, antiplatelet and anticoagulant agents), there are limited options in a community setting. Additionally, many individuals tolerate graduated compression stockings poorly due to the difficulty of putting them on, their constant tightness, and the increased associated discomfort in warm weather. These factors may hinder the management of chronic venous insufficiency. Method: The device is lightweight, easy to wear and comfortable, with a self-contained power source. It features a Bluetooth transmitter and can be controlled with a smartphone. It is externally almost indistinguishable from a normal shoe. During activation, two bladders are inflated: one overlying the metatarsal heads and the second at the pedal arch. The resulting cyclical increase in pressure squeezes blood into the deep venous system, decreasing periods of stasis and potentially reducing the risk of deep venous thrombosis. The shoe was fitted to 2 healthy participants, and the peak systolic velocity of flow in the popliteal vein was measured before and during intermittent compression phases. Assessments of total flow volume were also performed. All haemodynamic assessments were performed by ultrasound by a licensed sonographer. Results: A mean peak systolic velocity of 3.5 cm/s, with a standard deviation of 1.3 cm/s, was obtained.
There was a threefold increase in mean peak systolic velocity and a fivefold increase in total flow volume. Conclusion: The device significantly augments venous flow in the leg. This may contribute to a lowered thromboembolic risk during periods of prolonged travel or immobility. The device may also serve as an adjunct in the treatment of chronic venous insufficiency. The study will be replicated on a larger scale in a multi-centre trial.
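The reported threefold and fivefold increases are simple ratios of compression-phase to resting values. The sketch below illustrates that arithmetic; the paired numbers are made up only to reproduce the reported pattern and are not the study's measurements:

```python
def fold_increase(baseline: float, active: float) -> float:
    """How many times larger the active-phase value is than baseline."""
    return active / baseline

# Illustrative values only: peak systolic velocity (cm/s) and flow volume
baseline_psv, active_psv = 3.5, 10.5
baseline_vol, active_vol = 40.0, 200.0
print(fold_increase(baseline_psv, active_psv))  # 3.0
print(fold_increase(baseline_vol, active_vol))  # 5.0
```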

Keywords: venous, intermittent compression, shoe, wearable device

Procedia PDF Downloads 195