Search results for: target firm
2740 Quantifying the Protein-Protein Interaction between the Ion-Channel-Forming Colicin A and the Tol Proteins by Potassium Efflux in E. coli Cells
Authors: Fadilah Aleanizy
Abstract:
Colicins are a family of bacterial toxins that kill Escherichia coli and other closely related species. The mode of action of colicins involves binding to an outer membrane receptor and translocation across the cell envelope, leading to cytotoxicity through specific targets. The mechanism of colicin cytotoxicity includes either a non-specific endonuclease activity or depolarization of the cytoplasmic membrane by pore-forming activity. For Group A colicins, translocation requires an interaction between the N-terminal domain of the colicin and a series of membrane-bound and periplasmic proteins known as the Tol system (TolB, TolR, TolA, TolQ, and Pal), and the active domain must be translocated through the outer membrane. Protein-protein interactions are intrinsic to virtually every cellular process. The transient protein-protein interactions of the colicin include interactions with much more complicated assemblies during colicin translocation across the cellular membrane to its target. The potassium release assay detects variation in the K+ content of bacterial cells (K+in). This assay is used to measure the effect of pore-forming colicins such as ColA on an indicator organism by measuring, with a K+-selective electrode, the changes in the external K+ concentration (K+out) caused by cell killing. One of the goals of this work is to employ a quantifiable in-vivo method to identify which Tol proteins are most implicated in the interaction with colicin A as it is translocated to its target.
Keywords: K+ efflux, Colicin A, Tol-proteins, E. coli
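To make the electrode-based readout concrete, the following minimal Python sketch converts a K+-selective electrode potential into an external K+ concentration via the Nernst relation. The calibration intercept, slope, and the example potential trace are illustrative assumptions, not values from the study.

```python
import numpy as np

def potassium_from_potential(E_mV, E0_mV, slope_mV=59.2):
    """External K+ concentration (mM) from a K+-selective electrode reading,
    using the Nernst relation E = E0 + slope * log10([K+]).
    E0 and the ~59 mV/decade slope would come from electrode calibration;
    the values here are illustrative, not from the study."""
    return 10 ** ((E_mV - E0_mV) / slope_mV)

# Illustrative trace: potentials recorded before and after adding colicin A.
readings_mV = np.array([-20.0, -12.0, -5.0, 1.5, 6.0])
k_out_mM = potassium_from_potential(readings_mV, E0_mV=0.0)
efflux = k_out_mM - k_out_mM[0]   # rise in external K+ caused by cell killing
print(np.round(k_out_mM, 3), np.round(efflux, 3))
```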
Procedia PDF Downloads 410
2739 A 'Systematic Literature Review' of Specific Types of Inventory Faced by the Management of Firms
Authors: Rui Brito
Abstract:
This contribution is a literature review of inventory management, a relevant topic for firms due to its intensive use of capital, with implications for a firm’s profitability in an increasingly competitive and globalized world. Firms seek small inventories in order to reduce holding costs, namely opportunity cost, warehousing and handling costs, deterioration, and obsolescence, but larger inventories are required for several reasons, such as customer service, ordering cost, transportation cost, payments to suppliers to reduce unit costs or to take advantage of an expected price increase, and equipment setup cost. Thus, management must address a trade-off between small and large inventories. This literature review concerns three types of inventory (spare parts, safety stock, and vendor-managed) whose management usually falls beyond the scope of logistics. The applied methodology consisted of an online search of databases of scientific documents in English, namely Elsevier, Springer, Emerald, Wiley, and Taylor & Francis, excluding books unless edited, using search engines such as Google Scholar and B-on. The search was based on three keywords/strings (themes), which had to appear in the article title, suggesting the themes were highly relevant to the researchers. The whole search period was 2009 to 2018, with the aim of collecting between twenty and forty studies considered relevant within each of the keywords/strings specified. Documents were sorted by relevance and, to prevent the exclusion of more recent articles, which tend to have fewer citations partly because they have had less time to be cited, the search period was divided into two sub-periods (2009-2015 and 2016-2018). The number of surveyed articles by theme varied from 40 to 200, and the number of citations of those articles showed a wider variation, from 3 to 216. Selected articles from the three themes were analyzed, and the most-cited articles (the first seven of the first sub-period and the first three of the second sub-period) were read in full to produce a synopsis of each. Overall, the findings show that the majority of the articles propose models, namely mathematical ones, although with different sub-types for each theme. Almost all articles suggest further studies, some to be carried out by their own author(s), which widens the diversity of the previous research. Identified research gaps concern the use of surveys to determine which models are most used by firms, the reasons for not using the models with the best performance and accuracy, and the satisfaction levels with the outcomes of inventory management and its effect on the improvement of the firm’s overall performance. The review ends with the limitations and contributions of the study.
Keywords: inventory management, safety stock, spare parts inventory, vendor managed inventory
Procedia PDF Downloads 96
2738 Evaluate the Influence of Culture on the Choice of Capital Structure Management Companies
Authors: Sahar Jami, Iman Valizadeh
Abstract:
Purpose: The aim of this study was to evaluate the influence of culture on the choice of capital structure of companies listed on the Tehran Stock Exchange. Methods: This was a cross-sectional documentary study using ex post facto (retrospective) data and was performed in 1394 (Iranian calendar). Elimination (screening) sampling was used to select the sample, which comprised 123 companies. Results: The results showed that the culture and return-on-equity variables had a significant positive impact on capital structure (ROA, Tobin’s Q), while the financial leverage and firm size variables had a significant negative impact on capital structure (ROA, Tobin’s Q).
Keywords: culture management, capital structure, ROA, QTobins, variables of culture
Procedia PDF Downloads 467
2737 Total Productive Maintenance (TPM) as a Strategy for Competitiveness
Authors: Ignatio Madanhire, Charles Mbohwa
Abstract:
This research examines the effect of a human resource strategy and of overall equipment effectiveness, and assesses how the combination of the two can increase a firm’s productivity. The human resource aspect is examined in detail to assess how motivating operators through training can reduce wastage on the manufacturing shop floor. The waste was attributed to operators, maintenance personnel, idle machines, idle manpower, and breakdowns. This work investigates the concept of Total Productive Maintenance (TPM) in addressing these shortcomings in the manufacturing case study. The impact of TPM in increasing production, as well as employee morale and job satisfaction, is assessed. This can serve as resource material for practitioners who seek to improve overall equipment effectiveness (OEE) in order to achieve higher productivity and competitiveness.
Keywords: maintenance, TPM, efficiency, productivity, strategy
Procedia PDF Downloads 420
2736 Dividend Initiations and IPO Long-Run Performance
Authors: Nithi Sermsiriviboon, Somchai Supattarakul
Abstract:
Dividend initiations are an economically significant event with important implications for a firm’s future financial capacity. Given the market’s expectation of a consistent payout, managers of IPO firms must approach the initial dividend decision cautiously. We compare the long-run performance of IPO firms that initiated dividends with that of similarly matched non-payers. We find that firms which initiated dividends performed significantly better up to three years after the initiation date. Moreover, we measure investor reactions using the two-day cumulative abnormal return around the dividend announcement date. We find no statistically significant difference between the cumulative abnormal returns (CAR) of IPO firms and those of non-IPO firms, indicating that investors do not respond to dividend announcements of IPO firms any more than they do to dividend announcements of non-IPO firms.
Keywords: dividend, initial public offerings, long-run performance, finance
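A minimal sketch of how a two-day cumulative abnormal return around an announcement date can be computed follows. It assumes a simple market model with pre-estimated alpha and beta; the return series and parameter values are illustrative, not data from the study.

```python
import numpy as np

def cumulative_abnormal_return(stock_returns, market_returns, alpha, beta,
                               event_index, window=(0, 1)):
    """Two-day CAR around an announcement: sum of (actual - expected) returns.

    Expected returns follow the market model r_expected = alpha + beta * r_market;
    alpha and beta are assumed to be estimated over a clean estimation window."""
    start, end = event_index + window[0], event_index + window[1]
    abnormal = [stock_returns[t] - (alpha + beta * market_returns[t])
                for t in range(start, end + 1)]
    return float(np.sum(abnormal))

# Illustrative numbers only (not data from the study).
stock = np.array([0.001, 0.004, 0.021, 0.012, -0.002])
market = np.array([0.002, 0.003, 0.005, 0.004, 0.001])
car = cumulative_abnormal_return(stock, market, alpha=0.0002, beta=1.1, event_index=2)
print(f"2-day CAR around the announcement: {car:.4f}")
```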
Procedia PDF Downloads 236
2735 The Processing of Context-Dependent and Context-Independent Scalar Implicatures
Authors: Liu Jia’nan
Abstract:
The default accounts hold the view that there exists a kind of scalar implicature which can be processed without context and which enjoys a psychological privilege over other scalar implicatures that depend on context. In contrast, Relevance Theorists regard context as a must, because all scalar implicatures have to meet the need for relevance in discourse. However, in Katsos' experiments, the results showed that although adults quantitatively rejected under-informative utterances with lexical scales (context-independent) and ad hoc scales (context-dependent) at almost the same rate, they still regarded the violation of utterances with lexical scales as much more severe than that with ad hoc scales. Neither the default account nor Relevance Theory can fully explain this result. Thus, two questions arise from this result: (1) Is it possible that the strange discrepancy is due to factors other than the generation of scalar implicature? (2) Are the ad hoc scales truly formed under the possible influence of mental context? Do the participants generate scalar implicatures with ad hoc scales, instead of just comparing semantic differences among target objects in the under-informative utterance? In our Experiment 1, question (1) will be answered by a replication of Katsos' Experiment 1. Test materials will be shown in PowerPoint in the form of pictures, and each procedure will be carried out under the guidance of a tester in a quiet room. Our Experiment 2 is intended to answer question (2). The picture materials will be transformed into written words in DMDX, and the target sentence will be shown word-by-word to participants in the soundproof room in our lab. Reading times of the target parts, i.e., the words containing scalar implicatures, will be recorded. We presume that in the group with lexical scales, a standardized, pragmatically driven mental context will help generate the scalar implicature once the scalar word occurs, which will lead participants to expect the upcoming words to be informative. Thus, if the new input after the scalar word is under-informative, more time will be needed for the extra semantic processing. However, in the group with ad hoc scales, the scalar implicature may hardly be generated without the support of a fixed mental context for the scale. Thus, whether the new input is informative or not will not matter, and the reading times of the target parts will be the same in informative and under-informative utterances. People's minds may be a dynamic system in which many factors co-occur. If Katsos' experimental result is reliable, will it shed light on the interplay of default accounts and context factors in scalar implicature processing? We might be able to assume, based on our experiments, that a single dominant processing paradigm may not be plausible. Furthermore, in the processing of scalar implicature, the semantic interpretation and the pragmatic interpretation may be made in a dynamic interplay in the mind. As for the lexical scale, the pragmatic reading may prevail over the semantic reading because of its greater exposure in daily language use, which may also allow a possible default or standardized paradigm to override the role of context. However, the objects in an ad hoc scale are not usually treated as scale members in mental context, and thus the lexical-semantic association of the objects may prevent their pragmatic reading from generating a scalar implicature. Only when sufficient contextual factors are highlighted can the pragmatic reading gain privilege and generate the scalar implicature.
Keywords: scalar implicature, ad hoc scale, dynamic interplay, default account, Mandarin Chinese processing
Procedia PDF Downloads 323
2734 Application of Corporate Social Responsibility in Small Manufacturing Enterprises
Authors: Winai Rungrittidetch
Abstract:
This paper investigated the operational system, procedures, outcomes, and obstacles during the application of Corporate Social Responsibility (CSR) by small enterprises and other involved groups in the anchor production business of the core firm, Jatura Charoen Chai Company Limited. The paper also aimed to discover ways to improve the outcomes for the stakeholders who participated in the CSR training and advisory programme. The paper used a qualitative methodology comprising documentary review and semi-structured interviews. Interviews were conducted with eight respondents representing different groups of the company’s stakeholders. The findings drew out the lessons learned from the participation of the selected small manufacturing enterprises in the CSR training and advisory programme. Some suggestions were also made, addressing the significance of the Philosophy of Sufficiency Economy.
Keywords: corporate, social, responsibility, enterprises
Procedia PDF Downloads 349
2733 Readability Facing the Irreducible Otherness: Translation as a Third Dimension toward a Multilingual Higher Education
Authors: Noury Bakrim
Abstract:
From the point of view of language morphodynamics, the interpretative Readability of the text-result (the stasis) is not the external hermeneutics of its various potential reading events but the paradigmatic, semantic immanence of its dynamics. In other words, interpretative Readability articulates the potential tension between projection (the intentionality of the discursive event) and the result (Readability within the syntagmatic stasis). We then consider that translation represents much more a metalinguistic conversion of neurocognitive bilingual sub-routines and modular relations than a semantic equivalence. Furthermore, actualizing Readability (the process of rewriting a target text within a target language/genre) builds upon the descriptive level between the generative syntax/semantic form and its paradigmatic potential translatability. Translation corpora reveal evidence of a certain focus on the positivist stasis of the source text at the expense of its interpretative Readability. For instance, Fluchère's brilliant translation of Miller's Tropic of Cancer into French unconsciously realizes an inversion of the hierarchical relations between Life Thought and Fable: from Life Thought (fable) into Fable (Life Thought). We could regard Bernard Kreiss's translation of Canetti's Die englischen Jahre (Les années anglaises) as another inversion, of the historical scale, from individual history into Hegelian history. In order to describe and test both the translation process and its result, we focus on pedagogical practice, which enables various principles grounded in interpretative/actualizing Readability. Henceforth, establishing the analytical uttering dynamics of the source text could be widened by other practices. The reversibility test (target - source text) or the comparison with a second translation in a third language (tertium comparationis A/B and A/C) points to the evidence of an impossible event. Therefore, it does not imply an idealistic/absolute uttering source but the irreducible/non-reproducible intentionality of its production event within the experience of world/discourse. The aim of this paper is to conceptualize translation as the tension between interpretative and actualizing Readability in a new approach grounded in the morphodynamics of language and translatability (mainly into French) within literary and non-literary texts, articulating theoretical and described pedagogical corpora.
Keywords: readability, translation as deverbalization, translation as conversion, Tertium Comparationis, uttering actualization, translation pedagogy
Procedia PDF Downloads 166
2732 Using Authentic and Instructional Materials to Support Intercultural Communicative Competence in ELT
Authors: Jana Beresova
Abstract:
The paper presents a study carried out in 2015-2016 within the national research scheme VEGA 1/0106/15, based on theoretical research and empirical verification of the concept of intercultural communicative competence. It focuses on the current conception of target language teaching compatible with the Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Our research revealed how the concept of intercultural communicative competence had been perceived by secondary-school teachers of English in Slovakia before they were intensively trained. Intensive workshops were based on the use of both authentic and instructional materials with the goal of supporting interculturally oriented language teaching aimed at challenging thinking. The former conception, which supported the development of students' linguistic knowledge and the use of a target language to obtain information about the culture of the country whose language the learners were studying, was expanded by a meaning-making framework which views language as a typical means by which culture is mediated. The goal of the workshops was to help English teachers better understand the concept of intercultural communicative competence, combining theory and practice optimally. The results of the study will be presented and analysed, providing particular recommendations for language teachers and suggesting some changes to the National Educational Programme from which English learners should benefit in their future studies or professional careers.
Keywords: authentic materials, English language teaching, instructional materials, intercultural communicative competence
Procedia PDF Downloads 270
2731 Triple Modulation on Wound Healing in Glaucoma Surgery Using Mitomycin C and Ologen Augmented with Anti-Vascular Endothelial Growth Factor
Authors: Reetika Sharma, Lalit Tejwani, Himanshu Shekhar, Arun Singhvi
Abstract:
Purpose: To describe a novel trabeculectomy technique targeting triple modulation of wound healing to increase the overall success rate. Method: Ten eyes of 10 patients underwent trabeculectomy with subconjunctival mitomycin C (0.4 mg/ml for 4 minutes) application combined with Ologen implantation subconjunctivally and subsclerally. Five of these patients underwent additional phacoemulsification with intraocular lens implantation. The Ologen implant was wetted with 0.1 ml Bevacizumab. Results: All eyes achieved the target intraocular pressure (IOP), which was maintained through one year of follow-up. Two patients needed anterior chamber reformation at day two post surgery. One patient needed cataract surgery four months after surgery and achieved target intraocular pressure on two topical antiglaucoma medicines. Conclusion: Vascular endothelial growth factor (VEGF) concentration has been shown to increase in the aqueous humor after filtration surgery. Ologen implantation aids collagen remodelling, provides an antifibroblastic response, and acts as a spacer. Bevacizumab-augmented Ologen additionally targets the increased VEGF and helps decrease scarring. Anti-VEGF-augmented Ologen in trabeculectomy with mitomycin C (MMC) hence appears to offer encouraging short-term intraocular pressure control.
Keywords: ologen, anti-VEGF, trabeculectomy, scarring
Procedia PDF Downloads 188
2730 A pH-Activatable Nanoparticle Self-Assembly Triggered by 7-Amino Actinomycin D Demonstrating Superior Tumor Fluorescence Imaging and Anticancer Performance
Authors: Han Xiao
Abstract:
The development of nanomedicines has recently achieved several breakthroughs in the field of cancer treatment; however, the biocompatibility and targeted burst release of these medications remain a limitation, which leads to serious side effects and significantly narrows the scope of their applications. The self-assembly of intermediate filament protein (IFP) peptides was triggered by the hydrophobic cationic drug 7-amino actinomycin D (7-AAD) to synthesize pH-activatable nanoparticles (NPs) that could simultaneously locate tumors and produce antitumor effects. The designed IFP peptide included a targeting peptide (arginine–glycine–aspartate), a negatively charged region, and an α-helix sequence. It also possessed the ability to encapsulate 7-AAD molecules through the formation of hydrogen bonds and hydrophobic interactions in a one-step method. 7-AAD molecules, with excellent near-infrared fluorescence properties, could be target-delivered into tumor cells by the NPs and released immediately in the acidic environments of tumors and endosomes/lysosomes, ultimately inducing cytotoxicity by arresting the tumor cell cycle via insertion into DNA. It is noteworthy that tail vein injection of the IFP/7-AAD NPs demonstrated not only high tumor-targeted imaging potential but also strong antitumor therapeutic effects in vivo. The proposed strategy may be used in the delivery of cationic antitumor drugs for precise imaging and cancer therapy.
Keywords: 7-amino actinomycin D, intermediate filament protein, nanoparticle, tumor image
Procedia PDF Downloads 138
2729 Implementing Activity-Based Costing in Architectural Aluminum Projects: Case Study and Lessons Learned
Authors: Amer Momani, Tarek Al-Hawari, Abdallah Alakayleh
Abstract:
This study explains how to construct an actionable activity-based costing and management system to accurately track and account for the total costs of architectural aluminum projects. Two ABC models were proposed to accomplish this purpose. First, a learning and development model was introduced to examine how to apply an ABC model in an architectural aluminum firm for the first time and to become familiar with ABC concepts. Second, an actual ABC model was built on the basis of the results of the previous model to accurately trace the actual costs incurred on each project in a year and to be able to provide quotes with the best trade-off between competitiveness and profitability. The validity of the proposed model was verified at a local architectural aluminum company.
Keywords: activity-based costing, activity-based management, construction, architectural aluminum
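To illustrate the core ABC mechanics behind such a system, the sketch below pools activity costs, derives driver rates, and traces cost to a single project. The activity names, cost pools, and driver volumes are assumptions for illustration, not the firm's actual figures or the authors' model.

```python
# Minimal activity-based costing sketch (illustrative values only).
activity_pools = {           # annual cost of each activity (currency units)
    "cutting":      120_000,
    "assembly":     200_000,
    "installation": 150_000,
}
annual_driver_volume = {     # total driver quantity per activity (e.g., machine/labour hours)
    "cutting":      4_000,
    "assembly":     10_000,
    "installation": 5_000,
}
project_driver_usage = {     # driver quantity consumed by one aluminum facade project
    "cutting":      35,
    "assembly":     120,
    "installation": 60,
}

driver_rates = {a: activity_pools[a] / annual_driver_volume[a] for a in activity_pools}
project_cost = sum(driver_rates[a] * project_driver_usage[a] for a in project_driver_usage)

print(driver_rates)   # cost per driver unit for each activity
print(project_cost)   # activity-based cost traced to the project
```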
Procedia PDF Downloads 102
2728 A Large Language Model-Driven Method for Automated Building Energy Model Generation
Authors: Yake Zhang, Peng Xu
Abstract:
The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating building energy models using a large language model and a BEM library, aimed at improving the efficiency of model generation. The method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is used to retrieve energy models that match the target building’s characteristics, serving as reference information for the large language model to enhance the accuracy and relevance of the generated model and allowing the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models from natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of large language models in the field of building simulation and performance modeling.
Keywords: artificial intelligence, building energy modelling, building simulation, large language model
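The retrieval step described above might look like the hypothetical sketch below: a parsed feature set is matched against a small BEM library by a weighted distance. The `extract_features` function is a placeholder standing in for the LLM parser, and the feature schema (location, window-to-wall ratio, envelope U-value) and library entries are assumptions, not the authors' implementation.

```python
# Hypothetical sketch: the LLM call is a placeholder; the feature schema and
# library contents are assumptions for illustration only.

def extract_features(user_request: str) -> dict:
    """Placeholder for a large-language-model parser that would map free text
    to structured building features. Hard-coded here for illustration."""
    return {"location": "Shanghai", "window_to_wall_ratio": 0.4, "envelope_u_value": 0.6}

BEM_LIBRARY = [  # assumed reference models sharing the same feature schema
    {"id": "office_shanghai_a", "location": "Shanghai", "window_to_wall_ratio": 0.35, "envelope_u_value": 0.55},
    {"id": "office_beijing_b", "location": "Beijing", "window_to_wall_ratio": 0.40, "envelope_u_value": 0.80},
]

def retrieve_reference(target: dict, library: list) -> dict:
    """Pick the library model closest to the target features (simple weighted distance)."""
    def distance(model):
        d = 0.0 if model["location"] == target["location"] else 1.0
        d += abs(model["window_to_wall_ratio"] - target["window_to_wall_ratio"])
        d += abs(model["envelope_u_value"] - target["envelope_u_value"])
        return d
    return min(library, key=distance)

target = extract_features("A Shanghai office tower with 40% glazing and a well-insulated envelope")
print(retrieve_reference(target, BEM_LIBRARY)["id"])  # -> office_shanghai_a
```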
Procedia PDF Downloads 26
2727 Life Stage Customer Segmentation by Fine-Tuning Large Language Models
Authors: Nikita Katyal, Shaurya Uppal
Abstract:
This paper tackles the significant challenge of accurately classifying customers within a retailer’s customer base. Accurate classification is essential for developing targeted marketing strategies that effectively engage each demographic. To address this issue, we propose a method that utilizes Large Language Models (LLMs). By employing LLMs, we analyze the metadata associated with product purchases derived from historical data to identify key product categories that act as distinguishing factors. These categories, such as baby food, eldercare products, or family-sized packages, offer valuable insights into the likely household composition of customers, including families with babies, families with kids/teenagers, families with pets, households caring for elders, or mixed households. We segment high-confidence customers into distinct categories by integrating historical purchase behavior with LLM-powered product classification. This paper argues that life-stage segmentation can significantly enhance e-commerce businesses’ ability to target the appropriate customers with tailored products and campaigns, thereby increasing sales and improving customer retention. Additionally, the paper details the data sources, model architecture, and evaluation metrics employed for the segmentation task.
Keywords: LLMs, segmentation, product tags, fine-tuning, target segments, marketing communication
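One way the "high-confidence" segmentation step could be realized is sketched below: LLM-assigned product tags are mapped to life-stage segments and a customer is labeled only when one segment dominates their purchases. The tag names, segment labels, and confidence threshold are illustrative assumptions; the LLM tagging itself is represented by a fixed lookup rather than the authors' fine-tuned model.

```python
from collections import Counter

# Assumed mapping from LLM-assigned product tags to life-stage segments.
TAG_TO_SEGMENT = {
    "baby_food": "family_with_baby",
    "diapers": "family_with_baby",
    "school_snacks": "family_with_kids",
    "pet_food": "household_with_pets",
    "eldercare": "household_caring_for_elders",
}

def assign_segment(purchased_tags, min_share=0.3):
    """Assign a high-confidence life-stage segment from a customer's purchase tags.

    Returns None when no segment reaches the minimum share of purchases,
    mirroring the idea of keeping only high-confidence customers."""
    votes = Counter(TAG_TO_SEGMENT[t] for t in purchased_tags if t in TAG_TO_SEGMENT)
    if not votes:
        return None
    segment, count = votes.most_common(1)[0]
    return segment if count / len(purchased_tags) >= min_share else None

history = ["baby_food", "diapers", "coffee", "baby_food", "detergent"]
print(assign_segment(history))  # -> family_with_baby (3 of 5 purchases)
```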
Procedia PDF Downloads 23
2726 Improving the Management Systems of the Ownership Risks in Conditions of Transformation of the Russian Economy
Authors: Mikhail V. Khachaturyan
Abstract:
The article analyzes problems of improving the management systems for ownership risks under the conditions of the transformation of the Russian economy. Among the main sources of threats, business owners should highlight the inefficiency of the implementation of business models and of interaction with hired managers. In this context, it is particularly important to analyze the relationship between business models and ownership risks. The analysis of this problem appears relevant for a number of reasons: firstly, the increased risk appetite of the owner directly affects the business model and the composition of his holdings; secondly, owners with significant stakes in the company are factors in the formation of particular types of risks for owners, whose relations have a significant influence on a firm's competitiveness and ultimately determine its survival; and thirdly, an inefficient system of ownership risk management is one of the main causes of mass bankruptcies, which significantly affects the stable operation of the economy as a whole. The separation of the processes of possession, disposal, and use in modern organizations is the cause not only of problems in the interaction between the owner and managers in managing the organization as a whole but also of asymmetric information about the kinds and forms of the main risks. Managers tend to avoid risky projects and inhibit the diversification of the organization's assets, while owners may insist on the development of such projects, with the aim not only of creating new value for themselves and consumers but also of increasing the value of the company as a result of increasing capital. Where ownership and management are separated, evaluating projects by their risk-yield ratio requires preserving the owner's influence on the process of developing and making management decisions. It is obvious that without a clearly structured system of owner participation in managing the risks of the business, further development is hopeless. In modern conditions of forming a risk management system, owners are compelled to compromise between the desire to increase the organization's ability to produce new value, and consequently to increase its worth through the implementation of risky projects, and the need to tolerate the cost of the lost opportunities of risk diversification. Improving the effectiveness of ownership risk management may also contribute to the revitalization of creditors' claims against inefficient owners, which will ultimately contribute to the efficiency of ownership control models in excluding scenarios of insolvency. It is obvious that in modern conditions, the success of a model of ownership risk management and audit is largely determined by the ability and willingness of the owner to find a compromise between the potential opportunities for expanding the firm's ability to create new value through risk and maintaining the current level of new value creation and an acceptable level of risk through the use of diversification models.
Keywords: improving, ownership risks, problem, Russia
Procedia PDF Downloads 350
2725 A Computational Investigation of Potential Drugs for Cholesterol Regulation to Treat Alzheimer’s Disease
Authors: Marina Passero, Tianhua Zhai, Zuyi (Jacky) Huang
Abstract:
Alzheimer’s disease has become a major public health issue, as indicated by the increasing number of Americans living with the disease. After decades of extensive research into Alzheimer’s disease, only seven drugs have been approved by the Food and Drug Administration (FDA) to treat it. Five of these drugs were designed to treat the dementia symptoms, and only two drugs (i.e., Aducanumab and Lecanemab) target the progression of Alzheimer’s disease, especially the accumulation of amyloid-β plaques. However, controversial comments were raised over the accelerated approvals of both Aducanumab and Lecanemab, especially concerning the safety and side effects of these two drugs. There is still an urgent need for further drug discovery targeting the biological processes involved in the progression of Alzheimer’s disease. Excessive cholesterol has been found to accumulate in the brains of those with Alzheimer’s disease. Cholesterol can be synthesized in both the blood and the brain, but the majority of biosynthesis in the adult brain takes place in astrocytes and is then transported to the neurons via ApoE. The blood-brain barrier separates cholesterol metabolism in the brain from that in the rest of the body. Various proteins contribute to the metabolism of cholesterol in the brain, which offers potential targets for Alzheimer’s treatment. In the astrocytes, SREBP cleavage-activating protein (SCAP) binds to Sterol Regulatory Element-Binding Protein 2 (SREBP2) in order to transport the complex from the endoplasmic reticulum to the Golgi apparatus. Cholesterol is secreted out of the astrocytes by the ATP-Binding Cassette A1 (ABCA1) transporter. Lipoprotein receptors such as triggering receptor expressed on myeloid cells 2 (TREM2) internalize cholesterol into the microglia, while lipoprotein receptors such as low-density lipoprotein receptor-related protein 1 (LRP1) internalize cholesterol into the neuron. Cytochrome P450 Family 46 Subfamily A Member 1 (CYP46A1) converts excess cholesterol to 24S-hydroxycholesterol (24S-OHC). Cholesterol has been shown to have a direct effect on the production of amyloid-beta and tau proteins. The addition of cholesterol to the brain promotes the activity of beta-site amyloid precursor protein cleaving enzyme 1 (BACE1), secretase, and amyloid precursor protein (APP), which all aid in amyloid-beta production. The reduction of cholesterol esters in the brain has been found to reduce phosphorylated tau levels in mice. In this work, a computational pipeline was developed to identify the protein targets involved in cholesterol regulation in the brain and, further, to identify chemical compounds as inhibitors of a selected protein target. Since extensive evidence shows a strong correlation between brain cholesterol regulation and Alzheimer’s disease, a detailed literature review of genes and pathways related to brain cholesterol synthesis and regulation was first conducted in this work. An interaction network was then built for those genes so that the top gene targets could be identified. The involvement of these genes in Alzheimer’s disease progression was discussed, followed by an investigation of existing clinical trials for those targets. A ligand-protein docking program was finally developed to screen 1.5 million chemical compounds against the selected protein target. A machine learning program was developed to evaluate and predict the binding interaction between chemical compounds and the protein target. The results from this work pave the way for further drug discovery to regulate brain cholesterol to combat Alzheimer’s disease.
Keywords: Alzheimer’s disease, drug discovery, ligand-protein docking, gene-network analysis, cholesterol regulation
Procedia PDF Downloads 75
2724 The Molecule Preserve Environment: Effects of Inhibitor of the Angiotensin Converting Enzyme on Reproductive Potential and Composition Contents of the Mediterranean Flour Moth, Ephestia kuehniella Zeller
Authors: Yezli-Touiker Samira, Amrani-Kirane Leila, Soltani Mazouni Nadia
Abstract:
Due to the secondary effects of conventional insecticides on the environment, agrochemical research has resulted in the discovery of novel molecules. This research will help in the development of a new group of pesticides that may be cheaper and less hazardous to the environment and to non-target organisms, which is the main desired outcome of the present work. Angiotensin-converting enzyme (ACE) is a target for the development of novel insect growth regulators. Captopril, an inhibitor of ACE, was tested in vivo by topical application on the reproduction of Ephestia kuehniella Zeller (Lepidoptera: Pyralidae). The compound was diluted in acetone and applied topically to newly emerged pupae (10 µg/2 µl). The effects of this molecule were studied on the biochemistry of the ovary (amounts of nucleic acids and proteins, and the qualitative analysis of the ovarian proteins) and on the reproductive potential (duration of pre-oviposition, duration of oviposition, number of eggs laid, and hatching percentage). Captopril significantly reduced the quantity of ovarian proteins and nucleic acids. The electrophoresis profile revealed the absence of three bands in the treated series. The molecule reduced the duration of the oviposition period, the fecundity, and the egg viability.
Keywords: environment, Ephestia kuehniella, captopril, reproduction, agrochemical research
Procedia PDF Downloads 285
2723 Synthesis and Tribological Properties of the Al-Cr-N/MoS₂ Self-Lubricating Coatings by Hybrid Magnetron Sputtering
Authors: Tie-Gang Wang, De-Qiang Meng, Yan-Mei Liu
Abstract:
Ternary AlCrN coatings are widely used to prolong cutting tool life because of their high hardness and excellent abrasion resistance. However, the friction between the workpiece and the cutter surface increases remarkably when machining difficult-to-cut materials (such as superalloys, titanium, etc.). As a result, a great deal of cutting heat is generated and cutting tool life is shortened. In this work, an appropriate amount of the solid lubricant MoS₂ was added to the AlCrN coating to reduce the friction between the tool and the workpiece. A series of Al-Cr-N/MoS₂ self-lubricating coatings with different MoS₂ contents were prepared by a compound system of high power impulse magnetron sputtering (HiPIMS) and pulsed direct current magnetron sputtering (pulsed DC). The MoS₂ content in the coatings was changed by adjusting the sputtering power of the MoS₂ target. The composition, structure, and mechanical properties of the Al-Cr-N/MoS₂ coatings were systematically evaluated by energy dispersive spectrometry, scanning electron microscopy, X-ray photoelectron spectroscopy, X-ray diffractometry, nano-indentation testing, scratch testing, and ball-on-disk tribometry. The results indicated that the lubricant content played an important role in the coating properties. When the sputtering power of the MoS₂ target was 0.1 kW, the coating possessed the highest hardness of 14.1 GPa, the highest critical load of 44.8 N, and the lowest wear rate of 4.4×10⁻³ μm²/N.
Keywords: self-lubricating coating, Al-Cr-N/MoS₂ coating, wear rate, friction coefficient
Procedia PDF Downloads 132
2722 Ideology Shift in Political Translation
Authors: Jingsong Ma
Abstract:
In political translation, ideology plays an important role in conveying implications accurately. Ideological collisions can occur in political translation when there are differences between the political environments embedded in the translingual political texts in the source and target languages. Reaching an accurate translation requires the translator to understand the ideologies implied in (and often transcending) the texts. This paper explores the conditions, procedure, and purpose of processing ideological collision and the resolution of such issues in political translation. These points will be elucidated by case studies of translating English and Chinese political texts. First, there are specific political terminologies in certain political environments. These terminological peculiarities in one language are often determined by ideological elements rather than by syntactic and semantic understanding. The translation of these ideologically loaded terminologies is a process and operation consisting of understanding the ideological context, including cultural, historical, and political situations. This will be explained with characteristic Chinese political terminologies and their renderings in English. Second, when the ideology in the source language fails to match the ideology in the target language, the decisions to highlight or disregard these conflicts are shaped by power relations, political engagement, social context, etc. It is thus necessary to go beyond linguistic analysis of the context by deciphering ideology in political documents to provide a faithful or equivalent rendering of certain messages. Finally, one of the practical issues concerns equivalence in political translation, redefining the notion of faithfulness and the retention of ideological messages of the source language in translations of political texts. To avoid distortion, the translator should be liberated from the grip of the literal meaning and instead dive into the functional meanings of the text.
Keywords: translation, ideology, politics, society
Procedia PDF Downloads 111
2721 Case Analysis of Bamboo Based Social Enterprises in India-Improving Profitability and Sustainability
Authors: Priyal Motwani
Abstract:
The current market for bamboo products in India is worth about Rs. 21,000 crores and is highly unorganised and fragmented. In this study, we closely analysed the structure and functions of a major bamboo-craft-based organisation in Kerala, India, and elaborated on its value chain, product mix, pricing strategy, supply chain, collaborations, and competitive landscape. We identified six major bottlenecks that are prevalent in such organisations in the Indian context, relating to their product mix, asset management, and supply chain, with corresponding waste management and retail network issues. By carrying out secondary and primary research (sample size of 5,000), the study identified the target customers for bamboo-based products and alternative revenue streams (eco-tourism, microenterprises, training) that can boost existing revenue by 150%. We then recommended an optimum product mix, covering premium-, medium-, and low-value processing, for medium-sized bamboo-based organisations, in accordance with their capacity, to maximize their revenue potential; a simple optimisation sketch is given below. After studying such organisations and their counterparts, the study established an optimum retail network, considering B2B and B2C physical and online retail, to maximize sales to the target groups. On the basis of the results obtained from the analysis of present and future trends, our study gives recommendations to improve the revenue potential of bamboo-based organisations in India and to promote sustainability.
Keywords: bamboo, bottlenecks, optimization, product mix, retail network, value chain
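The product-mix recommendation can be framed as a small linear programme, as in the sketch below: choose premium, medium, and low-value output to maximise revenue under a capacity constraint and demand ceilings. All prices, processing hours, and bounds are assumptions for illustration, not figures from the study.

```python
from scipy.optimize import linprog

prices = [900, 400, 150]          # assumed revenue per unit (premium, medium, low)
hours_per_unit = [12, 6, 2]       # assumed processing hours per unit
capacity_hours = 2000             # assumed monthly workshop capacity

# linprog minimises, so negate prices to maximise revenue.
result = linprog(
    c=[-p for p in prices],
    A_ub=[hours_per_unit],
    b_ub=[capacity_hours],
    bounds=[(0, 60), (0, 150), (0, 400)],   # assumed demand ceilings per segment
    method="highs",
)
print(result.x, -result.fun)      # optimal mix and the revenue it yields
```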
Procedia PDF Downloads 217
2720 Path Planning for Unmanned Aerial Vehicles in Constrained Environments for Locust Elimination
Authors: Aadiv Shah, Hari Nair, Vedant Mittal, Alice Cheeran
Abstract:
Present-day agricultural practices such as blanket spraying not only lead to excessive usage of pesticides but also harm the overall crop yield. This paper introduces an algorithm to optimize the traversal of an unmanned aerial vehicle (UAV) in constrained environments. The proposed system focuses on the agricultural application of targeted spraying for locust elimination. Given a satellite image of a farm, target zones that are prone to locust swarm formation are detected through the calculation of the normalized difference vegetation index (NDVI). This is followed by determining the optimal path for traversal of a UAV through these target zones using the proposed algorithm, in order to perform pesticide spraying in the most efficient manner possible. Unlike the classic travelling salesman problem, which involves point-to-point optimization, the proposed algorithm determines an optimal path over multiple regions, independent of their geometry. Finally, the paper explores the idea of implementing reinforcement learning to model complex environmental behaviour and make the path planning mechanism for UAVs agnostic to external environment changes. This system not only presents a solution to the enormous losses incurred due to locust attacks but also an efficient way to automate agricultural practices across the globe in order to improve farmer ergonomics.
Keywords: locust, NDVI, optimization, path planning, reinforcement learning, UAV
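The target-zone detection step rests on the standard NDVI formula, NDVI = (NIR − Red) / (NIR + Red). The sketch below thresholds a small reflectance grid and orders the flagged cells with a naive nearest-neighbour pass; the band values and threshold are illustrative, and the nearest-neighbour ordering is only a stand-in, since the paper's planner optimizes over regions rather than points.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Illustrative 3x3 reflectance patches (not real satellite data).
nir_band = np.array([[0.8, 0.7, 0.2], [0.6, 0.3, 0.2], [0.7, 0.8, 0.1]])
red_band = np.array([[0.1, 0.2, 0.2], [0.2, 0.3, 0.3], [0.1, 0.1, 0.1]])

index = ndvi(nir_band, red_band)
target_zones = index > 0.5          # assumed threshold for vegetation prone to swarms
cells = np.argwhere(target_zones)   # grid cells the UAV should visit

# Naive nearest-neighbour visiting order, as a placeholder for the paper's planner.
order, current = [], np.array([0, 0])
remaining = [tuple(c) for c in cells]
while remaining:
    nxt = min(remaining, key=lambda c: np.linalg.norm(np.array(c) - current))
    order.append(nxt)
    remaining.remove(nxt)
    current = np.array(nxt)
print(order)
```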
Procedia PDF Downloads 251
2719 Immersive and Non-Immersive Virtual Reality Applied to the Cervical Spine Assessment
Authors: Pawel Kiper, Alfonc Baba, Mahmoud Alhelou, Giorgia Pregnolato, Michela Agostini, Andrea Turolla
Abstract:
Impairment of cervical spine mobility is often related to pain triggered by musculoskeletal disorders or direct traumatic injuries of the spine. To date, these disorders are assessed with goniometers and inclinometers, which are the most popular devices used in clinical settings. Nevertheless, these technologies usually allow the measurement of no more than two-dimensional range of motion (ROM) values in static conditions. Conversely, motion tracking systems able to measure 3 to 6 degrees of freedom dynamically, while performing standard ROM assessment, see limited use due to the technical complexity of preparing the setup and their high costs. Thus, motion tracking systems are primarily used in research. These systems are an integral part of virtual reality (VR) technologies, which can be used for measuring spine mobility. To our knowledge, the accuracy of VR measures has not yet been studied within virtual environments. Thus, the aim of this study was to test the reliability of a protocol for the assessment of sensorimotor function of the cervical spine in a population of healthy subjects and to compare whether using immersive or non-immersive VR for visualization affects performance. Both VR assessments consisted of the same five exercises, and a random sequence determined which of the environments (i.e., immersive or non-immersive) was used first. Subjects were asked to perform head rotation (right and left), flexion, extension, and lateral flexion (right and left side bending). Each movement was executed five times. Moreover, the participants were invited to perform head reaching movements, i.e., head movements toward 8 targets placed every 45° along a circular perimeter and visualized one by one in random order. Finally, head repositioning was assessed by head movement toward the same 8 targets as for reaching, followed by repositioning to the start point. Thus, each participant performed 46 tasks during the assessment. The main measures were: ROM of rotation, flexion, extension, and lateral flexion, and the complete kinematics of the cervical spine (i.e., number of completed targets, time of execution (seconds), spatial length (cm), angular distance (°), and jerk). Thirty-five healthy participants (i.e., 14 males and 21 females, mean age 28.4±6.47) were recruited for the cervical spine assessment with immersive and non-immersive VR environments. Comparison analysis demonstrated that head right rotation (p=0.027), extension (p=0.047), flexion (p=0.000), time (p=0.001), spatial length (p=0.004), jerk target (p=0.032), trajectory repositioning (p=0.003), and jerk target repositioning (p=0.007) were significantly better in immersive than in non-immersive VR. A regression model showed that assessment in immersive VR was influenced by height, trajectory repositioning (p<0.05), and handedness (p<0.05), whereas in non-immersive VR performance was influenced by height, jerk target (p=0.002), head extension, jerk target repositioning (p=0.002), and by age, head flexion/extension, trajectory repositioning, and weight (p=0.040). The results of this study showed higher accuracy of cervical spine assessment when executed in immersive VR. The assessment of ROM and kinematics of the cervical spine can be affected by independent and dependent variables in both immersive and non-immersive VR settings.
Keywords: virtual reality, cervical spine, motion analysis, range of motion, measurement validity
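Two of the kinematic outcomes above, spatial length and jerk, can be computed from sampled head positions as in the minimal sketch below. The sampling rate, the synthetic trajectory, and the use of successive finite differences for the third derivative are assumptions for illustration, not the study's processing pipeline.

```python
import numpy as np

def trajectory_metrics(positions, fs=100.0):
    """Spatial length (cm) and mean jerk magnitude from sampled 3D head positions.

    positions: (N, 3) array in cm; fs: sampling rate in Hz (assumed value).
    Jerk is taken as the third time-derivative of position, estimated by
    successive finite differences."""
    positions = np.asarray(positions, dtype=float)
    dt = 1.0 / fs
    length = float(np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1)))
    velocity = np.gradient(positions, dt, axis=0)
    acceleration = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(acceleration, dt, axis=0)
    mean_jerk = float(np.mean(np.linalg.norm(jerk, axis=1)))
    return length, mean_jerk

# Illustrative short reaching movement (straight line with a small wobble).
t = np.linspace(0, 1, 101)
path = np.column_stack([10 * t, 2 * np.sin(2 * np.pi * t), np.zeros_like(t)])
print(trajectory_metrics(path))
```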
Procedia PDF Downloads 166
2718 An Inverse Docking Approach for Identifying New Potential Anticancer Targets
Authors: Soujanya Pasumarthi
Abstract:
Inverse docking is a relatively new technique that has been used to identify potential receptor targets of small molecules. Our docking software package MDock is well suited to such an application, as it is computationally efficient while showing adequate results in binding affinity predictions and enrichment tests. As a validation study, we present the first-stage results of an inverse-docking study which seeks to identify potential direct targets of PRIMA-1. PRIMA-1 is well known for its ability to restore mutant p53's tumor suppressor function, leading to apoptosis in several types of cancer cells. For this reason, we believe that potential direct targets of PRIMA-1 identified in silico should be experimentally screened for their ability to inhibit cancer cell growth. The highest-ranked human protein in our PRIMA-1 docking results is oxidosqualene cyclase (OSC), which is part of the cholesterol synthetic pathway. The results of two follow-up experiments which treat OSC as a possible anti-cancer target are promising. We show that both PRIMA-1 and Ro 48-8071, a known potent OSC inhibitor, significantly reduce the viability of BT-474 breast cancer cells relative to normal mammary cells. In addition, like PRIMA-1, we find that Ro 48-8071 results in increased binding of mutant p53 to DNA in BT-474 cells (which highly express p53). For the first time, Ro 48-8071 is shown to be a potent agent in killing human breast cancer cells. The potential of OSC as a new target for developing anticancer therapies is worth further investigation.
Keywords: inverse docking, in silico screening, protein-ligand interactions, molecular docking
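Conceptually, inverse docking loops one ligand over a panel of receptors and ranks the receptors by predicted affinity, as in the sketch below. The `dock_score` function is a placeholder with toy values, not MDock's actual interface, and the receptor names are illustrative.

```python
# Conceptual inverse-docking sketch; the scoring call is a placeholder,
# not MDock's API, and the values are made up for illustration.

def dock_score(ligand: str, receptor: str) -> float:
    """Placeholder for a docking-engine call returning a predicted binding
    affinity (lower = stronger). Hard-coded toy values for illustration."""
    toy_scores = {"OSC": -9.8, "HSP90": -7.1, "COX2": -6.4}
    return toy_scores.get(receptor, -5.0)

def inverse_dock(ligand: str, receptor_library: list, top_n: int = 3) -> list:
    """Rank candidate receptor targets of a single small molecule."""
    scored = [(receptor, dock_score(ligand, receptor)) for receptor in receptor_library]
    return sorted(scored, key=lambda pair: pair[1])[:top_n]

print(inverse_dock("PRIMA-1", ["OSC", "HSP90", "COX2", "ALBUMIN"]))
```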
Procedia PDF Downloads 446
2717 Unified Coordinate System Approach for Swarm Search Algorithms in Global Information Deficit Environments
Authors: Rohit Dey, Sailendra Karra
Abstract:
This paper aims at solving the problem of multi-target searching in a Global Positioning System (GPS)-denied environment using swarm robots with limited sensing and communication abilities. Typically, existing swarm-based search algorithms rely on the presence of a global coordinate system (vis-à-vis GPS) shared by the entire swarm, which, in turn, limits their application in real-world scenarios. This can be attributed to the fact that robots in a swarm need to share information regarding their location and the signal from targets to decide their future course of action, but this information is only meaningful when they all share the same coordinate frame. The paper addresses this very issue by eliminating any dependency of a search algorithm on a predetermined global coordinate frame, through the unification of the relative coordinates of individual robots when within communication range, thereby making the system more robust in real scenarios. Our algorithm assumes that all the robots in the swarm are equipped with range and bearing sensors and have limited sensing range and communication abilities. Initially, every robot maintains its own relative coordinate frame and follows Levy-walk random exploration until it comes within range of other robots. When two or more robots are within communication range, they share sensor information and their locations with respect to their coordinate frames, on the basis of which we unify their coordinate frames. They can then share information about the areas already explored, about the surroundings, and about target signals at their locations to make decisions about their future movement based on the search algorithm. During the process of exploration, there can be several small groups of robots having their own coordinate systems, but eventually all the robots are expected to come under one global coordinate frame, in which they can communicate information on the exploration area following swarm search techniques. Using the proposed method, swarm-based search algorithms can work in a real-world scenario without GPS and without any initial information about the size and shape of the environment. Initial simulation results show that running our modified Particle Swarm Optimization (PSO) without global information, we can still achieve results comparable to basic PSO working with GPS. In the full paper, we plan to do a comparison study between different strategies to unify the coordinate system and to implement them on other bio-inspired algorithms, to work in a GPS-denied environment.
Keywords: bio-inspired search algorithms, decentralized control, GPS denied environment, swarm robotics, target searching, unifying coordinate systems
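A minimal 2D sketch of the frame-unification step is given below, assuming each robot measures the other's range and bearing in its own frame: the mutual bearings fix the relative rotation, and the range fixes the translation. The geometry and variable names are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def unify_frames(range_ab, bearing_ab, bearing_ba):
    """2D transform mapping robot B's frame into robot A's frame.

    range_ab:   mutual distance measured by A;
    bearing_ab: bearing of B as seen by A (rad, in A's frame);
    bearing_ba: bearing of A as seen by B (rad, in B's frame).
    Returns (R, t) such that p_A = R @ p_B + t."""
    # Position of B expressed in A's frame.
    t = range_ab * np.array([np.cos(bearing_ab), np.sin(bearing_ab)])
    # Relative heading of frame B: the B->A direction must agree in both frames.
    phi = bearing_ab + np.pi - bearing_ba
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    return R, t

# Example: B is 5 m away at 30 deg from A; A appears at -120 deg in B's frame.
R, t = unify_frames(5.0, np.deg2rad(30), np.deg2rad(-120))
target_in_B = np.array([1.0, 0.0])   # a target B detected in its own frame
print(R @ target_in_B + t)           # the same target expressed in A's frame
```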
Procedia PDF Downloads 137
2716 Rapid, Label-Free, Direct Detection and Quantification of Escherichia coli Bacteria Using Nonlinear Acoustic Aptasensor
Authors: Shilpa Khobragade, Carlos Da Silva Granja, Niklas Sandström, Igor Efimov, Victor P. Ostanin, Wouter van der Wijngaart, David Klenerman, Sourav K. Ghosh
Abstract:
Rapid, label-free, and direct detection of pathogenic bacteria is critical for the prevention of disease outbreaks. This paper, for the first time, attempts to probe the nonlinear acoustic response of a quartz crystal resonator (QCR) functionalized with specific DNA aptamers for direct detection and quantification of viable E. coli KCTC 2571 bacteria. DNA aptamers were immobilized, through biotin and streptavidin conjugation, onto the gold surface of the QCR to capture the target bacteria, and detection was accomplished by the shift in amplitude of the peak 3f signal (three times the drive frequency) upon binding, when driven near the fundamental resonance frequency. The developed nonlinear acoustic aptasensor system demonstrated better reliability than conventional resonance frequency shift and energy dissipation monitoring, which were recorded simultaneously. This sensing system could directly detect 10⁵ cells/mL of the target bacteria within 30 min or less and had high specificity towards E. coli KCTC 2571 bacteria compared to the same concentration of S. typhi bacteria. The aptasensor response was observed for bacterial suspensions ranging from 10⁵ to 10⁸ cells/mL. Conclusively, this nonlinear acoustic aptasensor is simple to use, gives real-time output, is cost-effective, and has the potential for rapid, specific, label-free, direct detection of bacteria.
Keywords: acoustic, aptasensor, detection, nonlinear
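The 3f readout amounts to estimating the amplitude of the third harmonic of the drive frequency, which a software lock-in can do by projecting the signal onto sine and cosine references at 3f, as sketched below. The sampling rate, drive frequency, and synthetic signal are illustrative assumptions, not the instrument's actual acquisition chain.

```python
import numpy as np

def harmonic_amplitude(signal, fs, f0, harmonic=3):
    """Amplitude of the n-th harmonic of a drive frequency f0, estimated by
    projecting the signal onto sin/cos references at n*f0 (a software lock-in)."""
    n = len(signal)
    t = np.arange(n) / fs
    ref_c = np.cos(2 * np.pi * harmonic * f0 * t)
    ref_s = np.sin(2 * np.pi * harmonic * f0 * t)
    i_comp = 2.0 / n * np.dot(signal, ref_c)
    q_comp = 2.0 / n * np.dot(signal, ref_s)
    return float(np.hypot(i_comp, q_comp))

# Illustrative synthetic resonator output: fundamental plus a weak 3f component.
fs, f0 = 1.0e6, 1.0e4                      # assumed sampling rate and drive frequency
t = np.arange(0, 0.01, 1 / fs)
sig = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 3 * f0 * t)
print(harmonic_amplitude(sig, fs, f0))     # ~0.02; binding would shift this amplitude
```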
Procedia PDF Downloads 567
2715 Classification of EEG Signals Based on Dynamic Connectivity Analysis
Authors: Zoran Šverko, Saša Vlahinić, Nino Stojković, Ivan Markovinović
Abstract:
In this article, the classification of target letters is performed using data from the EEG P300 Speller paradigm. Neural networks trained with the results of dynamic connectivity analysis between different brain regions are used for classification. The dynamic connectivity analysis is based on an adaptive window size and the imaginary part of the complex Pearson correlation coefficient. Brain dynamics are analysed using the relative intersection of confidence intervals for the imaginary component of the complex Pearson correlation coefficient (RICI-imCPCC) method. The RICI-imCPCC method overcomes the shortcomings of currently used dynamic connectivity analysis methods, such as the low reliability and low temporal precision for short connectivity intervals encountered in constant sliding window analysis with a wide window, and the high susceptibility to noise encountered in constant sliding window analysis with a narrow window. It overcomes these shortcomings by dynamically adjusting the window size using the RICI rule. This method extracts information about brain connections for each time sample. Seventy percent of the extracted brain connectivity information is used for training and thirty percent for validation. Classification of the target word is also performed based on the same analysis method. As far as we know, through this research we have shown for the first time that dynamic connectivity can be used as a parameter for classifying EEG signals.
Keywords: dynamic connectivity analysis, EEG, neural networks, Pearson correlation coefficients
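To make the connectivity measure concrete, the sketch below computes the imaginary part of the complex Pearson correlation coefficient between two signals from their analytic (Hilbert) representations; the adaptive RICI window selection used in the study is not reproduced, and the synthetic signals are illustrative only.

```python
import numpy as np
from scipy.signal import hilbert

def imaginary_cpcc(x, y):
    """Imaginary part of the complex Pearson correlation coefficient between
    two real signals, computed from their analytic (Hilbert) representations.
    The adaptive RICI window selection used in the study is not reproduced here."""
    zx = hilbert(x - np.mean(x))
    zy = hilbert(y - np.mean(y))
    num = np.mean(zx * np.conj(zy))
    den = np.sqrt(np.mean(np.abs(zx) ** 2) * np.mean(np.abs(zy) ** 2))
    return float(np.imag(num / den))

# Two illustrative 10 Hz signals with a quarter-cycle phase lag (maximal imaginary coupling).
fs = 250.0
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t - np.pi / 2)
print(imaginary_cpcc(x, y))   # close to +/-1 for a quarter-cycle lag
```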
Procedia PDF Downloads 214
2714 Bank Concentration and Industry Structure: Evidence from China
Authors: Jingjing Ye, Cijun Fan, Yan Dong
Abstract:
The development of the financial sector plays an important role in shaping industrial structure. However, evidence on the micro-level channels through which this relation manifests remains relatively sparse, particularly for developing countries. In this paper, we compile an industry-by-city dataset based on manufacturing firms and registered banks in 287 Chinese cities from 1998 to 2008. Based on a difference-in-difference approach, we find that a highly concentrated banking sector decreases the competitiveness of firms in each manufacturing industry. There are two main reasons: i) bank accessibility successfully fosters firm expansion within each industry, but only for sufficiently large enterprises; ii) state-owned enterprises are favored by the banking industry in China. The results are robust to alternative measures of concentration and of external finance dependence.
Keywords: bank concentration, China, difference-in-difference, industry structure
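A generic difference-in-difference specification of this kind can be estimated as below, where the interaction term carries the effect of interest. The variable names and the tiny panel are placeholders, not the authors' dataset or exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative panel rows (made-up values, not the study's data).
df = pd.DataFrame({
    "competitiveness": [1.2, 1.1, 0.9, 0.8, 1.3, 1.4, 1.0, 1.1],
    "high_bank_concentration": [1, 1, 1, 1, 0, 0, 0, 0],   # treated cities
    "post": [0, 1, 0, 1, 0, 1, 0, 1],                      # after the period cut-off
    "firm_size": [2.1, 2.2, 1.8, 1.9, 2.0, 2.1, 1.7, 1.8],
})

# Difference-in-difference: the interaction term is the coefficient of interest.
model = smf.ols(
    "competitiveness ~ high_bank_concentration * post + firm_size", data=df
).fit()
print(model.params["high_bank_concentration:post"])
```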
Procedia PDF Downloads 388
2713 Indoor Air Pollution of the Flexographic Printing Environment
Authors: Jelena S. Kiurski, Vesna S. Kecić, Snežana M. Aksentijević
Abstract:
The identification and evaluation of organic and inorganic pollutants were performed in a flexographic facility in Novi Sad, Serbia. Air samples were collected and analyzed in situ during 4-hour working periods at five sampling points, using a mobile gas chromatograph and an ozonometer, during the printing of collagen casing. The experimental results showed that the concentrations of isopropyl alcohol, acetone, total volatile organic compounds, and ozone varied over the sampling times. The highest average concentrations, 94.80 ppm and 102.57 ppm, were reached 200 minutes after the start of production for isopropyl alcohol and total volatile organic compounds, respectively. The mutual dependences between the target pollutants and the microclimate parameters were confirmed using a multiple linear regression model in the software package STATISTICA 10. The multiple coefficients of determination obtained for ozone and acetone (0.507 and 0.589) with the microclimate parameters indicated a moderate correlation between the observed variables. However, a strong positive correlation was obtained for isopropyl alcohol and total volatile organic compounds (0.760 and 0.852) with the microclimate parameters. Values of the F statistic higher than F-critical for all examined dependences indicated a statistically significant relationship between the concentration levels of the target pollutants and the microclimate parameters. Given that the microclimate parameters significantly affect the emission of the investigated gases, the application of eco-friendly materials in the production process is a necessity.
Keywords: flexographic printing, indoor air, multiple regression analysis, pollution emission
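The same kind of multiple linear regression, with its coefficient of determination and F-test, can be reproduced as sketched below; the microclimate and concentration readings are invented for illustration and are not the measured data reported in the paper.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative readings (not the measured data): pollutant vs. microclimate parameters.
temperature = np.array([22.1, 22.8, 23.5, 24.0, 24.6, 25.1, 25.9, 26.4])
humidity = np.array([41.0, 42.5, 44.0, 45.2, 46.8, 48.1, 49.5, 50.7])
isopropyl_alcohol = np.array([35.0, 41.2, 48.9, 55.4, 63.0, 70.8, 80.1, 88.9])

X = sm.add_constant(np.column_stack([temperature, humidity]))
model = sm.OLS(isopropyl_alcohol, X).fit()

print(f"R^2 = {model.rsquared:.3f}")                        # multiple coefficient of determination
print(f"F = {model.fvalue:.1f}, p = {model.f_pvalue:.4f}")  # compare F against F-critical
```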
Procedia PDF Downloads 197
2712 Working Capital Management and Profitability of Uk Firms: A Contingency Theory Approach
Authors: Ishmael Tingbani
Abstract:
This paper adopts a contingency theory approach to investigate the relationship between working capital management and profitability, using data from 225 British firms listed on the London Stock Exchange for the period 2001-2011. The paper employs panel data analysis on a series of interactive models to estimate this relationship. The findings of the study confirm the relevance of contingency theory. Evidence from the study suggests that the impact of working capital management on profitability varies and is constrained by the organizational contingencies (environment, resources, and management factors) of the firm. These findings have implications for a more balanced and nuanced view of working capital management policy for policy-makers.
Keywords: working capital management, profitability, contingency theory approach, interactive models
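An interactive (contingency) model of this form can be written as a regression in which the working-capital effect is moderated by a contingency variable, as in the sketch below. The variable names (cash conversion cycle, environmental dynamism) and the tiny firm-year panel are assumptions for illustration, not the 225-firm dataset or the authors' exact models.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative firm-year rows (made up, not the panel used in the paper).
df = pd.DataFrame({
    "roa":          [0.08, 0.05, 0.11, 0.04, 0.09, 0.06, 0.10, 0.03],
    "ccc":          [45, 80, 30, 95, 50, 75, 35, 100],    # cash conversion cycle (days)
    "env_dynamism": [0.2, 0.2, 0.8, 0.8, 0.3, 0.3, 0.7, 0.7],
    "firm_size":    [5.1, 4.8, 6.0, 4.5, 5.4, 4.9, 5.8, 4.4],
})

# Interactive (contingency) model: the ccc:env_dynamism term tests whether the
# working-capital/profitability link varies with the firm's environment.
model = smf.ols("roa ~ ccc * env_dynamism + firm_size", data=df).fit()
print(model.params[["ccc", "ccc:env_dynamism"]])
```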
Procedia PDF Downloads 347
2711 The Views of German Preparatory Language Programme Students about German Speaking Activity
Authors: Eda Üstünel, Seval Karacabey
Abstract:
The students enrolled in the German Preparatory Language Programme at the School of Foreign Languages, Muğla Sıtkı Koçman University, Turkey, learn German as a foreign language for two semesters in an academic year. Although the language programme is a skills-based one, the students lack German speaking skills due to their fear of making language mistakes while speaking German. This incompetency in German speaking skills persists in their four-year departmental study at the Faculty of Education. In order to address this problem, we design German speaking activities, which are extra-curricular activities. With the help of these activities, we aim to lead Turkish students of German to speak in the target language, to improve their speaking skills in the target language, and to create a stress-free atmosphere and a meaningful learning environment in which to communicate in the target language. To achieve these aims, an ERASMUS+ exchange staff member (a German trainee teacher of German as a foreign language) from Schwäbisch Gmünd University, Germany, conducted out-of-class German speaking activities once a week for three weeks in total. Each speaking activity lasted one and a half hours per week. Seven volunteer students from the German preparatory language programme attended the speaking activity for three weeks. The activity took place at a cafe on the university campus; that is the reason we call it an out-of-class activity. The content of the speaking activity was not related to the topics studied in the units of the coursebook; that is the reason we call it an extra-curricular activity. For data collection, three tools were used. A questionnaire, an adapted version of Sabo's questionnaire, was administered to the seven volunteers. An interview session was then held with each student on an individual basis. The interview questions were developed so as to ask students to expand on the answers given in the questionnaires. The German trainee teacher wrote field notes, in which she described the activity in light of her thoughts about what went well and which areas needed to be improved. The results of the questionnaires show that six out of seven students note that such an activity must be conducted by a native speaker of German. Four out of seven students emphasize that they like the way the activities are designed in a learner-centred fashion. All of the students point out that they feel motivated to talk to the trainee teacher in German. Six out of seven students note that the opportunity to communicate in German with the teacher and peers enables them to improve their speaking skills, their use of grammatical rules, and their use of vocabulary.
Keywords: Learning a Foreign Language, Speaking Skills, Teaching German as a Foreign Language, Turkish Learners of German Language
Procedia PDF Downloads 321