Search results for: forming tools
3074 Assessment of Platelet and Lymphocyte Interaction in Autoimmune Hyperthyroidism
Authors: Małgorzata Tomczyńska, Joanna Saluk-Bijak
Abstract:
Background: Graves’ disease is a frequent organ-specific autoimmune thyroid disease, characterized by the presence of different kinds of autoantibodies that, in most cases, act as agonists of the thyrotropin receptor, leading to hyperthyroidism. The role of platelets and lymphocytes can be modulated in the pathophysiology of autoimmune thyroid diseases. Interference in the physiology of platelets can lead to enhanced activity of these cells. Activated platelets can bind to circulating lymphocytes and affect lymphocyte adhesion. Platelets and lymphocytes can regulate each other’s functions. Therefore, the activation of T lymphocytes, as well as of blood platelets, is associated with the development of inflammation and oxidative stress within the target tissue. The present study was performed to investigate the platelet-lymphocyte relationship by assessing the degree of their mutual aggregation in the whole blood of patients with Graves’ disease. A further purpose of this study was to examine the impact of platelet interaction on lymphocyte migration capacity. Methods: 30 patients with Graves’ disease were recruited into the study. 30 matched healthy subjects served as the control group. Immunophenotyping of lymphocytes was carried out by flow cytometry. A CytoSelect™ Cell Migration Assay Kit was used to evaluate lymphocyte migration and adhesion to blood platelets. Visual assessment of lymphocyte-platelet aggregate morphology was done using a confocal microscope after magnetic cell isolation (Miltenyi Biotec). Results: The migration and functional responses of lymphocytes to blood platelets were greater in the group of Graves’ disease patients compared with healthy controls. The group of Graves’ disease patients exhibited a reduced T lymphocyte count and a higher B cell count compared with controls. Based on microscopic analysis, more platelet-lymphocyte aggregates were found in patients than in controls. Conclusions: The studies have shown that in Graves' disease, lymphocytes show increased platelet affinity, migrating more strongly toward them and forming mutual cellular conglomerates. This may be due to the increased activation of blood platelets in this disease.
Keywords: blood platelets, cell migration, Graves’ disease, lymphocytes, lymphocyte-platelet aggregates
Procedia PDF Downloads 227
3073 Enzyme Involvement in the Biosynthesis of Selenium Nanoparticles by Geobacillus wiegelii Strain GWE1 Isolated from a Drying Oven
Authors: Daniela N. Correa-Llantén, Sebastián A. Muñoz-Ibacache, Mathilde Maire, Jenny M. Blamey
Abstract:
The biosynthesis of nanoparticles by microorganisms, in contrast to chemical synthesis, is an environmentally friendly process with low energy requirements. In this investigation, we used the microorganism Geobacillus wiegelii, strain GWE1, an aerobic thermophile belonging to the genus Geobacillus, isolated from a drying oven. This microorganism has the ability to reduce selenite, evidenced by the change of color from colorless to red in the culture. Elemental analysis and composition of the particles were verified using transmission electron microscopy and energy-dispersive X-ray analysis. The nanoparticles have a defined spherical shape and a selenium elemental state. Previous experiments showed that the presence of the whole microorganism was not necessary for the reduction of selenite. The results strongly suggested that an intracellular NADPH/NADH-dependent reductase mediates selenium nanoparticle synthesis under aerobic conditions. The enzyme was purified and identified by mass spectrometry (MALDI-TOF/TOF). The enzyme is a 1-pyrroline-5-carboxylate dehydrogenase. Histograms of nanoparticle sizes were obtained. The size distribution ranged from 40-160 nm, with 70% of the nanoparticles less than 100 nm in size. Spectroscopic analysis showed that the nanoparticles are composed of elemental selenium. To analyse the effect of pH on the size and morphology of the nanoparticles, their synthesis was carried out at different pH values (4.0, 5.0, 6.0, 7.0, 8.0). For thermostability studies, samples were incubated at different temperatures (60, 80 and 100 °C) for 1 h and 3 h. All nanoparticles were less than 100 nm in size at pH 4.0; over 50% were less than 100 nm at pH 5.0; and at pH 6.0 and 8.0, over 90% were less than 100 nm in size. At neutral pH (7.0), nanoparticles reached a size of around 120 nm, and only 20% of them were less than 100 nm. When looking at the temperature effect, nanoparticles did not show a significant difference in size when incubated for 0 to 3 h at 60 °C. Meanwhile, at 80 °C the nanoparticle suspension lost its homogeneity: from 0 h of incubation a size range of 40-160 nm was observed, with 20% of the particles over 100 nm, while after 3 h of incubation the size range changed to 60-180 nm, with 50% over 100 nm. At 100 °C the nanoparticles aggregated, forming nanorod structures. In conclusion, these results indicate that it is possible to modulate the size and shape of biologically synthesized nanoparticles by modulating pH and temperature.
Keywords: genus Geobacillus, NADPH/NADH-dependent reductase, selenium nanoparticles, biosynthesis
Procedia PDF Downloads 315
3072 Delineato: Designing Distraction-Free GUIs
Authors: Fernando Miguel Campos, Fernando Jesus Aguiar Campos, Pedro Filipe Campos
Abstract:
A large number of software products offer a wide range of features. This is called featuritis or creeping featurism and tends to grow with each release of the product. Featuritis often adds unnecessary complexity to software, leading to longer learning curves and overall confusing the users and degrading their experience. We take a look at an emerging design approach, the so-called “What You Get Is What You Need” concept, which argues that products should be very focused, simple and have minimalistic interfaces in order to help users conduct their tasks in distraction-free environments. This is not as simple to implement as it might sound, and developers need to cut down features. Our contribution illustrates and evaluates this design method through a novel distraction-free diagramming tool named Delineato Pro for Mac OS X, in which the user is confronted with an empty canvas when launching the software and where tools only show up when really needed.
Keywords: diagramming, HCI, usability, user interface
Procedia PDF Downloads 527
3071 Terrorism: A Threat in Constant Evolution Still Misunderstood
Authors: M. J. Gazapo Lapayese
Abstract:
It is a well-established fact that terrorism is one of the foremost threats to present-day international security. The creation of tools or mechanisms for confronting it in an effective and efficient manner will only be possible by way of an objective assessment of the phenomenon. In order to achieve this, this paper has the following three main objectives: Firstly, setting out to find the reasons that have prevented the establishment of a universally accepted definition of terrorism, and consequently trying to outline the main features defining the face of the terrorist threat in order to discover the fundamental goals of what is now a serious blight on world society. Secondly, trying to explain the differences between a terrorist movement and a terrorist organisation, and the reasons for which a terrorist movement can be led to transform itself into an organisation. After analysing these motivations and the characteristics of a terrorist organisation, an example of the latter will be succinctly analysed to help the reader understand the ideas expressed. Lastly, discovering and exposing the factors that can lead to the appearance of terrorist tendencies, and discussing the most efficient and effective responses that can be given to this global security threat.
Keywords: responses, resilience, security, terrorism
Procedia PDF Downloads 453
3070 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. 
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
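As a concrete illustration of the blended-reading pass described above, the following is a minimal sketch using spaCy as a stand-in for the project's adapted tools; the model name, the sample sentence, and the printed attributes are illustrative assumptions, not D-WISE components, and the small English model must be installed separately.

```python
# Minimal sketch of a blended-reading annotation pass, assuming spaCy's
# small English pipeline as a stand-in for D-WISE's project-specific tools.
import spacy

nlp = spacy.load("en_core_web_sm")  # provides tagging, parsing, NER

text = ("Digitization in the healthcare sector raises data protection "
        "issues, according to the Federal Ministry of Health.")

doc = nlp(text)

# Named entity recognition: candidate actors for the discourse analysis
for ent in doc.ents:
    print(ent.text, ent.label_)

# Dependency parsing: grammatical relations used to propose coding paradigms
for token in doc:
    print(token.text, token.dep_, token.head.text)
```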
Procedia PDF Downloads 226
3069 Economic Policy of Achieving National Competitive Advantage
Authors: Gulnaz Erkomaishvili, Eteri Kharaishvili, Marina Chavleishvili
Abstract:
The paper discusses the economic policy of increasing national competitiveness and the tools and means which help a country to improve its competitiveness. The sectors of the economy in which the country can achieve a competitive advantage are studied. It is noted that the country’s economic policy plays an important role in obtaining and maintaining a competitive advantage: authorities should take measures to ensure a high level of education; scientific and research activities should be funded by the state; foreign direct investments should be attracted mainly in science-intensive industries; and adaptation to the latest scientific achievements of the modern world and a deepening of scientific and technical cooperation should be pursued. A stable business environment and an export-oriented strategy are the basis for the country’s economic growth. The studies have shown that institutional reforms in Georgia are not enough to significantly improve the country's competitiveness.
Keywords: competitiveness, economic policy, competitiveness improvement strategy, competitiveness of Georgia
Procedia PDF Downloads 128
3068 Chronically Ill Patient Satisfaction: An Indicator of Quality of Service Provided at Primary Health Care Settings in Alexandria
Authors: Alyaa Farouk Ibrahim, Gehan ElSayed, Ola Mamdouh, Nazek AbdelGhany
Abstract:
Background: Primary health care (PHC) can be considered the first contact between the patient and the health care system. It includes all the basic health care services to be provided to the community. Patient satisfaction regarding health care has often improved the provision of care and is considered one of the most important measures for evaluating health care. Objective: This study aims to identify patients’ satisfaction with services provided at the primary health care settings in Alexandria. Setting: Seven primary health care settings representing the seven zones of Alexandria governorate were selected randomly and included in the study. Subjects: The study comprised 386 patients who had attended the previously selected settings at least twice before the time of the study. Tools: Two tools were utilized for data collection: a sociodemographic characteristics and health status structured interview schedule, and a patient satisfaction scale. A reliability test for the scale was done using Cronbach's alpha; the results ranged between 0.717 and 0.967. The overall satisfaction was computed and divided into high, medium, and low satisfaction. Results: The age of the studied sample ranged between 19 and 62 years; more than half (54.2%) were aged 40 to less than 60 years. More than half (52.8%) of the patients included in the study were diabetic, 39.1% were hypertensive, 19.2% had cardiovascular diseases, and the rest of the sample had tumors, liver diseases, and orthopedic/neurological disorders (6.5%, 5.2% & 3.2%, respectively). The vast majority of the study group mentioned high satisfaction with overall service cost, environmental conditions, medical staff attitude and health education given at the PHC settings (87.8%, 90.7%, 86.3% & 90.9%, respectively); however, medium satisfaction was mostly reported concerning medical checkup procedures, follow-up data and the referral system (41.2%, 28.5% & 28.9%, respectively). The score level of patient satisfaction with health services provided at the assessed primary health care settings proved to be significantly associated with patients’ social status (P=0.003, X²=14.2), occupation (P=0.011, X²=11.2), and monthly income (P=0.039, X²=6.50). In addition, a significant association was observed between score level of satisfaction and type of illness (P=0.007, X²=9.366), type of medication (P=0.014, X²=9.033), and prior knowledge about the health center (P=0.050, X²=3.346), with a highly significant association with the administrative zone (P=0.001, X²=55.294). Conclusion: The current study revealed that overall service cost, environmental conditions, staff attitude and health education at the assessed primary health care settings gained a high patient satisfaction level, while medical checkup procedures, follow-up, and the referral system caused a medium level of satisfaction among the assessed patients. Nevertheless, social status, occupation, monthly income, type of illness, type of medication and administrative zone are all factors influencing patient satisfaction with services provided at the health facilities.
Keywords: patient satisfaction, chronic illness, quality of health service, quality of service indicators
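For readers who want to reproduce the style of association test reported above, here is a hedged sketch of a chi-square test of independence with SciPy; the contingency counts are invented for illustration and are not the study's data.

```python
# Hedged sketch: a chi-square test of independence of the kind reported in
# the abstract (the counts below are invented, not the study's data).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: satisfaction level (high, medium, low); columns: occupation group
observed = np.array([
    [60, 45, 30],
    [40, 50, 35],
    [20, 25, 81],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"X² = {chi2:.1f}, dof = {dof}, P = {p:.3f}")
```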
Procedia PDF Downloads 352
3067 A Literature Review on Community Awareness, Education in Disaster Risk Reduction and Best Practices
Authors: Alwyn John Lim
Abstract:
The Philippines is one of the areas most vulnerable to natural disasters in the world. Almost every year, different types of natural disasters occur in the Philippines and destroy many lives and resources. Although it is not possible to prevent the occurrence of disasters influenced by natural causes, proper planning and management, such as disaster risk reduction, may minimize the damage caused by natural disasters. Based on a literature review, this paper analyzes the literature on public/community awareness and education in disaster risk reduction that would help promote a country-wide public disaster awareness and education program in the Philippines. This includes best practices and the importance of community disaster awareness and education. The paper also tackles ICT tools that will help boost the process and effectiveness of community/public disaster awareness and education.
Keywords: community awareness, disaster education, disaster risk reduction, Philippines
Procedia PDF Downloads 504
3066 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model
Authors: S. A. Sadegh Zadeh, C. Kambhampati
Abstract:
Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across the wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two kinds of modelling can play a significant role in helping to guide the direction the field takes. This paper combined mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience: it analyses the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing a computational version of the Hodgkin-Huxley model and applying the concept of the all-or-none principle, an investigation of this mathematical model was performed. The results clearly showed that the Hodgkin-Huxley mathematical model does not obey this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential
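The probe the abstract describes can be sketched in a few lines: Euler integration of the Hodgkin-Huxley equations with textbook squid-axon parameters (which may differ from the authors' settings), sweeping the stimulus amplitude and recording the peak membrane voltage. Under a strict all-or-none principle the peak would be essentially binary; the model instead produces graded responses near threshold.

```python
# Minimal Euler-integration sketch of the Hodgkin-Huxley equations (standard
# squid-axon parameters), probing the all-or-none principle by measuring the
# peak response for a range of stimulus currents. Values are textbook ones.
import numpy as np

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3          # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387              # mV

def a_m(V): return 0.1*(V+40)/(1-np.exp(-(V+40)/10))
def b_m(V): return 4.0*np.exp(-(V+65)/18)
def a_h(V): return 0.07*np.exp(-(V+65)/20)
def b_h(V): return 1/(1+np.exp(-(V+35)/10))
def a_n(V): return 0.01*(V+55)/(1-np.exp(-(V+55)/10))
def b_n(V): return 0.125*np.exp(-(V+65)/80)

def peak_response(I_amp, dt=0.01, T=50.0):
    V, m, h, n = -65.0, 0.053, 0.596, 0.317     # resting steady state
    peak = V
    for step in range(int(T/dt)):
        I = I_amp if step*dt < 1.0 else 0.0     # 1-ms current pulse
        INa = gNa*m**3*h*(V-ENa); IK = gK*n**4*(V-EK); IL = gL*(V-EL)
        V += dt*(I - INa - IK - IL)/C
        m += dt*(a_m(V)*(1-m) - b_m(V)*m)
        h += dt*(a_h(V)*(1-h) - b_h(V)*h)
        n += dt*(a_n(V)*(1-n) - b_n(V)*n)
        peak = max(peak, V)
    return peak

# If the all-or-none principle held, the peak would be binary; instead it
# varies continuously with stimulus strength near threshold.
for I in (2.0, 5.0, 6.0, 6.5, 7.0, 10.0):
    print(f"I = {I:5.1f} uA/cm^2 -> peak V = {peak_response(I):7.2f} mV")
```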
Procedia PDF Downloads 617
3065 Study of Land Use Land Cover Change of Bhimbetka with Temporal Satellite Data and Information Systems
Authors: Pranita Shivankar, Devashree Hardas, Prabodhachandra Deshmukh, Arun Suryavanshi
Abstract:
The Bhimbetka Rock Shelters are a UNESCO World Heritage Site located about 45 kilometers south of Bhopal in the state of Madhya Pradesh, India. Rapid changes in land use land cover (LULC) adversely affect the environment. In the recent past, significant changes have been found in the cultural landscape over a period of time. The objective of the paper was to study the changes in land use land cover (LULC) of Bhimbetka and its peripheral region. For this purpose, supervised classification was carried out using satellite images from Landsat and IRS LISS III for the years 2000 and 2013. The use of remote sensing in combination with a geographic information system is one of the most effective information technology tools for generating land use land cover (LULC) change information.
Keywords: IRS LISS III, Landsat, LULC, UNESCO, World Heritage Site
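As a rough sketch of the supervised-classification step, the following uses a random forest on synthetic band values; the abstract does not name the classifier or the training data, so both are assumptions here. In practice the feature matrix would come from stacked Landsat/LISS III bands read with a raster library.

```python
# Hedged sketch of a supervised LULC classification of the kind described,
# using a random forest as a stand-in (the abstract does not name the
# classifier). Band values and training labels here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Training pixels: rows are pixels, columns are spectral bands (e.g. the
# green/red/NIR bands shared by Landsat and IRS LISS III).
X_train = rng.random((300, 3))
y_train = rng.integers(0, 4, 300)   # classes: water, forest, built-up, bare

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify a full scene (flattened to pixels x bands), then compare class
# areas between the 2000 and 2013 scenes to quantify LULC change.
scene_2000 = rng.random((500, 3))
labels_2000 = clf.predict(scene_2000)
print(np.bincount(labels_2000, minlength=4))  # pixel count per class
```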
Procedia PDF Downloads 350
3064 Mechanisms and Process of an Effective Public Policy Formulation in Islamic Economic System
Authors: Md Abu Saieed
Abstract:
Crafting and implementing public policy is one of the indispensable tasks in any form of state and government. But the policy objectives, methods of formulation and tools of implementation might differ based on the ideological nature, historical legacy, structure and capacity of administration and management, and other push and pull factors. Public policy in an Islamic economic system needs to be based on the key guidelines of the divine scriptures along with other sources of sharia’h. As a representative of Allah (SWT), the governor and other apparatus of the state will formulate and implement public policies which will make it possible to establish a true welfare state based on justice, equity and equality. The whole life of Prophet Muhammad (pbuh) and his policy in operating the affairs of the state in Madina are the practical guidelines for policy actors and professionals in the Islamic system of economics. Moreover, policy makers need to be meticulous in formulating Islamic public policy which meets the needs and demands of the contemporary world as well.
Keywords: formulation, Islam, public policy, policy factors, Sharia’h
Procedia PDF Downloads 353
3063 Control of Hybrid System Using Fuzzy Logic
Authors: Faiza Mahi, Fatima Debbat, Mohamed Fayçal Khelfi
Abstract:
This paper proposes a control approach using a fuzzy logic system. More precisely, the study focuses on improving user service in a transportation system, in terms of the analysis and control of passengers’ waiting times at exchange platforms. Many studies have addressed this problem in the literature, and many control tools have been proposed. In this paper, we focus on the use of the fuzzy logic technique to control the system during its evolution, in order to minimize the arrival gap between connected transportation means at the passenger exchange points. An illustrative example is worked out and the obtained results are reported. An important related area of research is the modeling and simulation of such systems: we describe an analysis approach using fuzzy logic, and a hybrid simulator developed in a Matlab toolbox that computes the waiting times per transportation mode.
Keywords: fuzzy logic, hybrid system, waiting time, transportation system, control
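A minimal sketch of the kind of fuzzy decision involved, assuming a Mamdani-style rule base over the arrival gap between two connected modes; the membership functions, rules, and hold times are illustrative assumptions, not the paper's controller.

```python
# Hedged sketch of a Mamdani-style fuzzy rule evaluation for the kind of
# decision described: given the arrival gap between two connected transport
# modes, decide how long to hold the departing vehicle.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def hold_time(gap_min):
    # Fuzzify the arrival gap (minutes) into three linguistic terms.
    small = tri(gap_min, -1, 0, 4)
    medium = tri(gap_min, 2, 6, 10)
    large = tri(gap_min, 8, 15, 22)
    # Rule base: small gap -> no hold, medium gap -> short hold (3 min),
    # large gap -> long hold (8 min). Defuzzify by weighted average.
    num = small * 0.0 + medium * 3.0 + large * 8.0
    den = small + medium + large
    return num / den if den else 0.0

for gap in (1, 5, 9, 14):
    print(f"gap = {gap:2d} min -> hold departing vehicle {hold_time(gap):.1f} min")
```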
Procedia PDF Downloads 555
3062 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing
Authors: Khaled Salah
Abstract:
Model order reduction has been one of the most challenging topics in the past years. In this paper, a hybrid of a genetic algorithm (GA) and a simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) by lower-order TFs. In this approach, the hybrid algorithm is applied to model order reduction while taking into consideration improving accuracy and preserving the properties of the original model, which are two important issues for improving the performance of simulation and computation and maintaining the behavior of the original complex models being reduced. Compared to conventional mathematical methods that have been used to obtain a reduced-order model of a high-order complex model, our proposed method provides better results in terms of run-time. Thus, the proposed technique could be used in electronic design automation (EDA) tools.
Keywords: genetic algorithm, simulated annealing, model reduction, transfer function
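The hybrid idea can be sketched end to end: a simplified evolutionary loop (truncation selection with Gaussian mutation, standing in for a full GA) searches for second-order coefficients that match a fourth-order transfer function's frequency response, and simulated annealing then refines the best candidate. The target TF, population size, and cooling schedule are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of the GA + SA hybrid idea: fit a second-order transfer
# function to a fourth-order one by minimizing frequency-response error.
import numpy as np

rng = np.random.default_rng(1)
w = np.logspace(-2, 2, 200)                       # rad/s grid

num_hi = [1.0, 10.0]                              # high-order model
den_hi = [1.0, 6.0, 11.0, 6.0, 1.0]
H_hi = np.polyval(num_hi, 1j*w) / np.polyval(den_hi, 1j*w)

def error(p):                                     # p = [b0, a1, a0]
    H_lo = np.polyval([p[0]], 1j*w) / np.polyval([1.0, p[1], p[2]], 1j*w)
    return np.mean(np.abs(H_hi - H_lo)**2)

# --- Evolutionary search: global exploration of reduced-model coefficients
pop = rng.uniform(0.0, 12.0, (40, 3))
for _ in range(60):
    fit = np.array([error(p) for p in pop])
    parents = pop[np.argsort(fit)[:20]]           # truncation selection
    children = parents[rng.integers(0, 20, 20)] + rng.normal(0, 0.3, (20, 3))
    pop = np.vstack([parents, children])          # elitism + mutation

best = pop[np.argmin([error(p) for p in pop])]

# --- Simulated annealing: local refinement of the best candidate ---------
T, cur, cur_e = 1.0, best.copy(), error(best)
for _ in range(2000):
    cand = cur + rng.normal(0, 0.05, 3)
    e = error(cand)
    if e < cur_e or rng.random() < np.exp((cur_e - e) / T):
        cur, cur_e = cand, e
    T *= 0.998                                    # geometric cooling

print("reduced model: num =", [cur[0]], "den =", [1.0, cur[1], cur[2]])
print("frequency-response MSE:", cur_e)
```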
Procedia PDF Downloads 143
3061 Defining a Holistic Approach for Model-Based System Engineering: Paradigm and Modeling Requirements
Authors: Hycham Aboutaleb, Bruno Monsuez
Abstract:
The complexity of current systems has reached a degree that requires addressing conception and design issues while taking into account all the necessary aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework and an environment to handle the system model complexity. For that, it is necessary to understand the expectations of the human user of the model and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and defines the refined functional as well as non-functional requirements that modeling tools need to meet to be useful in model-based system engineering.
Keywords: system modeling, modeling language, modeling requirements, framework
Procedia PDF Downloads 532
3060 Polymer-Layered Gold Nanoparticles: Preparation, Properties and Uses of a New Class of Materials
Authors: S. M. Chabane Sari, S. Zargou, A. R. Senoudi, F. Benmouna
Abstract:
Immobilization of nanoparticles (NPs) is the subject of numerous studies pertaining to the design of polymer nanocomposites, supported catalysts, bioactive colloidal crystals, inverse opals for novel optical materials, latex-templated hollow inorganic capsules, immunodiagnostic assays; “Pickering” emulsion polymerization for making latex particles and film-forming composites or Janus particles; chemo- and biosensors, tunable plasmonic nanostructures, hybrid porous monoliths for separation science and technology, biocidal polymer/metal nanoparticle composite coatings, and so on. Particularly in recent years, the literature has witnessed an impressive progress of investigations on polymer coatings, grafts and particles as supports for anchoring nanoparticles. This is actually due to several factors: polymer chains are flexible and may contain a variety of functional groups that are able to efficiently immobilize nanoparticles and their precursors by dispersive or van der Waals, electrostatic, hydrogen or covalent bonds. We review methods to prepare polymer-immobilized nanoparticles through a plethora of strategies in view of developing systems for separation, sensing, extraction and catalysis. The emphasis is on methods to provide (i) polymer brushes and grafts; (ii) monoliths and porous polymer systems; (iii) natural polymers and (iv) conjugated polymers as platforms for anchoring nanoparticles. The latter range from soft biomacromolecular species (proteins, DNA) to metallic, C60, semiconductor and oxide nanoparticles; they can be attached through electrostatic interactions or covalent bonding. It is very clear that the physicochemical properties of polymers (e.g. sensing and separation) are enhanced by anchored nanoparticles, while polymers provide excellent platforms for dispersing nanoparticles for, e.g., high catalytic performance. We thus anticipate that the synergetic role of polymeric supports and anchored particles will increasingly be exploited in view of designing unique hybrid systems with unprecedented properties.
Keywords: gold, layer, polymer, macromolecular
Procedia PDF Downloads 391
3059 Socio-Cultural Adaptation Approach to Enhance Intercultural Collaboration and Learning
Authors: Fadoua Ouamani, Narjès Bellamine Ben Saoud, Henda Hajjami Ben Ghézala
Abstract:
Over the last years and decades, there has been a growing interest in the development of Computer Supported Collaborative Learning (CSCL) environments. However, the existing systems ignore the variety of learners and their socio-cultural differences, especially in the case of distant and networked learning. In fact, within such collaborative learning environments, learners from different socio-cultural backgrounds may interact together. These learners evolve within various cultures and social contexts and acquire different socio-cultural values and behaviors. Thus, they should be assisted while communicating and collaborating, especially in an intercultural group. Besides, the communication and collaboration tools provided to each learner must depend on and be adapted to her/his socio-cultural profile. The main goal of this paper is to present the proposed socio-cultural adaptation approach, based on and guided by ontologies, to adapt CSCL environments to the socio-cultural profiles of their users (learners or others).
Keywords: CSCL, socio-cultural profile, adaptation, ontology
Procedia PDF Downloads 361
3058 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility, and stability. This issue is critical for automotive industry suppliers, who are required to be certified by the IATF 16949:2016 standard (which replaces ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of the mandatory tools. Frequently, the measurement systems in companies are not connected to the equipment and do not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress, whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all the statistical tests proposed in the MSA-4 reference manual from AIAG as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and in production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations from the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented. Firstly, a benchmarking analysis of the current competitors and commercial solutions linked to MSA was performed, concerning the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was done. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to perform the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of the ‘big data’. The main results of this R&D project are: the MSA tool, web and cloud-based; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry has triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical, and food industries are already validating it.
Keywords: automotive industry, Industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
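At the core of such a tool are the MSA variance-component computations. Below is a hedged sketch of an ANOVA-based Gage R&R study on synthetic measurements, using the standard balanced two-way ANOVA formulas; it is a reference sketch, not the tool's actual code.

```python
# Hedged sketch of an ANOVA-based Gage R&R study, the core MSA-4 computation
# such a tool automates. Data here are synthetic; variance-component
# formulas are the standard balanced two-way ANOVA ones.
import numpy as np

rng = np.random.default_rng(7)
p, o, r = 10, 3, 3                       # parts, operators, trials
part_fx = rng.normal(0, 2.0, p)[:, None, None]
oper_fx = rng.normal(0, 0.5, o)[None, :, None]
x = 50 + part_fx + oper_fx + rng.normal(0, 0.3, (p, o, r))  # measurements

grand = x.mean()
mp, mo, mpo = x.mean((1, 2)), x.mean((0, 2)), x.mean(2)

SS_P = o*r*((mp - grand)**2).sum()
SS_O = p*r*((mo - grand)**2).sum()
SS_PO = r*((mpo - mp[:, None] - mo[None, :] + grand)**2).sum()
SS_E = ((x - mpo[:, :, None])**2).sum()

MS_P, MS_O = SS_P/(p-1), SS_O/(o-1)
MS_PO, MS_E = SS_PO/((p-1)*(o-1)), SS_E/(p*o*(r-1))

var_repeat = MS_E                                   # equipment variation
var_po = max((MS_PO - MS_E)/r, 0)
var_oper = max((MS_O - MS_PO)/(p*r), 0)             # reproducibility part
var_part = max((MS_P - MS_PO)/(o*r), 0)

grr = var_repeat + var_oper + var_po
total = grr + var_part
print(f"%GRR (of total variation) = {100*np.sqrt(grr/total):.1f}%")
```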
Procedia PDF Downloads 214
3057 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested powerful tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often enough to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulation of multi-grain thin films. The work has addressed the inhomogeneity in one dimension, but it can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junction between two adjacent grains. The effects of the value of the grain boundary barrier, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
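As a point of reference for space-charge-limited conduction, the trap-free Mott-Gurney law J = (9/8)·ε·μ·V²/L³ gives the ideal homogeneous-film baseline against which trap- and grain-boundary-limited simulations are usually compared; the material values in this sketch are illustrative assumptions, not the paper's device parameters.

```python
# Hedged reference sketch: the trap-free Mott-Gurney law for space-charge
# limited conduction, J = (9/8) * eps * mu * V^2 / L^3.
import numpy as np

EPS0 = 8.854e-12           # F/m
eps_r = 3.0                # relative permittivity (assumed)
mu = 1e-8                  # carrier mobility, m^2/(V*s) (assumed)
L = 200e-9                 # film thickness, m (assumed)

def j_mott_gurney(V):
    """Trap-free SCLC current density in A/m^2."""
    return 9.0/8.0 * eps_r * EPS0 * mu * V**2 / L**3

for V in (0.5, 1.0, 2.0, 5.0):
    print(f"V = {V:4.1f} V -> J = {j_mott_gurney(V):.3e} A/m^2")
```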
Procedia PDF Downloads 214
3056 HD-WSComp: Hypergraph Decomposition for Web Services Composition Based on QoS
Authors: Samah Benmerbi, Kamal Amroun, Abdelkamel Tari
Abstract:
The increasing number of Web service (WS) providers throughout the globe has produced numerous Web services providing the same or similar functionality. Therefore, there is a need for tools that develop the best answer to queries by selecting and composing services with total transparency. This paper reviews various QoS-based Web service selection mechanisms and architectures which facilitate qualitatively optimal selection. In fact, Web service composition is required when a request cannot be fulfilled by a single Web service. In such cases, it is preferable to integrate existing Web services to satisfy the user’s request. We introduce an automatic Web service composition method based on hypergraph decomposition using the hypertree decomposition method. The problem of the selection and composition of Web services is transformed into a resolution in a hypertree by exploring the dependency relations between Web services, to obtain a composite Web service via an execution order of WSs satisfying the global request.
Keywords: web service, web service selection, web service composition, QoS, hypergraph decomposition, BE hypergraph decomposition, hypertree resolution
Procedia PDF Downloads 510
3055 Preparation and in vivo Assessment of Nystatin-Loaded Solid Lipid Nanoparticles for Topical Delivery against Cutaneous Candidiasis
Authors: Rawia M. Khalil, Ahmed A. Abd El Rahman, Mahfouz A. Kassem, Mohamed S. El Ridi, Mona M. Abou Samra, Ghada E. A. Awad, Soheir S. Mansy
Abstract:
Solid lipid nanoparticles (SLNs) have gained great attention for the topical treatment of skin-associated fungal infections, as they facilitate the skin penetration of loaded drugs. Our work deals with the preparation of nystatin-loaded solid lipid nanoparticles (NystSLNs) using the hot homogenization and ultrasonication method. The prepared NystSLNs were characterized in terms of entrapment efficiency, particle size, zeta potential, transmission electron microscopy, differential scanning calorimetry, rheological behavior and in vitro drug release. A stability study for 6 months was performed. A microbiological study was conducted in male rats infected with Candida albicans, by counting the colonies and examining the histopathological changes induced on the skin of the infected rats. The results showed that the SLN dispersions are spherical in shape, with particle sizes ranging from 83.26±11.33 to 955.04±1.09 nm, entrapment efficiencies ranging from 19.73±1.21 to 72.46±0.66%, zeta potentials ranging from -18.9 to -38.8 mV, and shear-thinning rheological behavior. The stability studies done over 6 months showed that nystatin (Nyst) is a good candidate for topical SLN formulations. The lowest number of colony-forming units per ml (cfu/ml) was recorded for the selected NystSLN compared to the drug solution and the commercial Nystatin® cream present on the market. It can be concluded from this work that SLNs provide a good skin-targeting effect and may represent a promising carrier for topical delivery of Nyst, offering sustained release and maintaining a localized effect, resulting in an effective treatment of cutaneous fungal infection.
Keywords: candida infections, hot homogenization, nystatin, solid lipid nanoparticles, stability, topical delivery
Procedia PDF Downloads 393
3054 Risk Management in Islamic Micro Finance Credit System for Poverty Alleviation from Qualitative Perspective
Authors: Liyu Adhi Kasari Sulung
Abstract:
Poverty has been a major problem in Indonesia. Islamic microfinance (IMF) institutions named Baitul Maal Wat Tamwil (BMT) play a prominent role in eradicating it. Indonesia, as the biggest Muslim country, has many successfully applied products, such as the worldwide-adopted group-based lending approach, flexible financing for farmers, and gold pawning. The problems related to these models concern operational risk management and the internal control system (ICS). A proper ICS will help an organization prevent the occurrence of bad financing by detecting errors and irregularities in its operation. This study aims to find a proper risk management scheme for the credit system in BMT, and a ranking of the internal control system for every stage. Risk management variables were obtained in the first In-Depth Interviews (IDI) and Focus Group Discussions (FGD) with Shariah supervisory boards, boards of directors, and operational managers. A survey was conducted covering nationwide data: West Java, South Sulawesi, and West Nusa Tenggara. Moreover, content analysis is employed to build the relationships among these variables. The research findings show that risk management of the credit system in Indonesia involves ex-ante, credit-process, and ex-post strategies to deal with risk. Ex-ante control consists of Shariah compliance, surveys, group leader references, and Islamic forming orientation. The credit process involves saving, collateral, joint liability, loan repayment, and credit installment controlling. Finally, ex-post control includes Shariah evaluation, credit evaluation, grace periods and low installment provisions. In addition, the internal control order sorts the three stages by priority: the credit process as the first rank, ex-post control as the second, and ex-ante control as the last rank.
Keywords: internal control system, islamic micro finance, poverty, risk management
Procedia PDF Downloads 409
3053 The Computational Psycholinguistic Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty
Authors: Ben Khayut, Lina Fabri, Maya Avikhana
Abstract:
The models of modern Artificial Narrow Intelligence (ANI) cannot: a) independently and continuously function without human intelligence, which is used for retraining and reprogramming the ANI models, and b) think, understand, be conscious, cognize, infer, and more in a state of uncertainty and under changes in situations and environmental objects. To eliminate these shortcomings and build a new generation of Artificial Intelligence systems, the paper proposes a conception, model, and method of a Computational Psycholinguistic Cognitive Situational-Fuzzy Self-Controlled Brain and Mind System Under Uncertainty (CPCSFSCBMSUU). The system uses a neural network as its computational memory, operates under uncertainty, and activates its functions through the perception and identification of real objects, fuzzy situational control, and the forming of images of these objects, modeling the psychological, linguistic, cognitive, and neural values of their properties and features, the meanings of which are identified, interpreted, generated, and formed taking into account the identified subject area, using the data, information, knowledge, and images accumulated in the Memory. The functioning of the CPCSFSCBMSUU is carried out by its subsystems for: fuzzy situational control of all processes; computational perception; identification of reactions and actions; psycholinguistic cognitive fuzzy logical inference; decision making; reasoning; systems thinking; planning; awareness; consciousness; cognition; intuition; wisdom; analysis and processing of psycholinguistic, subject, visual, signal, sound and other objects; accumulation and use of data, information and knowledge in the Memory; and communication and interaction with other computing systems, robots and humans in order to solve joint tasks. To investigate the functional processes of the proposed system, the principles of situational control, fuzzy logic, psycholinguistics, informatics, and the modern possibilities of data science were applied. The proposed self-controlled system of brain and mind is oriented toward use as a plug-in in multilingual subject applications.
Keywords: computational brain, mind, psycholinguistic, system, under uncertainty
Procedia PDF Downloads 177
3052 Effect of Cost Control and Cost Reduction Techniques in Organizational Performance
Authors: Babatunde Akeem Lawal
Abstract:
In any organization, the primary aim is to maximize profit, but a major challenge facing organizations is the increasing cost of operations. Because of this, there is an increase in the cost of production that leads to inevitable cost control and cost reduction schemes, which make it difficult for most organizations to operate at the cost-efficient frontier. The study aims to critically examine and evaluate the application of cost control and cost reduction to organizational performance, and also to review the budget as an effective tool of cost control and cost reduction. A descriptive survey research design was adopted. A total of 40 retrieved responses were used for the study. The analysis of the data collected was undertaken by applying appropriate statistical tools. Regression analysis was used to test the hypotheses with the use of SPSS. Based on the findings, it was evident that cost control has a positive impact on organizational performance, and the style of management also has a positive impact on organizational performance.
Keywords: organization, cost reduction, cost control, performance, budget, profit
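Outside SPSS, the same regression test can be reproduced with any OLS routine; here is a hedged sketch with statsmodels on synthetic stand-in data (the study's 40 actual survey responses are not available here).

```python
# Hedged sketch of the kind of regression test described: organizational
# performance regressed on cost-control scores, with synthetic stand-in data
# for the 40 survey responses (the study itself used SPSS).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 40
cost_control = rng.uniform(1, 5, n)             # Likert-scale scores
performance = 1.2 + 0.6*cost_control + rng.normal(0, 0.5, n)

X = sm.add_constant(cost_control)
model = sm.OLS(performance, X).fit()
print(model.summary())                          # coefficients, R², p-values
```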
Procedia PDF Downloads 603
3051 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico
Authors: Ismene Ithai Bras-Ruiz
Abstract:
Learning Analytics (LA) is employed by universities not only as a tool but as a specialized ground to enhance students and professors. However, not all academic programs apply LA with the same goal or use the same tools. In fact, LA is formed by five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can help not just to inform academic authorities about the situation of the program, but also to detect at-risk students, professors with needs, or general problems. The highest level applies Artificial Intelligence techniques to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. It is expected that an academic program decides which field it wants to utilize on the basis of its academic interests, but also its capacities related to professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has been working for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM has one of the largest networked higher education programs: twenty-six academic programs in different faculties. This situation means that every faculty works with heterogeneous populations and academic problems. In this sense, every program has developed its own learning analytics techniques to improve academic issues. In this context, an investigation was carried out to determine the situation of the application of LA in all the academic programs in the different faculties. The premise of the study was that not all faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. In consequence, not all programs know about LA; but this does not mean they do not work with LA in a veiled, or less clear, sense. It is very important to know the degree of knowledge about LA for two reasons: 1) this allows an appreciation of the work of the administration to improve the quality of teaching, and 2) it shows whether it is possible to improve other LA techniques. For this purpose, three instruments were designed to determine the experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretary, systems manager or data analyst, and coordinator of the program). The final report made it possible to understand that almost all the programs work with basic statistical tools and techniques; this helps the administration only to know what is happening inside the academic program, but they are not ready to move up to the next level, which means applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is not related to the knowledge of LA, but to the clarity of the long-term goals.
Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise
Procedia PDF Downloads 128
3050 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (to monitor errors and logs), EFS (file storage), and security groups offers several key benefits for organizations. Creating a performance testing framework using this approach helps optimize resource utilization, enables effective benchmarking, increases reliability, and yields cost savings by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle 5-10 million requests or more within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identifying crashes of applications under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
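A run of this kind is typically launched from the master using JMeter's non-GUI distributed mode (jmeter -n -t plan.jmx -R host1,host2,...). The sketch below wraps that invocation in Python; the slave IPs and file names are placeholders, and each slave must already be running the JMeter server process.

```python
# Hedged sketch of driving a JMeter master-slave run from Python: the master
# invokes JMeter in non-GUI mode and points it at the slave EC2 instances.
import subprocess

SLAVES = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]   # slave EC2 private IPs

cmd = [
    "jmeter",
    "-n",                          # non-GUI mode
    "-t", "load_test_plan.jmx",    # test plan executed by all slaves
    "-R", ",".join(SLAVES),        # remote (slave) hosts to drive the load
    "-l", "aggregate_results.jtl", # consolidated results on the master
]

# Each slave executes the full thread group, so total virtual users =
# threads-in-plan x number of slaves; scale slaves to reach millions of
# requests in minutes.
subprocess.run(cmd, check=True)
```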
Procedia PDF Downloads 27
3049 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data-set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (Catboost, LightGBM, Sklearn, etc) as well as common Neural Network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
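One of the framework's ideas, filtering prediction points that fall outside the training data's parameter space, can be sketched simply; the z-score rule below is an illustrative stand-in, not FreqAI's actual dissimilarity-index implementation.

```python
# Hedged sketch of the outlier-handling idea described: define the parameter
# space from the training window's feature distribution and discard incoming
# prediction points that fall outside it. Threshold and features are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
train_features = rng.normal(0, 1, (5000, 12))      # rolling training window

mean = train_features.mean(axis=0)
std = train_features.std(axis=0)

def in_parameter_space(x, z_max=3.0):
    """Accept a prediction point only if every feature is within z_max
    standard deviations of the training distribution."""
    z = np.abs((x - mean) / std)
    return bool(np.all(z < z_max))

incoming = rng.normal(0, 1.4, (100, 12))           # live candles to predict on
accepted = [x for x in incoming if in_parameter_space(x)]
print(f"accepted {len(accepted)} / {len(incoming)} prediction points")
```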
Procedia PDF Downloads 89
3048 Social Media as an Interactive Learning Tool Applied to Faculty of Tourism and Hotels, Fayoum University
Authors: Islam Elsayed Hussein
Abstract:
The aim of this paper is to discover the impact of students’ attitudes towards social media and the skills required to adopt social media as a university e-learning (2.0) platform. In addition, it measures the effect of social media adoption on interactive learning effectiveness. The population of this study was students at the Faculty of Tourism and Hotels, Fayoum University. A questionnaire was used as a research instrument to collect data from respondents, who had been selected randomly. Data were analyzed using quantitative data analysis methods. Findings showed that the students have a positive attitude towards adopting social networking in the learning process and that they also have good skills for the effective use of social networking tools. In addition, adopting social media effectively affects the interactive learning environment.
Keywords: attitude, skills, e-learning 2.0, interactive learning, Egypt
Procedia PDF Downloads 525
3047 Innovative Business Education Pedagogy: A Case Study of Action Learning at NITIE, Mumbai
Authors: Sudheer Dhume, T. Prasad
Abstract:
There are distinct signs of business education losing its sheen, more so in developing countries. One of the reasons is that the value added at the end of a 2-year MBA program does not match the requirements of present times and the expectations of the students. Against this backdrop, pedagogical innovation has become a prerequisite for making our MBA programs relevant and useful. This paper is a description and analysis of the innovative action learning pedagogical approach adopted by a group of faculty members at NITIE Mumbai. It not only promotes multidisciplinary research but also enhances the integration of functional-area skill sets in the students. The paper discusses the theoretical bases of this pedagogy and evaluates its effectiveness vis-à-vis conventional pedagogical tools. The evaluation research, using the Bloom’s taxonomy framework, showed that this blended method of business education is much superior to conventional pedagogy.
Keywords: action learning, Bloom’s taxonomy, business education, innovation, pedagogy
Procedia PDF Downloads 270
3046 A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem
Authors: Joselito Medina-Marin, Alexandr Karelin, Ana Tarasenko, Juan Carlos Seck-Tuoh-Mora, Norberto Hernandez-Romero, Eva Selene Hernandez-Gress
Abstract:
A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between every machine of the production line in order to maximize the throughput of the whole line, which is known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem that depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained by this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
Keywords: buffer allocation problem, Petri Nets, throughput, production lines
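As a baseline for comparing candidate buffer allocations, the throughput of a short unreliable line can also be estimated by direct simulation. The sketch below is such a brute-force stand-in for the paper's Petri net decomposition method; the failure and repair probabilities are illustrative assumptions.

```python
# Hedged baseline sketch for the BAP: estimate the throughput of a short
# unreliable line by direct discrete-time simulation, so that candidate
# buffer allocations can be compared. Rates are illustrative.
import random

def line_throughput(buffer_size, p_fail=0.05, p_repair=0.3, steps=200_000):
    random.seed(0)
    up = [True, True]           # machine states (M1 feeds buffer, M2 drains)
    buf, finished = 0, 0
    for _ in range(steps):
        # Machines fail and get repaired with geometric up/down times.
        for i in (0, 1):
            if up[i]:
                up[i] = random.random() >= p_fail
            else:
                up[i] = random.random() < p_repair
        if up[0] and buf < buffer_size:   # M1 produces into the buffer
            buf += 1
        if up[1] and buf > 0:             # M2 consumes from the buffer
            buf -= 1
            finished += 1
    return finished / steps               # parts per time step

for b in (1, 2, 5, 10, 20):
    print(f"buffer = {b:2d} -> throughput ≈ {line_throughput(b):.3f}")
```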
Procedia PDF Downloads 308
3045 Policy Recommendations for Reducing CO2 Emissions in Kenya's Electricity Generation, 2015-2030
Authors: Paul Kipchumba
Abstract:
Kenya is an East African country lying on the Equator. It had a population of 46 million in 2015, with an annual growth rate of 2.7%, implying a population of at least 65 million in 2030. Kenya’s GDP in 2015 was about 63 billion USD, with a per capita GDP of about 1400 USD. The rural population is 74%, whereas the urban population is 26%. Kenya grapples not only with access to energy but also with energy security. There is a direct correlation between economic growth, population growth, and energy consumption. Kenya’s electricity generation is at least 74.5% from renewable energy, with hydro power and geothermal forming the bulk of it; its overall energy consumption is 68% from wood fuel, 22% from petroleum, 9% from electricity, and 1% from coal and other sources. Wood fuel is used by the majority of the rural and poor urban population. Electricity is mostly used for lighting. As of March 2015, Kenya had an installed electricity capacity of 2295 MW, making per capita installed capacity 0.0499 kW. The overall retail cost of electricity in 2015 was 0.19915 USD/kWh (KES 19.85/kWh) for installed capacity over 10 MW. The actual demand for electricity in 2015 was 3400 MW, and the projected demand in 2030 is 18000 MW. Kenya is working on Vision 2030, which aims at making it a prosperous middle-income economy and targets 23 GW of generated electricity. However, cost and non-cost factors affect the generation and consumption of electricity in Kenya. Kenya cares more about economic growth than about CO2 emissions. The costs of carbon emissions are most likely to be paid through the future costs of climate change and through penalties imposed on local generating companies for disregard of international law on CO2 emissions and climate change. The study methodology was a simulated application of a carbon tax on all carbon-emitting sources of electricity generation. A tax of only USD 30/tCO2 on all emitting sources of electricity generation would suffice to make solar the only cost-competitive source of electricity generation in Kenya. The country has the best evenly distributed global horizontal irradiation. The solar potential, after accounting for technology efficiencies such as 14-16% for solar PV and 15-22% for solar thermal, is 143.94 GW. Therefore, the paper recommends the adoption of solar power for generating all electricity in Kenya in order to attain zero-carbon electricity generation in the country.
Keywords: CO2 emissions, cost factors, electricity generation, non-cost factors
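The core arithmetic of the simulated carbon tax is straightforward: the tax times each source's emission factor gives the cost uplift per kWh. The sketch below uses typical literature emission factors and base costs as assumptions, not the paper's exact inputs.

```python
# Hedged arithmetic sketch of how a USD 30/tCO2 carbon tax raises the cost
# of each emitting generation source. Emission factors and base costs are
# typical literature values, not the paper's inputs.
TAX = 30.0  # USD per tonne CO2

# (base cost USD/kWh, emission factor tCO2/MWh), illustrative assumptions
sources = {
    "coal":       (0.08, 0.90),
    "diesel/HFO": (0.15, 0.70),
    "gas":        (0.09, 0.45),
    "solar PV":   (0.10, 0.00),
}

for name, (base, ef) in sources.items():
    uplift = TAX * ef / 1000.0          # USD/tonne x tonne/MWh -> USD/kWh
    print(f"{name:11s} base {base:.3f} + tax {uplift:.3f} "
          f"= {base + uplift:.3f} USD/kWh")
```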
Procedia PDF Downloads 365