Search results for: automated assembly
972 Student Attendance System Applying Reed Solomon ECC
Authors: Mohd Noah A. Rahman, Armandurni Abd Rahman, Afzaal H. Seyal, Md Rizal Md Hendry
Abstract:
The article reports an automated student attendance system modeled and developed for use at a vocational school. The project focuses on developing an application based on QR codes with Reed-Solomon error correction, in which a code displayed on a smartphone is scanned through a webcam. The system speeds up the process of taking attendance and saves valuable teaching time. It is also intended to help students avoid the consequences of poor attendance, which can ultimately bar them from sitting their final examinations.
Keywords: QR code, Reed-Solomon, error correction, system design.
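As a concrete illustration of the error-correction idea described above, the sketch below encodes a hypothetical attendance payload with Reed-Solomon parity symbols, corrupts a few bytes to mimic a poor scan, and recovers the original record. It assumes the third-party reedsolo package (version 1.0 or later) and an invented record format; it is not the authors' implementation.

```python
# A minimal sketch (not the authors' system): Reed-Solomon parity lets a partly
# damaged QR payload still decode. Assumes `pip install reedsolo` (>= 1.0).
from reedsolo import RSCodec

rsc = RSCodec(10)  # 10 ECC symbols: corrects up to 5 corrupted bytes

record = b"STU-0042|2023-09-01T08:05"   # hypothetical student-ID/timestamp payload
encoded = rsc.encode(record)

# Simulate scan damage: flip a few bytes of the codeword
damaged = bytearray(encoded)
damaged[3] ^= 0xFF
damaged[15] ^= 0xFF

decoded, _, _ = rsc.decode(bytes(damaged))   # reedsolo >= 1.0 returns a 3-tuple
assert decoded == record
print(decoded.decode())                      # attendance record recovered intact
```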
Procedia PDF Downloads 392
971 Artificial Intelligence Based Method in Identifying Tumour Infiltrating Lymphocytes of Triple Negative Breast Cancer
Authors: Nurkhairul Bariyah Baharun, Afzan Adam, Reena Rahayu Md Zin
Abstract:
The tumor microenvironment (TME) in breast cancer is mainly composed of cancer cells, immune cells, and stromal cells. The interaction between cancer cells and their microenvironment plays an important role in tumor development, progression, and treatment response. The TME in breast cancer includes tumor-infiltrating lymphocytes (TILs) that are implicated in killing tumor cells. TILs can be found in the tumor stroma (sTILs) and within the tumor (iTILs). TILs in triple negative breast cancer (TNBC) have been demonstrated to have prognostic and potentially predictive value. The International Immuno-Oncology Biomarker Working Group (TIL-WG) has developed a guideline focused on the assessment of sTILs using hematoxylin and eosin (H&E)-stained slides. According to the guideline, pathologists use an “eyeballing” method on the H&E-stained slide for sTILs assessment. This method has low precision and poor interobserver reproducibility, is time-consuming for a comprehensive evaluation, and counts only sTILs. The TIL-WG has therefore recommended that any algorithm for computational assessment of TILs follow the guidelines provided, in order to overcome the limitations of manual assessment and provide highly accurate and reliable TILs detection and classification for reproducible and quantitative measurement. This study was carried out to develop a TNBC digital whole slide image (WSI) dataset from H&E-stained slides and IHC (CD4+ and CD8+) stained slides. TNBC cases were retrieved from the database of the Department of Pathology, Hospital Canselor Tuanku Muhriz (HCTM). TNBC cases diagnosed between 2010 and 2021 with no history of other cancers and with available tissue blocks were included in the study (n=58). Tissue blocks were sectioned at approximately 4 µm for H&E and IHC staining. The H&E staining was performed according to a well-established protocol. Indirect IHC staining was also performed on the tissue sections using the protocol from the Diagnostic BioSystems PolyVue™ Plus Kit, USA. The slides were stained with a rabbit monoclonal CD8 antibody (SP16) and a rabbit monoclonal CD4 antibody (EP204). The selected and quality-checked slides were then scanned using a high-resolution whole slide scanner (Pannoramic DESK II DW slide scanner) to digitize the tissue images at 20x magnification. Manual TILs (sTILs and iTILs) assessment was then carried out by two appointed pathologists, who scored TILs from the digital WSIs following the guideline developed by the TIL-WG in 2014; results are expressed as the percentage of sTILs and iTILs per mm² of stromal and tumour area on the tissue. Following this, we aim to develop an automated digital image scoring framework that incorporates key elements of the manual guidelines (including both sTILs and iTILs), using manually annotated data, for robust and objective quantification of TILs in TNBC. From the study, we have developed a digital dataset of TNBC H&E and IHC (CD4+ and CD8+) stained slides. We hope that an automated scoring method can provide quantitative and interpretable TILs scoring, which correlates with the manual pathologist-derived sTILs and iTILs scoring and thus has potential prognostic implications.
Keywords: automated quantification, digital pathology, triple negative breast cancer, tumour infiltrating lymphocytes
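For readers unfamiliar with how an sTILs percentage is computed once segmentation is available, the toy sketch below shows only the final scoring arithmetic (fraction of stromal area occupied by lymphocytes), using synthetic binary masks; it is not the group's pipeline, and the masks, sizes, and thresholds are assumptions.

```python
# Illustrative scoring arithmetic only: given binary masks produced by some
# segmentation step (synthetic here), sTILs = lymphocyte area within stroma,
# expressed as a percentage of stromal area, following the TIL-WG definition.
import numpy as np

rng = np.random.default_rng(0)
stroma_mask = rng.random((512, 512)) > 0.4                       # hypothetical stromal region
lymphocyte_mask = (rng.random((512, 512)) > 0.9) & stroma_mask   # TILs inside stroma

stromal_area = stroma_mask.sum()
til_area = lymphocyte_mask.sum()
stils_percent = 100.0 * til_area / stromal_area if stromal_area else 0.0
print(f"sTILs: {stils_percent:.1f}% of stromal area")
```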
Procedia PDF Downloads 116
970 Neuron-Based Control Mechanisms for a Robotic Arm and Hand
Authors: Nishant Singh, Christian Huyck, Vaibhav Gandhi, Alexander Jones
Abstract:
A robotic arm and hand controlled by simulated neurons is presented. The robot makes use of a biological neuron simulator with a point neural model. The neurons and synapses are organised to create a finite state automaton, including neural inputs from sensors and outputs to effectors. The robot performs a simple pick-and-place task. This work is a proof-of-concept study for a longer-term approach. It is hoped that further work will lead to more effective and flexible robots. As another benefit, it is hoped that further work will also lead to a better understanding of human and other animal neural processing, particularly for physical motion. This is a multidisciplinary approach combining cognitive neuroscience, robotics, and psychology.
Keywords: cell assembly, force sensitive resistor, robot, spiking neuron
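The sketch below shows the kind of simplified point neuron such simulators are built from: a leaky integrate-and-fire unit that converts a step input from a "sensor" into spikes that could drive an effector. The parameter values and input are illustrative assumptions, not taken from the authors' simulator.

```python
# A minimal leaky integrate-and-fire point neuron (illustrative parameters).
import numpy as np

dt, tau, v_rest, v_thresh, v_reset = 1.0, 20.0, 0.0, 1.0, 0.0   # ms and arbitrary units
steps = 200
input_current = np.where(np.arange(steps) > 50, 0.08, 0.0)      # step input from a "sensor"

v = v_rest
spikes = []
for t in range(steps):
    dv = (-(v - v_rest) + input_current[t] * tau) * dt / tau    # leaky integration
    v += dv
    if v >= v_thresh:            # threshold crossing emits a spike toward an effector
        spikes.append(t)
        v = v_reset
print(f"{len(spikes)} spikes, first at t = {spikes[0] if spikes else None} ms")
```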
Procedia PDF Downloads 349
969 Non-Invasive Assessment of Peripheral Arterial Disease: Automated Ankle Brachial Index Measurement and Pulse Volume Analysis Compared to Ultrasound Duplex Scan
Authors: Jane E. A. Lewis, Paul Williams, Jane H. Davies
Abstract:
Introduction: There is, at present, a clear and recognized need to optimize the diagnosis of peripheral arterial disease (PAD), particularly in non-specialist settings such as primary care, and this arises from several key facts. Firstly, PAD is a highly prevalent condition. In 2010, it was estimated that, globally, PAD affected more than 202 million people, and this prevalence is predicted to escalate further. The disease itself, although frequently asymptomatic, can cause considerable patient suffering, with symptoms such as lower limb pain, ulceration, and gangrene, which in worst-case scenarios can necessitate limb amputation. A further and perhaps the most significant consequence of PAD arises from the fact that it is a manifestation of systemic atherosclerosis and is therefore a powerful predictor of coronary heart disease and cerebrovascular disease. Objective: This cross-sectional study aimed to individually and cumulatively compare the sensitivity and specificity of (i) the ankle brachial index (ABI) and (ii) the pulse volume waveform (PVW), recorded by the same automated device, with the presence or absence of peripheral arterial disease (PAD) verified by an ultrasound duplex scan (UDS). Methods: Patients (n = 205) referred for lower limb arterial assessment underwent ABI and PVW measurement using volume plethysmography followed by a UDS. PAD was recorded as present if ABI was < 0.9 (and noted if > 1.30), if the PVW was graded as 2, 3 or 4, or if UDS showed a hemodynamically significant stenosis > 50%. The outcome measure was agreement between the measured ABI and the interpretation of the PVW for PAD diagnosis, using UDS as the reference standard. Results: Sensitivity of ABI was 80%, specificity 91%, and overall accuracy 88%. Cohen’s kappa revealed good agreement between ABI and UDS (k = 0.7, p < .001). PVW sensitivity was 97%, specificity 81%, and overall accuracy 84%, with a good level of agreement between PVW and UDS (k = 0.67, p < .001). The combined sensitivity of ABI and PVW was 100%, specificity 76%, and overall accuracy 85% (k = 0.67, p < .001). Conclusions: Combining these two diagnostic modalities within one device provided a highly accurate method of ruling out PAD. Such a device could be utilized within the primary care environment to reduce the number of unnecessary referrals to secondary care, with concomitant cost savings, reduced patient inconvenience, and prioritization of urgent PAD cases.
Keywords: ankle brachial index, peripheral arterial disease, pulse volume waveform, ultrasound duplex scan
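The agreement statistics quoted in the results can be reproduced from a 2x2 table of device result versus the UDS reference, as in the sketch below. The counts are reconstructed to be consistent with the reported ABI sensitivity, specificity and accuracy; they are not the study data, and scikit-learn is assumed for the kappa calculation.

```python
# Hedged sketch: deriving sensitivity, specificity, accuracy and Cohen's kappa
# from a 2x2 device-vs-UDS table. Counts below are reconstructed examples only.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# 1 = PAD present, 0 = PAD absent
uds    = np.array([1] * 60 + [0] * 145)                        # reference standard
device = np.array([1] * 48 + [0] * 12 + [0] * 132 + [1] * 13)  # e.g. ABI result

tp = np.sum((device == 1) & (uds == 1))
tn = np.sum((device == 0) & (uds == 0))
fp = np.sum((device == 1) & (uds == 0))
fn = np.sum((device == 0) & (uds == 1))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(uds)
kappa = cohen_kappa_score(uds, device)
print(f"Se={sensitivity:.2f}  Sp={specificity:.2f}  Acc={accuracy:.2f}  kappa={kappa:.2f}")
```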
Procedia PDF Downloads 166
968 Pyramid Binary Pattern for Age Invariant Face Verification
Authors: Saroj Bijarnia, Preety Singh
Abstract:
We propose a simple and effective biometric system for face verification across aging, based on a new variant of texture feature, the Pyramid Binary Pattern. This employs the Local Binary Pattern along with its hierarchical information. Dimension reduction of the generated texture feature vector is done using Principal Component Analysis. A Support Vector Machine is used for classification. Our proposed method achieves an accuracy of 92.24% and can be used in an automated age-invariant face verification system.
Keywords: biometrics, age invariant, verification, support vector machine
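A generic version of the described pipeline, multi-scale LBP histograms followed by PCA and an SVM, is sketched below with scikit-image and scikit-learn. The pyramid depth, LBP settings, number of components and the random stand-in data are all assumptions, not the paper's configuration.

```python
# Generic LBP-histogram -> PCA -> SVM pipeline; all parameters are illustrative.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def pyramid_lbp_features(img, levels=3, P=8, R=1):
    """Concatenate LBP histograms computed over a spatial pyramid of the image."""
    feats = []
    for level in range(levels):
        cells = 2 ** level                        # 1x1, 2x2, 4x4 grids
        h, w = img.shape[0] // cells, img.shape[1] // cells
        for i in range(cells):
            for j in range(cells):
                patch = img[i * h:(i + 1) * h, j * w:(j + 1) * w]
                lbp = local_binary_pattern(patch, P, R, method="uniform")
                hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
                feats.append(hist)
    return np.concatenate(feats)

rng = np.random.default_rng(1)
images = (rng.random((40, 64, 64)) * 255).astype(np.uint8)   # stand-in face crops
labels = rng.integers(0, 2, size=40)                         # verification labels
X = np.array([pyramid_lbp_features(im) for im in images])

clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```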
Procedia PDF Downloads 353
967 The Environmental Impact of Wireless Technologies in Nigeria: An Overview of the IoT and 5G Network
Authors: Powei Happiness Kerry
Abstract:
The introduction of wireless technologies in Nigeria has improved the quality of life of Nigerians; however, not everyone sees it in that light. This paper summarizes scholarly views on the impact of wireless technologies on the environment, focusing on 5G and the Internet of Things in Nigeria, while also exploring the Technology Acceptance Model (TAM). The study used a qualitative research method to gather important data from relevant sources and contextually draws inferences from the derived data. The study concludes that the Federal Government of Nigeria, before agreeing to any latest development in the world of wireless technologies, should weigh the implications and deliberate extensively with all stakeholders, taking into consideration the confirmation it will receive from the National Assembly.
Keywords: Internet of Things, radiofrequency, electromagnetic radiation, information and communications technology, ICT, 5G
Procedia PDF Downloads 134
966 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the law. Equally important is privacy, as a fundamental human right (Article 12 of the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to an estimation). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, further leading to more capacity for publishing court decisions much more comprehensively. For the re-identification system, topic modeling with latent Dirichlet allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
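To make the two building blocks of the proposed design more tangible, the toy-scale sketch below clusters a few invented decision texts with latent Dirichlet allocation and then replaces recognized entities with category tags plus identifiers. scikit-learn's LDA and a hard-coded entity dictionary stand in for the production topic model and NER component; nothing here is ESRA code.

```python
# Toy sketch of two steps described above: (1) LDA topic clustering of decisions,
# (2) replacing recognized entities with <CATEGORY_id> tags to preserve context.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

decisions = [
    "The insurance claim of the plaintiff against Acme AG was dismissed.",
    "The court considered the pharmaceutical patent held by Acme AG.",
    "The appeal concerning the rental contract in Zurich was upheld.",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(decisions)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_of = lda.fit_transform(X).argmax(axis=1)      # category per decision
print("topic assignment:", topic_of)

# Anonymization step: swap each recognized entity for its category plus identifier
recognized = {"Acme AG": ("ORG", 1), "Zurich": ("LOC", 1)}   # stand-in NER output
def anonymize(text, entities):
    for surface, (label, idx) in entities.items():
        text = text.replace(surface, f"<{label}_{idx}>")
    return text

print(anonymize(decisions[1], recognized))
```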
Procedia PDF Downloads 148
965 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing
Authors: Arjun Kumar Rath, Titus Dhanasingh
Abstract:
Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components get integrated, the devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks or tools (Fuego, LAVA, AutoTest, KernelCI, etc.) designed for testing embedded system devices, each with several unique good features, but a single tool and framework may not satisfy all of the testing needs for embedded systems; hence the need for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment set-up for testing, scalability, and maintenance. A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during phases of testing and across a family of products. To overcome the stated challenges of the conventional method and offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integrated with Jenkins, which enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
Keywords: board diagnostics software, embedded system, hardware testing, test frameworks
Procedia PDF Downloads 145
964 Laban Movement Analysis Using Kinect
Authors: Bernstein Ran, Shafir Tal, Tsachor Rachelle, Studd Karen, Schuster Assaf
Abstract:
Laban Movement Analysis (LMA), developed in the dance community over the past seventy years, is an effective method for observing, describing, notating, and interpreting human movement to enhance communication and expression in everyday and professional life. Many applications that use motion capture data could be significantly enhanced if the Laban qualities were recognized automatically. This paper presents an automated method for recognizing Laban qualities from motion capture skeletal recordings, demonstrated on the output of Microsoft’s Kinect V2 sensor.
Keywords: Laban movement analysis, multitask learning, Kinect sensor, machine learning
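The sketch below illustrates the kind of kinematic descriptors often extracted from Kinect-style skeletal recordings before classifying Laban qualities: per-joint velocity, acceleration and jerk statistics over a clip. The recording is synthetic and the feature set is an assumption, not the paper's.

```python
# Illustrative feature extraction from a skeletal recording: per-joint velocity,
# acceleration and jerk statistics of the sort typically fed to a classifier.
import numpy as np

rng = np.random.default_rng(2)
recording = np.cumsum(rng.normal(0, 0.01, size=(90, 25, 3)), axis=0)  # (frames, joints, xyz)
fps = 30.0                                                            # Kinect V2 frame rate

def kinematic_features(seq, fps):
    vel = np.diff(seq, axis=0) * fps             # frame-to-frame velocity
    acc = np.diff(vel, axis=0) * fps             # acceleration
    jerk = np.diff(acc, axis=0) * fps            # jerk, related to the Effort/Time quality
    feats = []
    for deriv in (vel, acc, jerk):
        mag = np.linalg.norm(deriv, axis=2)      # magnitude per joint per frame
        feats.extend([mag.mean(axis=0), mag.max(axis=0)])
    return np.concatenate(feats)                 # 6 statistics x 25 joints

x = kinematic_features(recording, fps)
print("feature vector length:", x.shape[0])      # 150 features for one clip
```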
Procedia PDF Downloads 342
963 Behavior of the Foundation of Bridge Reinforced by Rigid and Flexible Inclusions
Authors: T. Karech, A. Noui, T. Bouzid
Abstract:
This article presents a comparative numerical study of the behavior of clayey soils reinforced by flexible columns (stone columns) and rigid columns (piles). The numerical simulation was carried out in 3D for an assembly of a foundation, columns, and a bridge pier. Particular attention has been paid to taking into account the installation of the columns. Indeed, in practice, due to the compaction of the column, the soil around it sustains a lateral expansion and the horizontal stresses are increased. This lateral expansion of the column can be simulated numerically. This work presents a comparative study of the interaction between the soil on one side and the two types of reinforcement on the other, and of their influence on the behavior of the soil and of the bridge pier.
Keywords: piles, stone columns, interaction, foundation, settlement, consolidation
Procedia PDF Downloads 277
962 Solutions for Quality Pre-Control of Crimp Contacts
Authors: C. F. Ocoleanu, G. Cividjian, Gh. Manolea
Abstract:
In this paper, we present two solutions for the quality pre-control of crimp contact connections, to identify improperly executed connections in the first moments, before the final assembly of an electrical machine. The first solution involves the experimental determination of specific losses by calculating the initial rate of temperature rise. This can be done by drawing the tangent at the origin of the heating curve. The method can be used to identify bad connections by passing a current through the winding at ambient temperature and simultaneously recording the connection temperatures in the first few minutes after the current is applied. The second proposed solution is to apply a single-level thermal indicator to each crimped element and to perform a test heating with a heating current corresponding to the critical temperature of the indicator.
Keywords: temperature, crimp contact, thermal indicator, current distribution, specific losses
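The first method lends itself to a short worked example: fit a straight line to the first seconds of the heating curve to obtain the initial rate of temperature rise, which at t = 0 is proportional to the specific losses in the connection. The heating data below are synthetic and the time constants are assumptions.

```python
# Sketch of the first method: the "tangent at the origin" of the heating curve,
# estimated as a straight-line fit over the first moments. Synthetic data only.
import numpy as np

tau, dtheta_max = 300.0, 40.0                        # s, K: synthetic exponential heating
t = np.arange(0.0, 120.0, 1.0)
theta_rise = dtheta_max * (1.0 - np.exp(-t / tau))   # temperature rise above ambient

early = t < 20.0                                     # use only the first moments
slope, _ = np.polyfit(t[early], theta_rise[early], 1)   # K/s, initial rate of rise

# At t = 0 no heat has yet been exchanged, so this slope is proportional to the
# specific losses; a crimped connection with an abnormally high slope relative
# to its neighbours is flagged before final assembly.
print(f"initial rate of temperature rise: {slope:.3f} K/s "
      f"(ideal tangent: {dtheta_max / tau:.3f} K/s)")
```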
Procedia PDF Downloads 422
961 Transforming Breast Density Measurement with Artificial Intelligence: Population-Level Insights from BreastScreen NSW
Authors: Douglas Dunn, Richard Walton, Matthew Warner-Smith, Chirag Mistry, Kan Ren, David Roder
Abstract:
Introduction: Breast density is a risk factor for breast cancer, both because increased fibroglandular tissue can harbor malignancy and because it masks lesions on mammography. Therefore, evaluation of breast density measurement is useful for risk stratification at an individual and population level. This study investigates the performance of Lunit INSIGHT MMG for automated breast density measurement. We analyze the reliability of Lunit compared to breast radiologists, explore density variations across the BreastScreen NSW population, and examine the impact of breast implants on density measurements. Methods: 15,518 mammograms were utilized for a comparative analysis of intra- and inter-reader reliability between Lunit INSIGHT MMG and breast radiologists. Subsequently, Lunit was used to evaluate 624,113 mammograms for the investigation of density variations according to age and birth country, providing insights into diverse population subgroups. Finally, we compared breast density in 4,047 clients with implants to clients without implants, controlling for age and birth country. Results: The inter-reader weighted kappa coefficient between Lunit and breast radiologists was 0.72 (95% CI 0.71-0.73). The highest breast densities were seen in women with a North-East Asian background, whilst those of Aboriginal background had the lowest density. Across all backgrounds, density was demonstrated to reduce with age, though at different rates according to country of birth. Clients with implants had higher density relative to the age-matched no-implant strata. Conclusion: Lunit INSIGHT MMG demonstrates reasonable inter- and intra-observer reliability for automated breast density measurement. The scale of this study is significantly larger than any previous study assessing breast density, owing to the ability to process large volumes of data using AI. As a result, it provides valuable insights into population-level density variations. Our findings highlight the influence of age, birth country, and breast implants on density, emphasizing the need for personalized risk assessment and screening approaches. The large-scale and diverse nature of this study enhances the generalisability of our results, offering valuable information for breast cancer screening programs internationally.
Keywords: breast cancer, screening, breast density, artificial intelligence, mammography
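On the aggregation side, the sketch below shows one plausible way to produce the population-level summaries described (mean density by age band and birth-country group, and an implant versus no-implant comparison within age strata) using pandas on synthetic records; the groupings and scores are invented, not BreastScreen NSW data.

```python
# Synthetic-data sketch of the population-level aggregation described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(40, 75, n),
    "birth_region": rng.choice(["North-East Asia", "Oceania", "Europe"], n),
    "implant": rng.random(n) < 0.05,
    "density": rng.uniform(0, 100, n),       # stand-in for an AI density score
})
df["age_band"] = pd.cut(df["age"], bins=[40, 50, 60, 75], right=False)

by_group = df.groupby(["birth_region", "age_band"], observed=True)["density"].mean()
print(by_group.round(1))

implant_vs_not = (df.groupby(["age_band", "implant"], observed=True)["density"]
                    .mean().unstack())
print(implant_vs_not.round(1))
```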
Procedia PDF Downloads 6
960 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach
Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes
Abstract:
In view of the fact that money laundering has become a prominent risk for banking institutions, it is an obligation for banking institutions to adopt a risk-based approach as an integral component of their accepted anti-money laundering policies. In doing so, those involved with banking operations are the most critical group of personnel, as these are the people who deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from other legitimate customers seeking genuine banking transactions. Banking institution staff are mostly educated and trained with the business objective in mind, to serve the customers, and are not trained to be “detectives with a detective’s power of observation”. Despite increasing awareness as well as training conducted for banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents are randomly assigned, within a controlled setting, to manipulated situations upon which the judgement of the respondents is solicited based on various observations related to the situations. The study suggests that it is imperative that informed judgement be exercised in arriving at the decision to proceed with the banking services required by the customers. Judgement forms the basis of the opinion of banking institution staff in deciding whether a customer poses money laundering risk. Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institution. Although banking institutions have access to a choice of automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in the banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. At the other end of the spectrum, automated solutions are not a substitute for the individual's role in money laundering risk assessment, as human judgement is inimitable.
Keywords: banking institutions, experimental approach, money laundering, risk assessment
Procedia PDF Downloads 267
959 High Frequency Sonochemistry: A New Field of Cavitation‐Free Acoustic Materials Synthesis and Manipulation
Authors: Amgad Rezk, Heba Ahmed, Leslie Yeo
Abstract:
Ultrasound presents a powerful means for material synthesis. In this talk, we showcase a new field demonstrating the possibility of harnessing sound energy sources at considerably higher frequencies (10 MHz to 1 GHz), compared to conventional ultrasound (kHz and up to ~2 MHz), for crystallising and manipulating a variety of nanoscale materials. At these frequencies, cavitation, which underpins most sonochemical processes, is largely absent, suggesting that altogether fundamentally different mechanisms are at play. Examples include the crystallisation of highly oriented structures, quasi-2D metal-organic frameworks and nanocomposites. These fascinating examples reveal how the highly nonlinear electromechanical coupling associated with high-frequency surface vibration gives rise to molecular ordering and assembly on the nano- and microscale.
Keywords: high-frequency acoustics, microfluidics, crystallisation, composite nanomaterials
Procedia PDF Downloads 121
958 Automated Localization of Palpebral Conjunctiva and Hemoglobin Determination Using Smart Phone Camera
Authors: Faraz Tahir, M. Usman Akram, Albab Ahmad Khan, Mujahid Abbass, Ahmad Tariq, Nuzhat Qaiser
Abstract:
The objective of this study was to evaluate the degree of anemia from a picture of the palpebral conjunctiva taken with a smartphone camera. We first localize the region of interest in the image, then extract certain features from that region of interest and train an SVM classifier on those features; as a result, our system classifies images in real time according to their hemoglobin level. The proposed system has given an accuracy of 70%. We have trained our classifier on a locally gathered dataset of 30 patients.
Keywords: anemia, palpebral conjunctiva, SVM, smartphone
Procedia PDF Downloads 506
957 Purification and Characterization of Phycoerythrin from a Mesophilic Cyanobacterium Nostoc piscinale PUPCCC 405.17
Authors: Sandeep Kaur
Abstract:
Phycoerythrin (PE) from the mesophilic filamentous cyanobacterium Nostoc piscinale PUPCCC 405.17, a good producer of phycobiliproteins, has been characterized in terms of its unit assembly and stability. The phycoerythrin was extracted by freeze-thawing the cells in water, concentrated by ammonium sulphate fractionation, and purified by anion exchange chromatography. The purification process resulted in a 2.90-fold increase in phycoerythrin purity, reaching 1.54. Sodium dodecyl sulphate-polyacrylamide gel electrophoresis of purified PE demonstrated three protein bands of 14.3, 27.54 and 39.81 kDa. The native PE also showed one band of 125.87 kDa, assumed to be a dimer (αβ)2γ based on the results of non-denaturing PAGE. Lyophilized PE powder was more stable than phycoerythrin in solution. The half-life of dry PE is 80 days when stored at 4 °C in the dark. The phycoerythrin from this organism has potential applications in food as a natural colour and as a fluorescent marker.
Keywords: characterization, Nostoc piscinale, phycoerythrin, purification
Procedia PDF Downloads 140
956 Polymer Nanocarrier for Rheumatoid Arthritis Therapy
Authors: Vijayakameswara Rao Neralla, Jueun Jeon, Jae Hyung Park
Abstract:
To develop a potential nanocarrier for the diagnosis and treatment of rheumatoid arthritis (RA), we prepared a hyaluronic acid (HA)-5β-cholanic acid (CA) conjugate with an acid-labile ketal linker. This conjugate could self-assemble under aqueous conditions to produce pH-responsive HA-CA nanoparticles as potential carriers of the anti-inflammatory drug methotrexate (MTX). MTX was rapidly released from the nanoparticles under the conditions of inflamed synovial tissue in RA. In vitro cytotoxicity data showed that the pH-responsive HA-CA nanoparticles were non-toxic to RAW 264.7 cells. In vivo biodistribution results confirmed that, after their systemic administration, the pH-responsive HA-CA nanoparticles selectively accumulated in the inflamed joints of collagen-induced arthritis mice. These results indicate that pH-responsive HA-CA nanoparticles represent a promising candidate drug carrier for RA therapy.
Keywords: rheumatoid arthritis, hyaluronic acid, nanocarrier, self-assembly, MTX
Procedia PDF Downloads 289
955 Nascent Federalism in Nepal: An Observational Review in its Evolution
Authors: C. Shekhar Parajulee
Abstract:
Nepal practiced a centralized unitary governing system for a long time and moved to a federal system after the promulgation of the new constitution on 20 September 2015. This represents a big paradigm shift in terms of governance. Now, there are three levels of government: one federal government in the center, seven provincial governments, and 753 local governments. Federalism refers to a political governing system with multiple tiers of government working together with coordination. It is preferred for self-rule and shared rule. Though it has opened the door for the rights of the people, political stability, state restructuring, and sustainable peace and development, there are many prospects and challenges for its proper implementation. This research analyzes the discourses of federalism implementation in Nepal with special reference to one of the seven provinces, Gandaki. Federalism is a new phenomenon in Nepali politics, and informed debates on it are required for its right evolution. This research will add value in this regard. Moreover, tracking its evolution and exploring the attitudes and behaviors of key actors and stakeholders in a new experiment with a new governing system is also important. The administrative and political system of Gandaki province in terms of service delivery and development will be critically examined. Besides demonstrating the performance of the provincial government and assembly, it will analyze the inter-governmental relations of Gandaki with the other two tiers of government. For this research, people from provincial and local governments (elected representatives and government employees), provincial assembly members, academicians, civil society leaders and journalists are being interviewed. The interview findings will be analyzed by supplementing them with published documents. Just going into the federal structure is not the solution. As in the case of other provincial governments, Gandaki also had to start from scratch. It gradually took the shape of a government and has been functioning sluggishly. The provincial government has many challenges ahead, which have badly hindered its plans and actions. Additionally, fundamental laws, infrastructure and human resources are found to be insufficient at the sub-national level. Lack of clarity in jurisdiction is another main challenge. The Nepali Constitution assumes cooperation, coexistence and coordination as the fundamental principles of federalism, which, unfortunately, appear to be lacking among the three tiers of government despite their efforts. Though the devolution of power to sub-national governments is essential for the successful implementation of federalism, it has apparently been delayed due to the centralized mentality of the bureaucracy as well as political leaders. This research will highlight the reasons for the delay in the implementation of federalism. There might be multiple underlying reasons for the slow pace of implementation of federalism, and identifying them is very tough. Moreover, the federal spirit is found to be absent in the main players of today's political system, which is a big irony. So, there are some doubts about whether the federal system in Nepal is just a keepsake or something substantive.
Keywords: federalism, inter-governmental relations, Nepal, provincial government
Procedia PDF Downloads 189
954 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses
Authors: Matthew Baucum
Abstract:
With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g. “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
Keywords: FMRI, machine learning, meta-analysis, text analysis
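The core arithmetic of this approach is compact enough to sketch directly: a voxel vector is the normalized sum of the word vectors from studies reporting activation in that voxel, and its association with a term (or an averaged set of terms) is their cosine similarity. The vectors below are random stand-ins for embeddings learned from a study corpus.

```python
# Voxel-as-vector arithmetic: normalized sum of study word vectors, scored against
# a term vector by cosine similarity. Random vectors stand in for learned embeddings.
import numpy as np

rng = np.random.default_rng(4)
dim = 100
word_vecs = {w: rng.normal(size=dim) for w in
             ["reward", "social", "vision", "decision", "agent", "mind"]}

def unit(v):
    return v / np.linalg.norm(v)

# Words from the studies that showed activation in this voxel
voxel_words = ["reward", "decision", "decision", "agent"]
voxel_vec = unit(np.sum([word_vecs[w] for w in voxel_words], axis=0))

# A term of interest may be one word or the average of several ("theory of mind")
term_vec = unit(np.mean([word_vecs[w] for w in ["social", "agent", "mind"]], axis=0))

cosine = float(voxel_vec @ term_vec)
print(f"voxel-term cosine similarity: {cosine:.3f}")
```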
Procedia PDF Downloads 449
953 Formation of the Water Assisted Supramolecular Assembly in the Transition Structure of Organocatalytic Asymmetric Aldol Reaction: A DFT Study
Authors: Kuheli Chakrabarty, Animesh Ghosh, Atanu Roy, Gourab Kanti Das
Abstract:
The aldol reaction is an important class of carbon-carbon bond forming reactions. One of the popular ways to impose asymmetry in the aldol reaction is the introduction of a chiral auxiliary that binds the approaching reactants and creates dissymmetry in the reaction environment, which finally evolves into enantiomeric excess in the aldol products. The last decade witnessed the use of natural amino acids as chiral auxiliaries to control stereoselectivity in various carbon-carbon bond forming processes. In this context, L-proline was found to be an effective organocatalyst in asymmetric aldol additions. In the last few decades, the use of water as a solvent or co-solvent in asymmetric organocatalytic reactions has increased sharply. Simple amino acids like L-proline do not catalyze the asymmetric aldol reaction in aqueous medium; moreover, in organic solvent medium a high catalytic loading (~30 mol%) is required to achieve moderate to high asymmetric induction. In this context, huge efforts have been made to modify L-proline and 4-hydroxy-L-proline to prepare organocatalysts for the aqueous medium asymmetric aldol reaction. Here, we report the results of our DFT calculations on the asymmetric aldol reaction of benzaldehyde, p-NO2 benzaldehyde and t-butyraldehyde with a number of ketones using L-proline hydrazide as organocatalyst under wet solvent-free conditions. The Gaussian 09 program package and the GaussView program were used for the present work. Geometry optimizations were performed using the B3LYP hybrid functional and the 6-31G(d,p) basis set. Transition structures were confirmed by Hessian calculations and IRC calculations. As the reactions were carried out under solvent-free conditions, no solvent effects were studied theoretically. The present study has revealed, for the first time, the direct involvement of two water molecules in the aldol transition structures. In the TS, the enamine and the aldehyde are connected through hydrogen bonding with the assistance of two intervening water molecules, forming a supramolecular network. The formation of this type of supramolecular assembly is possible due to the presence of the protonated -NH2 group in the L-proline hydrazide moiety, which is responsible for the favorable entropy contribution to the aldol reaction. It is also revealed from the present study that the water-assisted TS is energetically more favorable than the TS without the involvement of any water molecule. It can be concluded from this study that the insertion of a polar group capable of hydrogen-bond formation into the L-proline skeleton can lead to a favorable aldol reaction with significantly high enantiomeric excess under wet solvent-free conditions by reducing the activation barrier of this reaction.
Keywords: aldol reaction, DFT, organocatalysis, transition structure
Procedia PDF Downloads 435
952 Multi-Objective Optimization of an Aerodynamic Feeding System Using Genetic Algorithm
Authors: Jan Busch, Peter Nyhuis
Abstract:
Considering the challenges of short product life cycles and growing variant diversity, cost minimization and manufacturing flexibility increasingly gain importance to maintain a competitive edge in today’s global and dynamic markets. In this context, an aerodynamic part feeding system for high-speed industrial assembly applications has been developed at the Institute of Production Systems and Logistics (IFA), Leibniz Universitaet Hannover. The aerodynamic part feeding system outperforms conventional systems with respect to its process safety, reliability, and operating speed. In this paper, a multi-objective optimization of the aerodynamic feeding system regarding the orientation rate, the feeding velocity and the required nozzle pressure is presented.
Keywords: aerodynamic feeding system, genetic algorithm, multi-objective optimization, workpiece orientation
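A compact sketch of the optimization setting is given below: the decision variables are feeding-system settings, and the orientation rate, feeding velocity and nozzle pressure are combined here with a simple weighted sum inside a small genetic algorithm. The objective model, weights and GA settings are invented for illustration; the actual study may well use a Pareto-based formulation rather than scalarization.

```python
# Minimal GA sketch with a weighted-sum scalarization of the three objectives
# (orientation rate up, feeding velocity up, nozzle pressure down). Illustrative only.
import numpy as np

rng = np.random.default_rng(5)
bounds = np.array([[0.0, 1.0],     # normalized nozzle pressure
                   [0.0, 1.0]])    # normalized feeding velocity

def objectives(x):
    pressure, velocity = x
    orientation_rate = np.exp(-4 * (pressure - 0.6) ** 2) * (1 - 0.3 * velocity)  # toy model
    return orientation_rate, velocity, pressure

def fitness(x, w=(0.6, 0.3, 0.1)):
    orient, vel, press = objectives(x)
    return w[0] * orient + w[1] * vel - w[2] * press    # maximized

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))
for gen in range(60):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]                                     # truncation selection
    children = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.05, (40, 2))  # mutation
    pop = np.clip(children, bounds[:, 0], bounds[:, 1])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best (pressure, velocity):", np.round(best, 3),
      " objectives:", np.round(objectives(best), 3))
```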
Procedia PDF Downloads 577
951 Predictive Pathogen Biology: Genome-Based Prediction of Pathogenic Potential and Countermeasures Targets
Authors: Debjit Ray
Abstract:
Horizontal gene transfer (HGT) and recombination lead to the emergence of bacterial antibiotic resistance and pathogenic traits. HGT events can be identified by comparing a large number of fully sequenced genomes across a species or genus, defining the phylogenetic range of HGT, and finding potential sources of new resistance genes. In-depth comparative phylogenomics can also identify subtle genome or plasmid structural changes or mutations associated with phenotypic changes. Comparative phylogenomics requires accurately sequenced, complete and properly annotated genomes of the organism. Assembling closed genomes requires additional mate-pair reads or “long read” sequencing data to accompany short-read paired-end data. To bring down the cost and time required to produce assembled genomes and annotate genome features that inform drug resistance and pathogenicity, we are analyzing the genome assembly performance of data from the Illumina NextSeq, which has faster throughput than the Illumina HiSeq (~1-2 days versus ~1 week) and shorter reads (150 bp paired-end versus 300 bp paired-end) but higher capacity (150-400M reads per run versus ~5-15M) compared to the Illumina MiSeq. Bioinformatics improvements are also needed to make rapid, routine production of complete genomes a reality. Modern assemblers such as SPAdes 3.6.0 running on a standard Linux blade are capable, in a few hours, of converting mixes of reads from different library preps into high-quality assemblies with only a few gaps. Remaining breaks in scaffolds, generally due to repeats (e.g., rRNA genes), are addressed by our software for gap closure, which avoids custom PCR or targeted sequencing. Our goal is to improve the understanding of the emergence of pathogenesis using sequencing, comparative genomics, and machine learning analysis of ~1000 pathogen genomes. Machine learning algorithms will be used to digest the diverse features (changes in virulence genes, recombination, horizontal gene transfer, patient diagnostics). Temporal data and evolutionary models can thus determine whether the origin of a particular isolate is likely to have been from the environment (could it have evolved from previous isolates). This can be useful for comparing differences in virulence along or across the tree. More intriguingly, it can test whether there is a direction to virulence strength. This would open new avenues in the prediction of uncharacterized clinical bugs and of multidrug resistance evolution and pathogen emergence.
Keywords: genomics, pathogens, genome assembly, superbugs
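For readers less familiar with the assembly step, the toy sketch below shows the idea behind de Bruijn graph assemblers such as SPAdes: reads are broken into k-mers, overlapping (k-1)-mers define graph edges, and unambiguous paths are merged into contigs, with the walk stopping at branch points, which is exactly how repeats fragment assemblies. Real assemblers add error correction, paired-end and mate-pair information, and repeat resolution on top of this; the reads here are invented.

```python
# Toy de Bruijn graph assembly: break reads into k-mers, link (k-1)-mer overlaps,
# and greedily extend a contig while the path stays unambiguous.
from collections import defaultdict

reads = ["ACGTACGGT", "GTACGGTAC", "CGGTACCTT"]
k = 5

graph = defaultdict(list)                  # (k-1)-mer -> successor (k-1)-mers
for read in reads:
    for i in range(len(read) - k + 1):
        kmer = read[i:i + k]
        graph[kmer[:-1]].append(kmer[1:])

start = "ACGT"
contig, node = start, start
while len(graph[node]) == 1:               # stop at the first branching node (a "repeat")
    node = graph[node][0]
    contig += node[-1]
print("contig:", contig)
```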
Procedia PDF Downloads 197
950 Automated, Objective Assessment of Pilot Performance in Simulated Environment
Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt
Abstract:
Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the utilization of powerful computational tools. Because technology has outpaced methodology, the vast majority of training-related work is done by human instructors. This makes assessment inefficient and vulnerable to instructors’ subjectivity. The research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification, Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission Task Elements (MTEs). The core element of the gOAT is an enhanced algorithm that provides the instructor with a new set of information; in detail, a set of objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of physiological changes in the pilot’s psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor’s station and shown in a dedicated graphical interface. The presented tool is based on open-source solutions and is flexible for editing. Additional manoeuvres can be easily added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used allow not only the implementation of basic stress-level measurements but also a significant reduction of the instructor’s workload. The tool developed can be used for training purposes, as well as for periodic checks of the aircrew. Flexibility and ease of modification allow wide-ranging further development and customization of the tool. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
Keywords: automated assessment, flight simulator, human factors, pilot training
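The flavour of the automated scoring can be illustrated with a toy MTE check: compare a recorded parameter trace (here, hover altitude) against "desired" and "adequate" tolerance bands and grade by time-in-band. The bands, the 90% rule and the trace are invented for illustration and simplify the actual ADS-33 performance standards.

```python
# Toy MTE scoring: fraction of time a recorded parameter stays inside tolerance bands.
import numpy as np

rng = np.random.default_rng(6)
target_alt = 20.0                                       # ft, hover reference
altitude = target_alt + rng.normal(0, 2.0, size=600)    # 60 s trace at 10 Hz

error = np.abs(altitude - target_alt)
desired_band, adequate_band = 2.0, 4.0                  # ft, hypothetical tolerances

pct_desired = 100 * np.mean(error <= desired_band)
pct_adequate = 100 * np.mean(error <= adequate_band)

if pct_desired >= 90:
    grade = "desired"
elif pct_adequate >= 90:
    grade = "adequate"
else:
    grade = "inadequate"
print(f"desired band: {pct_desired:.0f}%  adequate band: {pct_adequate:.0f}%  grade: {grade}")
```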
Procedia PDF Downloads 150
949 Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools
Authors: Loke Mun Sei
Abstract:
Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, Robot Framework, and Jenkins.
Keywords: test automation tools, test case, test execution, test reporting
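As a small illustration of gluing these tools together, the sketch below runs a Robot Framework suite from Python and then walks the generated output.xml to summarize test statuses, the kind of summary one might forward to a Jira issue or archive in a Jenkins job. The suite directory is hypothetical; only documented Robot Framework APIs (robot.run, ExecutionResult, ResultVisitor) are used.

```python
# Hedged sketch: run a Robot Framework suite programmatically and summarize results.
import robot
from robot.api import ExecutionResult, ResultVisitor

failed = robot.run("tests/smoke", outputdir="results")   # return value = number of failed tests
print("failed tests:", failed)

class StatusCollector(ResultVisitor):
    def __init__(self):
        self.rows = []
    def visit_test(self, test):
        self.rows.append((test.name, test.status))        # status is PASS/FAIL/SKIP

result = ExecutionResult("results/output.xml")
collector = StatusCollector()
result.visit(collector)
for name, status in collector.rows:
    print(f"{status:5s}  {name}")
```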
Procedia PDF Downloads 583
948 Multi-Layer Mn-Doped SnO2 Thin Film for Multi-State Resistive Switching
Authors: Zhemi Xu, Dewei Chu, Sean Li
Abstract:
Well self-assembled pure and Mn-doped SnO2 nanocubes were synthesized by an interface thermodynamic method, which is ideal for highly homogeneous, large-scale thin film deposition on flexible substrates for various electric devices. Mn-doped SnO2 shows very good resistive switching, with a high On/Off ratio (over 10³), endurance and retention characteristics. More importantly, the resistive state can be tuned by multi-layer fabrication, alternating pure SnO2 and Mn-doped SnO2 nanocube layers, which effectively improves the memory capacity of the resistive switching. Thus, such a method provides transparent, multi-level resistive switching for next-generation non-volatile memory applications.
Keywords: metal oxides, self-assembly nanoparticles, multi-level resistive switching, multi-layer thin film
Procedia PDF Downloads 345
947 Measuring Fluctuating Asymmetry in Human Faces Using High-Density 3D Surface Scans
Authors: O. Ekrami, P. Claes, S. Van Dongen
Abstract:
Fluctuating asymmetry (FA) has been studied for many years as an indicator of developmental stability or ‘genetic quality’, based on the assumption that perfect symmetry is ideally the expected outcome for a bilateral organism. Further studies have also investigated the possible link between FA and attractiveness or levels of masculinity or femininity. These hypotheses have mostly been examined using 2D images, and the structure of interest is usually represented by a limited number of landmarks. Such methods have the downside of simplifying and reducing the dimensionality of the structure, which will in turn increase the error of the analysis. In an attempt to reach more conclusive and accurate results, in this study we have used high-resolution 3D scans of human faces and have developed an algorithm to measure and localize FA, taking a spatially dense approach. A symmetric, spatially dense anthropometric mask with paired vertices is non-rigidly mapped onto target faces using an Iterative Closest Point (ICP) registration algorithm. A set of 19 manually indicated landmarks was used to examine the precision of our mapping step. The protocol’s accuracy in measuring and localizing FA is assessed using simulated faces with known amounts of asymmetry added to them. The validation results show that the algorithm is perfectly capable of locating and measuring FA in 3D simulated faces. With the use of such an algorithm, the additional information captured on asymmetry can be used to improve studies of FA as an indicator of fitness or attractiveness. This algorithm can be of great benefit especially in studies with a high number of subjects, due to its automated and time-efficient nature. Additionally, taking a spatially dense approach provides us with information about the locality of FA, which is impossible to obtain using conventional methods. It also enables us to analyze the asymmetry of morphological structures in a multivariate manner; this can be achieved by using methods such as Principal Components Analysis (PCA) or Factor Analysis, which can be a step towards understanding the underlying processes of asymmetry. This method can also be used in combination with genome-wide association studies to help unravel the genetic bases of FA. To conclude, we introduce an algorithm to study and analyze asymmetry in human faces, with the possibility of extending the application to other morphological structures, in an automated, accurate and multivariate framework.
Keywords: developmental stability, fluctuating asymmetry, morphometrics, 3D image processing
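The landmark-level version of the asymmetry calculation is sketched below: mirror the configuration about the sagittal plane, relabel paired left/right points, Procrustes-superimpose the original and mirrored forms, and use the residual as an individual asymmetry score. The 3D landmarks are synthetic, and a spatially dense mesh works the same way with thousands of paired vertices instead of a handful of landmarks.

```python
# Sketch of a landmark-based asymmetry score via reflection, relabelling and
# Procrustes superimposition. Landmarks below are synthetic.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(7)
n_pairs, n_midline = 8, 3
left = rng.normal(size=(n_pairs, 3)) + np.array([1.0, 0, 0])
right = left * np.array([-1, 1, 1]) + rng.normal(0, 0.02, size=(n_pairs, 3))  # slight asymmetry
midline = rng.normal(size=(n_midline, 3)) * np.array([0.05, 1, 1])
config = np.vstack([left, right, midline])

# Mirror about the x = 0 (sagittal) plane, then swap the left/right labels
mirrored = config * np.array([-1, 1, 1])
relabel = np.r_[np.arange(n_pairs, 2 * n_pairs), np.arange(n_pairs),
                np.arange(2 * n_pairs, 2 * n_pairs + n_midline)]
mirrored = mirrored[relabel]

_, _, disparity = procrustes(config, mirrored)   # residual after optimal superimposition
print(f"asymmetry score (Procrustes disparity): {disparity:.4f}")
```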
Procedia PDF Downloads 141
946 Study of the Behavior of Bolted Joints with and Without Reinforcement
Authors: Karim Akkouche
Abstract:
Many methods have been developed for characterizing the behavior of bolted joints. However, for certain stiffener configurations, no guidance has been given regarding their modeling. As a result, a multitude of gross errors can arise in reproducing the propagation of forces and in representing the deformation modes. Considering these particularities, a numerical investigation was carried out in our laboratory. In this paper, we present a comparative study of three types of assemblies. Non-linear 3D modeling was chosen, given that it takes into consideration geometric and material non-linearity, using the finite element code ABAQUS. Initially, we evaluated the influence of the presence of each stiffener on the "global" behavior of the assemblies by analyzing their moment-rotation curves and by referring to the classification system proposed by NF EN 1993-1-8, which is based on the design moment resistance Mj,Rd and the initial rotational stiffness Sj,ini. In a second step, we evaluated the "local" behavior of their components by referring to the stress-strain curves.
Keywords: assembly, post-beam, end plate, nonlinearity
Procedia PDF Downloads 74
945 Simple Fabrication of Au (111)-Like Electrode and Its Applications to Electrochemical Determination of Dopamine and Ascorbic Acid
Authors: Zahrah Thamer Althagafi, Mohamed I. Awad
Abstract:
A simple method for the fabrication of an Au (111)-like electrode via the controlled reductive desorption of a cysteine monolayer pre-adsorbed onto a polycrystalline gold (poly-Au) electrode is introduced. Then, the voltammetric behaviour of dopamine (DA) and ascorbic acid (AA) on the thus-modified electrode is investigated. Electrochemical characterization of the modified electrode is achieved using cyclic voltammetry and square wave voltammetry. For the binary mixture of DA and AA, the results showed that the Au (111)-like electrode exhibits excellent electrocatalytic activity towards the oxidation of DA and AA. This allows highly selective and simultaneous determination of DA and AA. The effect of various experimental parameters on the voltammetric responses of DA and AA was investigated. The enrichment of the Au (111) facet of the poly-Au electrode is thought to be behind the electrocatalytic activity.
Keywords: gold electrode, electroanalysis, electrocatalysis, monolayers, self-assembly, cysteine, dopamine, ascorbic acid
Procedia PDF Downloads 195
944 Safety Conditions Analysis of Scaffolding on Construction Sites
Authors: M. Pieńko, A. Robak, E. Błazik-Borowa, J. Szer
Abstract:
This paper presents the results of an analysis of 100 full-scale scaffolding structures in terms of compliance with legal acts and safety of use. In 2016 and 2017, the authors examined scaffolds in Poland erected at buildings that were at the construction or renovation stage. The basic elements affecting the safety of scaffolding use, such as anchors, supports, platforms, guardrails and toe-boards, have been taken into account. All of these elements were checked in each of the considered scaffoldings. Based on the analyzed scaffoldings, the most common errors concerning the assembly process and the use of scaffolding were collected. Legal acts on scaffoldings are not always clear, and this causes many issues. In practice, people realize how dangerous the use of incomplete scaffolds is only when an accident occurs. Despite the fact that scaffolding should ensure the safety of its users, most accidents on construction sites are caused by falls from height.
Keywords: façade scaffolds, load capacity, practice, safety of people
Procedia PDF Downloads 403
943 Computational Fluid Dynamics (CFD) Calculations of the Wind Turbine with an Adjustable Working Surface
Authors: Zdzislaw Kaminski, Zbigniew Czyz, Krzysztof Skiba
Abstract:
This paper discusses the CFD simulation of the flow around the rotor of a Vertical Axis Wind Turbine. Numerical simulation, unlike experiments, enables us to validate project assumptions at the design stage and avoid the costly preparation of a model or a prototype for a bench test. CFD simulation enables us to compare characteristics of the aerodynamic forces acting on the rotor working surfaces and to define operational parameters like the torque or power generated by a turbine assembly. This research focused on a rotor with blades capable of modifying their working surfaces, i.e. absorbing wind kinetic energy. The operation of this rotor is based on adjusting the angular aperture α of the top and bottom parts of the blades mounted on an axis. If this angular aperture α increases, the working surface which absorbs wind kinetic energy also increases. The operation of turbines is characterized by parameters like the angular aperture of the blades, power, torque, and speed for a given wind speed. These parameters have an impact on the efficiency of the assemblies. The distribution of forces acting on the working surfaces in our turbine changes according to the angular velocity of the rotor. Moreover, the resultant of the forces acting on the advancing and retreating blades should be as high as possible. This paper is part of research to improve the efficiency of the rotor assembly. Therefore, using simulation, the courses of the above parameters were studied over three full rotations, individually for each of the blades, for three angular apertures of the blade working surfaces, i.e. 30°, 60°, 90°, at three wind speeds, i.e. 4 m/s, 6 m/s, 8 m/s, and rotor speeds ranging from 100 to 500 rpm. Finally, characteristics of the torque coefficients and power as a function of time were created for each blade separately and for the entire rotor. This gave the turbine rotor power as a function of wind speed for varied values of the rotor rotational speed. By processing these data, the correlation between the power of the turbine rotor and its rotational speed for each angular aperture of the working surfaces was specified. Finally, the optimal values, i.e. those giving the highest output power for given wind speeds, were read. The research results in the basic characteristics of turbine rotor power as a function of wind speed for the three angular apertures of the blades. Given the nature of rotor operation, growth in turbine output can be expected if the angular aperture of the blades increases. The controlled adjustment of angle α enables a smooth adjustment of the power generated by the turbine rotor. If the wind speed is significant, this type of adjustment enables the output power to remain at the same level (by reducing angle α) with no risk of damaging the construction. This work has been financed by the Polish Ministry of Science and Higher Education.
Keywords: computational fluid dynamics, numerical analysis, renewable energy, wind turbine
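The post-processing described, relating torque and power to rotor speed at a given wind speed, reduces to a few lines: compute P = T·ω and the power coefficient Cp = P / (0.5·ρ·A·V³), then read off the rotational speed with the highest output. The torque values below are made-up placeholders, not the CFD results.

```python
# Post-processing sketch: power and power coefficient vs rotor speed at one wind speed.
import numpy as np

rho, area, v_wind = 1.225, 1.2, 6.0            # kg/m^3, m^2 swept area, m/s (assumed)
rpm = np.arange(100, 501, 50)
torque = np.array([0.9, 1.2, 1.4, 1.5, 1.45, 1.3, 1.1, 0.8, 0.5])   # N*m placeholders

omega = rpm * 2 * np.pi / 60.0                 # rad/s
power = torque * omega                         # W, P = T * omega
cp = power / (0.5 * rho * area * v_wind ** 3)  # power coefficient

best = np.argmax(power)
print(f"optimum ~{rpm[best]} rpm: P = {power[best]:.1f} W, Cp = {cp[best]:.3f}")
```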
Procedia PDF Downloads 217