Search results for: SIMD (Single Instruction Multiple Data) computers
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30660

30240 Ultrasound-Guided Single Shot Peripheral Nerve Block (ESP and PECS-2 Blocks) as a Sole Surgical Anesthesia on a 73-Year-Old Female with Phyllodes Tumor Undergoing Mastectomy in a Tertiary Hospital in Marawi City: A Case Report

Authors: Jaria A. Polog, Norjana Lao-Pimping, Bedoria M. Macabalang, Monrizah M. Guiling

Abstract:

Ultrasound-guided single-shot peripheral nerve blocks (PNB) such as erector spinae plane block (ESPB) and pectoral nerve block (PECS I and II) are emerging anesthetic techniques used to provide complete surgical anesthesia without general anesthesia and adequate postoperative analgesia for breast surgeries. Peripheral nerve blocks with sedation could be a better option for breast surgery than general anesthesia in patients with multiple preexisting comorbidities and those refusing general anesthesia. However, an anesthesiologist with optimum knowledge of regional anesthesia and ultrasound experience is a key factor in the success of performing PNB.

Keywords: mastectomy, peripheral nerve block, erector spinae block, PECS-2 block

Procedia PDF Downloads 13
30239 Inappropriate Effects Which the Use of Computer and Playing Video Games Have on Young People

Authors: Maja Ruzic-Baf, Mirjana Radetic-Paic

Abstract:

The use of computers by children has many positive aspects, including the development of memory, learning methods, problem-solving skills and the feeling of one’s own competence and self-confidence. Playing online video games can encourage hanging out with peers having similar interests as well as communication; it develops coordination, spatial relations and presentation. On the other hand, the Internet enables quick access to different information and the exchange of experiences. How kids use computers and what the negative effects of this can be depends on various factors. ICT has improved and become easily accessible to everyone. In the past 12 years, a great many video games have been made, even to the level that some of them are free to play. Young people, and even some adults, have simply started to forget about the real outside world because in that other, digital world, they have found something that makes them feel more worthy as a person. This article presents the use of ICT, forms of behaviour and addictions to online video games.

Keywords: addiction to video games, behaviour, ICT, young people

Procedia PDF Downloads 530
30238 Enhancer: An Effective Transformer Architecture for Single Image Super Resolution

Authors: Pitigalage Chamath Chandira Peiris

Abstract:

A widely researched domain in the field of image processing in recent times has been single image super-resolution, which tries to restore a high-resolution image from a single low-resolution image. Many single image super-resolution efforts have been completed utilizing both traditional and deep learning methodologies. Deep learning-based super-resolution methods, in particular, have received significant interest. As of now, the most advanced image restoration approaches are based on convolutional neural networks; nevertheless, only a few efforts have been made using Transformers, which have demonstrated excellent performance on high-level vision tasks. The effectiveness of CNN-based algorithms in image super-resolution has been impressive. However, these methods cannot completely capture the non-local features of the data. Enhancer is a simple yet powerful Transformer-based approach for enhancing the resolution of images. In this study, a method for single image super-resolution was developed that utilizes an efficient and effective transformer design. The proposed architecture makes use of a locally enhanced window transformer block to alleviate the enormous computational load associated with non-overlapping window-based self-attention. Additionally, it incorporates depth-wise convolution in the feed-forward network to enhance its ability to capture local context. The study is assessed by comparing the results obtained on popular datasets to those obtained by other techniques in the domain.
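
For illustration, a minimal PyTorch sketch of the two ingredients named above, window-restricted self-attention followed by a feed-forward network with a depth-wise convolution. This is an interpretation of the described design, not the authors' code; dimensions, window size, and head counts are assumptions.

```python
import torch
import torch.nn as nn

class LeWinBlock(nn.Module):
    """Window self-attention + depth-wise-conv FFN (illustrative sketch)."""
    def __init__(self, dim=64, heads=4, window=8):
        super().__init__()
        self.window = window
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        # Depth-wise conv (groups=dim) injects local context into the FFN.
        self.ffn = nn.Sequential(
            nn.Conv2d(dim, dim, 3, padding=1, groups=dim),
            nn.GELU(),
            nn.Conv2d(dim, dim, 1),
        )

    def forward(self, x):                       # x: (B, C, H, W), H, W divisible by window
        B, C, H, W = x.shape
        w = self.window
        # Partition into non-overlapping w x w windows: (B*nWin, w*w, C).
        t = x.view(B, C, H // w, w, W // w, w)
        t = t.permute(0, 2, 4, 3, 5, 1).reshape(-1, w * w, C)
        t = self.norm1(t)
        a, _ = self.attn(t, t, t)               # attention stays inside each window
        t = t + a
        # Undo the window partition back to (B, C, H, W).
        t = t.view(B, H // w, W // w, w, w, C)
        t = t.permute(0, 5, 1, 3, 2, 4).reshape(B, C, H, W)
        # FFN with depth-wise convolution, plus residual.
        n = self.norm2(t.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        return t + self.ffn(n)

x = torch.randn(1, 64, 32, 32)
print(LeWinBlock()(x).shape)                    # torch.Size([1, 64, 32, 32])
```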

Keywords: single image super resolution, computer vision, vision transformers, image restoration

Procedia PDF Downloads 89
30237 Determinants of Repeated Abortion among Women of Reproductive Age Attending Health Facilities in Northern Ethiopia: A Case-Control Study

Authors: Henok Yebyo Henok, Araya Abrha Araya, Alemayehu Bayray Alemayehu, Gelila Goba Gelila

Abstract:

Background: Every year, an estimated 19–20 million unsafe abortions take place, almost all in developing countries, leading to 68,000 deaths and millions more injured, many permanently. Many women throughout the world experience more than one abortion in their lifetimes. Repeat abortion is an indicator of the larger problem of unintended pregnancy. This study aimed to identify determinants of repeat abortion in Tigray Region, Ethiopia. Methods: An unmatched case-control study was conducted in hospitals in Tigray Region, Northern Ethiopia, from November 2014 to June 2015. The sample included 105 cases and 204 controls, recruited from among women seeking abortion care at public hospitals. Clients having two or more abortions (“repeat abortion”) were taken as cases, and those who had a total of one abortion were taken as controls (“single abortion”). Cases were selected consecutively based on proportional-to-size allocation, while systematic sampling was employed for controls. Data were analyzed using SPSS version 20.0. Binary and multivariable logistic regression analyses were calculated with 95% CI. Results: Mean age of cases was 24 years (±6.85) and 22 years (±6.25) for controls. 79.0% of cases had their sexual debut before 18 years of age, compared to 57% of controls. 42.2% of controls and 23.8% of cases cited rape as the reason for having an abortion. Not understanding one's fertility cycle and when one is most likely to conceive after menstruation (adjusted odds ratio [AOR]=2.0, 95% confidence interval [CI]: 1.1-3.7), having a previous abortion using medication (AOR=3.3, CI: 1.83-6.11), having multiple sexual partners in the preceding 12 months (AOR=4.4, CI: 2.39-8.45), perceiving that the abortion procedure is not painful (AOR=2.3, CI: 1.31-4.26), initiating sexual intercourse before the age of 18 years (AOR=2.7, CI: 1.49-5.23) and disclosure to a third party about terminating the pregnancy (AOR=2.1, CI: 1.2-3.83) were independent predictors of repeat abortion. Conclusion: This study identified several factors correlated with women having repeat abortions. It may be helpful for the Government of Ethiopia to encourage women to delay sexual debut and decrease their number of sexual partners, including by promoting discussion within families about sexuality, to decrease the occurrence of repeat abortion.
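
For readers unfamiliar with the reported statistics, a hedged sketch of how adjusted odds ratios with 95% CIs fall out of a multiple logistic regression. The study itself used SPSS 20.0; all variable names and data below are synthetic stand-ins, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 309                                          # 105 cases + 204 controls
df = pd.DataFrame({                              # hypothetical binary predictors
    "early_sexual_debut": rng.integers(0, 2, n),
    "multiple_partners": rng.integers(0, 2, n),
    "prior_medication_abortion": rng.integers(0, 2, n),
})
df["repeat_abortion"] = rng.integers(0, 2, n)    # stand-in outcome

X = sm.add_constant(df.drop(columns="repeat_abortion"))
model = sm.Logit(df["repeat_abortion"], X).fit()
# Exponentiated coefficients are the adjusted odds ratios (AOR).
summary = pd.concat([np.exp(model.params).rename("AOR"),
                     np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                    axis=1)
print(summary)
```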

Keywords: abortion, Ethiopia, repeated abortion, single abortion

Procedia PDF Downloads 254
30236 [Keynote Talk]: Computer-Assisted Language Learning (CALL) for Teaching English to Speakers of Other Languages (TESOL/ESOL) as a Foreign Language (TEFL/EFL), Second Language (TESL/ESL), or Additional Language (TEAL/EAL)

Authors: Andrew Laghos

Abstract:

Computer-assisted language learning (CALL) is defined as the use of computers to help learn languages. In this study we look at several different types of CALL tools and applications and how they can assist adults and young learners in learning the English language as a foreign, second or additional language. It is important to identify the roles of the teacher and the learners, and what the learners’ motivations are for learning the language. Audio, video, interactive multimedia games, online translation services, conferencing, chat rooms, discussion forums, social networks, social media, email communication, songs and music video clips are just some of the many ways computers are currently being used to enhance language learning. CALL may be used for classroom teaching as well as for online and mobile learning. Advantages and disadvantages of CALL are discussed, and the study ends with future predictions for CALL.

Keywords: computer-assisted language learning (CALL), teaching English as a foreign language (TEFL/EFL), adult learners, young learners

Procedia PDF Downloads 410
30235 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only one copy, i.e., a single instance of the data to be stored, though indexing of all the data is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removes all types of duplicated data from the cloud.
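
A minimal sketch of the single-instance storage idea described above, using content hashing, a common approach. The authors' Java implementation additionally handles authorization and the hybrid-cloud split, which this sketch omits.

```python
import hashlib

store = {}      # content hash -> the single stored copy
pointers = {}   # file name -> content hash (the "pointer" to the primary copy)

def put(name: str, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:          # first copy: actually store the bytes
        store[digest] = data
    pointers[name] = digest          # duplicates only get a pointer

def get(name: str) -> bytes:
    return store[pointers[name]]

put("a.txt", b"same payload")
put("b.txt", b"same payload")        # duplicate: no extra storage consumed
assert get("a.txt") == get("b.txt") and len(store) == 1
```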

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 366
30234 Determination of Suitability Between Single Phase Induction Motor and Load

Authors: Nakarin Prempri

Abstract:

Single-phase induction motors are widely used in industry. Most manufacturing processes use capacitor-run single-phase induction motors to drive mechanical loads. The selection of a suitable motor for driving is important, and the optimum operating range of the motor can help the motor operate efficiently. Thus, this paper presents an operating range analysis of capacitor-run single-phase induction motors and a determination of suitability between motor and mechanical loads. An observational study found that the optimum operating range of the motor can be used to determine the suitability between the motor and the mechanical load. Such considerations ensure that the motor uses no more current than necessary and operates efficiently.

Keywords: single phase induction motor, operating range, torque curve, efficiency curve

Procedia PDF Downloads 88
30233 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings that create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with the state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
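
As an illustration of step (iv), a minimal PyTorch sketch of attention-based multiple instance learning over read embeddings. Dimensions are assumptions, and steps (i)-(iii) are replaced here by random stand-in inputs rather than learned embeddings.

```python
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, emb_dim=128, hidden=64):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(emb_dim, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1))
        self.clf = nn.Linear(emb_dim, 1)

    def forward(self, reads):                         # reads: (n_reads, emb_dim)
        w = torch.softmax(self.score(reads), dim=0)   # one attention weight per read
        bag = (w * reads).sum(dim=0)                  # weighted bag embedding
        return torch.sigmoid(self.clf(bag)), w.squeeze(-1)

reads = torch.randn(500, 128)           # stand-in for learned read embeddings
prob, weights = AttentionMIL()(reads)   # weights expose each read's influence
print(prob.item(), weights.shape)
```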

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 105
30232 Robotic Assisted vs Traditional Laparoscopic Partial Nephrectomy Peri-Operative Outcomes: A Comparative Single Surgeon Study

Authors: Gerard Bray, Derek Mao, Arya Bahadori, Sachinka Ranasinghe

Abstract:

The EAU currently recommends partial nephrectomy as the preferred management for localised cT1 renal tumours, irrespective of surgical approach. With the advent of robotic assisted partial nephrectomy, there is growing evidence that warm ischaemia time may be reduced compared to the traditional laparoscopic approach. There is still no clear differences between the two approaches with regards to other peri-operative and oncological outcomes. Current limitations in the field denote the lack of single surgeon series to compare the two approaches as other studies often include multiple operators of different experience levels. To the best of our knowledge, this study is the first single surgeon series comparing peri-operative outcomes of robotic assisted and laparoscopic PN. The current study aims to reduce intra-operator bias while maintaining an adequate sample size to assess the differences in outcomes between the two approaches. We retrospectively compared patient demographics, peri-operative outcomes, and renal function derangements of all partial nephrectomies undertaken by a single surgeon with experience in both laparoscopic and robotic surgery. Warm ischaemia time, length of stay, and acute renal function deterioration were all significantly reduced with robotic partial nephrectomy, compared to laparoscopic nephrectomy. This study highlights the benefits of robotic partial nephrectomy. Further prospective studies with larger sample sizes would be valuable additions to the current literature.

Keywords: partial nephrectomy, robotic assisted partial nephrectomy, warm ischaemia time, peri-operative outcomes

Procedia PDF Downloads 124
30231 Non-Linear Vibration and Stability Analysis of an Axially Moving Beam with Rotating-Prismatic Joint

Authors: M. Najafi, F. Rahimi Dehgolan

Abstract:

In this paper, the dynamic model of a single-link flexible beam with a tip mass is derived using Hamilton's principle. The link undergoes both rotational and translational motion, and it is assumed that the beam moves with a harmonic velocity about a constant mean velocity. Non-linearity is introduced by including the non-linear strain in the analysis. The dynamic model is obtained under the Euler-Bernoulli beam assumption using the modal expansion method. The effects of rotary inertia, axial force, and the associated boundary conditions on the dynamic model are also analyzed. Since the complex boundary value problem cannot be solved analytically, the multiple scales method is utilized to obtain an approximate solution. Finally, the effects of the non-linear term and the mean velocity on the natural frequencies and the system stability are discussed.
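
For orientation, one common form of the governing equation for a non-linear axially moving Euler-Bernoulli beam is given below. This is illustrative only; the paper's full model additionally carries the tip mass, rotary inertia, and the rotating-prismatic joint kinematics.

```latex
% Transverse deflection w(x,t), density \rho, area A, flexural rigidity EI,
% axial tension P, and harmonic transport velocity about a mean v_0:
\rho A\left(\frac{\partial^2 w}{\partial t^2}
  + 2v\,\frac{\partial^2 w}{\partial x\,\partial t}
  + \dot{v}\,\frac{\partial w}{\partial x}
  + v^2\,\frac{\partial^2 w}{\partial x^2}\right)
  + EI\,\frac{\partial^4 w}{\partial x^4}
  - \left(P + \frac{EA}{2L}\int_0^L
      \left(\frac{\partial w}{\partial x}\right)^{\!2}\,dx\right)
    \frac{\partial^2 w}{\partial x^2} = 0,
\qquad v(t) = v_0 + v_1 \sin\Omega t .
```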

Keywords: non-linear vibration, stability, axially moving beam, bifurcation, multiple scales method

Procedia PDF Downloads 347
30230 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals

Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti

Abstract:

Advancements in computer technology have made it possible to obtain information for research in biology and neuroscience. To make sense of the data from these studies, networks have long been used to represent important biological processes, changing the use of these tools from purely illustrative and didactic to more analytic, even including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, calling for new perspectives of development in neuroinformatics using existing models of tools already disseminated in bioinformatics. This study includes an analysis of neurological data through electroencephalogram (EEG) signals, using Cytoscape, an open source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed in research at the University of Rio Grande (FURG), using the EEG signals from a Brain Computer Interface (BCI) with 32 electrodes placed on the heads of a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways to use and adapt techniques supporting the treatment of brain signal data, to elevate understanding and learning in neuroscience.
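
A hedged sketch of the underlying workflow: build a channel-correlation network from EEG signals and export it in a format Cytoscape can import. The 32-channel array, random stand-in data, and the 0.7 threshold are illustrative assumptions.

```python
import numpy as np
import networkx as nx

eeg = np.random.randn(32, 5000)        # stand-in: 32 electrodes x time samples
corr = np.corrcoef(eeg)                # channel-by-channel correlation matrix

G = nx.Graph()
G.add_nodes_from(range(32))
for i in range(32):
    for j in range(i + 1, 32):
        if abs(corr[i, j]) > 0.7:      # keep only strongly coupled channels
            G.add_edge(i, j, weight=float(corr[i, j]))

nx.write_graphml(G, "eeg_network.graphml")   # GraphML imports into Cytoscape
```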

Keywords: neuroinformatics, bioinformatics, network tools, brain mapping

Procedia PDF Downloads 156
30229 Performance Analysis in 5th Generation Massive Multiple-Input-Multiple-Output Systems

Authors: Jihad S. Daba, Jean-Pierre Dubois, Georges El Soury

Abstract:

Fifth generation wireless networks guarantee significant capacity enhancement to suit more clients and services at higher information rates with better reliability while consuming less power. The deployment of massive multiple-input-multiple-output technology guarantees broadband wireless networks with the use of base station antenna arrays to serve a large number of users on the same frequency and time-slot channels. In this work, we evaluate the performance of massive multiple-input-multiple-output (MIMO) systems in 5th generation cellular networks in terms of capacity and bit error rate. Several cases were considered and analyzed to compare the performance of massive MIMO systems while varying the number of antennas at both transmitting and receiving ends. We found that, unlike classical MIMO systems, reducing the number of transmit antennas while increasing the number of antennas at the receiver end provides a better solution to performance enhancement. In addition, enhanced orthogonal frequency division multiplexing and beam division multiple access schemes further improve the performance of massive MIMO systems and make them more reliable.
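
A small simulation sketch of the kind of comparison described, using the standard ergodic capacity formula C = log2 det(I + (SNR/Nt) H H^H) for an i.i.d. Rayleigh channel with equal power allocation. Antenna counts, SNR, and trial counts are assumptions, not the paper's parameters.

```python
import numpy as np

def ergodic_capacity(nt, nr, snr_db=10.0, trials=2000):
    snr = 10 ** (snr_db / 10)
    caps = []
    for _ in range(trials):
        # i.i.d. complex Gaussian channel, unit average power per entry
        H = (np.random.randn(nr, nt) + 1j * np.random.randn(nr, nt)) / np.sqrt(2)
        C = np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * H @ H.conj().T)).real
        caps.append(C)
    return np.mean(caps)             # bits/s/Hz, averaged over channel draws

# Fewer transmit antennas, more receive antennas, as the abstract suggests:
print(ergodic_capacity(nt=4, nr=64), ergodic_capacity(nt=64, nr=4))
```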

Keywords: beam division multiple access, D2D communication, enhanced OFDM, fifth generation broadband, massive MIMO

Procedia PDF Downloads 240
30228 Improving Image Summarization Using Image Processing and the Particle Swarm Optimization Algorithm

Authors: Hooman Torabifard

Abstract:

In the last few years, with the progress of technology and the entry of computers and artificial intelligence into all kinds of scientific and industrial fields, lifestyles have changed and, in general, the way humans live has seen many changes and developments. Some of these changes have occurred in the context of digital images and image processing, and the trend still continues. However, besides all the benefits, there have been disadvantages. One of these disadvantages is the multiplicity of images with high volume and data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their usefulness. The general method used for this purpose consists of a set of techniques based on data obtained from image processing and on the PSO (particle swarm optimization) algorithm. In the remainder of this paper, the method used is elaborated in detail.
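
A minimal sketch, assumptions throughout rather than the paper's exact method, of PSO searching for an image threshold, here scored by Otsu's between-class variance on a grey-level histogram.

```python
import numpy as np

def between_class_variance(hist, t):
    """Otsu's criterion for a threshold t on a 256-bin histogram."""
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = (np.arange(t) * p[:t]).sum() / w0
    mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

hist = np.random.randint(1, 500, 256).astype(float)   # stand-in histogram
pos = np.random.uniform(1, 255, 20)                   # particle positions
vel = np.zeros(20)
pbest, gbest = pos.copy(), pos[0]
for _ in range(50):
    for i in range(20):                 # update personal and global bests
        if between_class_variance(hist, int(pos[i])) > between_class_variance(hist, int(pbest[i])):
            pbest[i] = pos[i]
        if between_class_variance(hist, int(pbest[i])) > between_class_variance(hist, int(gbest)):
            gbest = pbest[i]
    r1, r2 = np.random.rand(20), np.random.rand(20)
    # inertia + cognitive + social terms (standard PSO update)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 1, 255)
print("PSO-selected threshold:", int(gbest))
```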

Keywords: image summarization, particle swarm optimization, image threshold, image processing

Procedia PDF Downloads 116
30227 From Theory to Practice: An Iterative Design Process in Implementing English Medium Instruction in Higher Education

Authors: Linda Weinberg, Miriam Symon

Abstract:

While few institutions of higher education in Israel offer international programs taught entirely in English, many Israeli students today can study at least one content course taught in English during their degree program. In particular, with the growth of international partnerships and opportunities for student mobility, English medium instruction is a growing phenomenon. There are, however, no official guidelines in Israel for how to develop and implement content courses in English and no training to help lecturers prepare for teaching their materials in a foreign language. Furthermore, the implications for the students and the nature of the courses themselves have not been sufficiently considered. In addition, the institution must have lecturers who are able to teach these courses effectively in English. An international project funded by the European Union addresses these issues, and a set of guidelines which provide guidance for lecturers in adapting their courses for delivery in English has been developed. A train-the-trainer approach is adopted in order to cascade knowledge and experience in English medium instruction from experts to language teachers and on to content teachers, thus maximizing the scope of professional development. To accompany training, a model English medium course has been created which serves the dual purpose of highlighting alternatives to the frontal lecture while integrating language learning objectives with content goals. This course can also be used as a standalone content course. The development of the guidelines and of the course utilized backwards, forwards and central design in an iterative process. The goals for combined language and content outcomes were identified first, after which a suitable framework for achieving these goals was constructed. The assessment procedures evolved through collaboration between content and language specialists and subsequently were put into action during a piloting phase. Feedback from the piloting teachers and from the students highlights the need for clear channels of communication to encourage frank and honest discussion of expectations versus reality. While much of what goes on in the English medium classroom requires no better teaching skills than are required in any classroom, the understanding of students' abilities to achieve reasonable learning outcomes in a foreign language must be rationalized and accommodated within the course design. Concomitantly, preparatory language classes must be able to adapt to prepare students for the specific language and cognitive skills and activities that courses conducted in English require. This paper presents findings from the implementation of a purpose-designed English medium instruction course arrived at through an iterative backwards, forwards and central design process utilizing feedback from students and lecturers alike, leading to suggested guidelines for English medium instruction in higher education.

Keywords: English medium instruction, higher education, iterative design process, train-the-trainer

Procedia PDF Downloads 283
30226 On Flow Consolidation Modelling in Urban Congested Areas

Authors: Serban Stere, Stefan Burciu

Abstract:

The challenging and continuously growing competition in the urban freight transport market emphasizes the need for optimal planning of transportation processes, in terms of identifying solutions for consolidating traffic flows in congested urban areas. The aim of the present paper is to present the mathematical framework and propose a methodology for combining urban traffic flows between the distribution centers located at the boundary of a congested urban area. The three scenarios regarding traffic flow between consolidation centers that are taken into consideration in the paper are based on the same characteristics of traffic flows. The scenarios differ in terms of the accessibility of the four consolidation centers given by the infrastructure, the connections between them, and the possibility of consolidating traffic flows for one or multiple destinations. Synthetic indicators allow us to compare the scenarios considered and choose the one indicated for our distribution system.
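
One natural way to formalize such a scenario (an assumption of this note, not necessarily the paper's formulation) is as a minimum-cost flow problem over the consolidation centers, sketched here on a toy network.

```python
import networkx as nx

G = nx.DiGraph()
G.add_node("origin", demand=-10)              # 10 units entering at the boundary
G.add_node("dest", demand=10)
for c in ["C1", "C2", "C3", "C4"]:            # four consolidation centers
    G.add_edge("origin", c, weight=2, capacity=6)
    G.add_edge(c, "dest", weight=3, capacity=6)
G.add_edge("C1", "C2", weight=1, capacity=4)  # inter-center consolidation link

flow = nx.min_cost_flow(G)                    # cheapest feasible routing
print(flow, nx.cost_of_flow(G, flow))
```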

Keywords: distribution system, single and multiple destinations, urban consolidation centers, traffic flow consolidation schemes

Procedia PDF Downloads 139
30225 The Simulation of Superfine Animal Fibre Fractionation: The Strength Variation of Fibre

Authors: Sepehr Moradi

Abstract:

This study investigates the contribution of individual Australian Superfine Merino Wool (ASFW) and Inner Mongolia Cashmere (IMC) fibre strength behaviour to the breaking force variation (CVBF) and minimum fibre diameter (CVₘFD) induced by actual single fibre lengths and by combined length and diameter groups. Mid-side samples were selected for the ASFW (n = 919) and IMC (n = 691), since they are assumed to represent the average of the whole fleece. The average (LₘFD) varied for ASFW and IMC by 36.6% and 33.3% from the shortest to the longest actual single fibre length, and by -21.2% and -21.7% between the longest-coarsest and shortest-finest groups, respectively. The tensile properties of single animal fibres were characterised using a Single Fibre Analyser (SIFAN 4). After normalising for diversity in fibre diameter at the position of breakage, the parameters that explain the strength behaviour within actual fibre lengths and combined length-diameter groups are the Intrinsic Fibre Strength (IFS) (MPa), Min IFS (MPa), Max IFS (MPa) and Breaking Force (BF) (cN). The average strength of single fibres varied extensively within actual length groups and within combined length-diameter groups. IFS ranged for ASFW and IMC from 419 to 355 MPa (-15.2% range) and from 353 to 319 MPa (-9.6% range), and BF from 2.2 to 3.6 cN (63.6% range) and from 3.2 to 5.3 cN (65.6% range) from the shortest to the longest groups, respectively. Single fibre properties showed no differences within actual length groups or within combined length-diameter groups, nor was there a strong interaction with single fibre strength (P > 0.05) between retaining and removing length-diameter groups. Longer-coarser fibre fractionation had a significant effect on BF and IFS, and all of the length groups showed considerable variance in single fibre strength that is accounted for by diversity in the diameter variation along the fibre. There are many concepts for improving the stress-strain properties of animal fibres as a means of raising single fibre strength through simultaneous changes in fibre length and diameter. Fibre fractionation over a given length, either directly for single fibre strength or using the variation traits of fibre diameter, is an important process used to increase the strength of the single fibre.
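
For clarity, the normalization behind IFS: breaking force divided by the cross-sectional area at the point of breakage, so that strength comparisons are not confounded by diameter. A small sketch with made-up values, assuming a circular cross-section.

```python
import math

def ifs_mpa(breaking_force_cn: float, diameter_um: float) -> float:
    """Intrinsic fibre strength in MPa from breaking force (cN) and diameter (um)."""
    force_n = breaking_force_cn / 100.0                  # cN -> N
    area_m2 = math.pi * (diameter_um * 1e-6) ** 2 / 4    # circular cross-section
    return force_n / area_m2 / 1e6                       # Pa -> MPa

print(round(ifs_mpa(3.6, 18.0)))   # e.g. a longer, coarser fibre: ~141 MPa
```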

Keywords: single animal fibre fractionation, actual length groups, strength variation, length-diameter groups, diameter variation along fibre

Procedia PDF Downloads 181
30224 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, the single event transient problem becomes a more significant issue. Single event transients have a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of the circuits at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient randomly in the circuit and detects the soft error during the execution of the test patterns. We verified this model by inserting a scan chain in an 8051 microprocessor using 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we insert a single event transient and count the number of soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% larger than that of other modules.
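
An abstract sketch of the fault-injection loop described above: flip one random state bit per run and count runs whose outputs diverge from the golden (fault-free) run. The "circuit" below is a stand-in hash with some logical masking, not an 8051 model.

```python
import random

def run(pattern, fault_bit=None):
    state = pattern
    if fault_bit is not None:
        state ^= 1 << fault_bit          # single event transient as a bit flip
    # Stand-in for circuit logic; dropping the low bits models logical masking,
    # so some injected faults never reach an observable output.
    return (state >> 8) * 2654435761 % 2**32

patterns = [random.getrandbits(32) for _ in range(100)]   # ATPG-style patterns
golden = [run(p) for p in patterns]                       # fault-free reference
soft_errors = sum(
    run(p, fault_bit=random.randrange(32)) != g
    for p, g in zip(patterns, golden)
)
print(f"soft errors detected: {soft_errors}/100")
```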

Keywords: scan chain, single event transient, soft error, 8051 processor

Procedia PDF Downloads 327
30223 Factors Related to Teachers’ Analysis of Classroom Assessments

Authors: Hussain A. Alkharusi, Said S. Aldhafri, Hilal Z. Alnabhani, Muna Alkalbani

Abstract:

Analysing classroom assessments is one of the responsibilities of the teacher. It aims at improving the teacher's instruction and assessment as well as student learning. The present study investigated factors that might explain variation in teachers' practices regarding the analysis of classroom assessments. The factors considered in the investigation included gender, in-service assessment training, teaching load, teaching experience, knowledge in assessment, attitude towards quantitative aspects of assessment, and self-perceived competence in analysing assessments. Participants were 246 in-service teachers in Oman. Results of a stepwise multiple linear regression analysis revealed that self-perceived competence was the only significant factor explaining the variance in teachers' analysis of assessments. Implications for research and practice are discussed.

Keywords: analysis of assessment, classroom assessment, in-service teachers, self-competence

Procedia PDF Downloads 312
30222 A Double Acceptance Sampling Plan for Truncated Life Test Having Exponentiated Transmuted Weibull Distribution

Authors: A. D. Abdellatif, A. N. Ahmed, M. E. Abdelaziz

Abstract:

The main purpose of this paper is to design a double acceptance sampling plan under the time truncated life test when the product lifetime follows an exponentiated transmuted Weibull distribution. Here, the motive is to meet both the consumer’s risk and producer’s risk simultaneously at the specified quality levels, while the termination time is specified. A comparison between the results of the double and single acceptance sampling plans is conducted. We demonstrate the applicability of our results to real data sets.
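
A sketch of the operating logic of a double acceptance sampling plan: accept outright if the first sample's defectives d1 <= c1, reject if d1 > c2, otherwise draw a second sample and accept iff d1 + d2 <= c2. Plan parameters below are illustrative; in the paper, the failure probability p would come from the exponentiated transmuted Weibull CDF evaluated at the truncation time.

```python
from scipy.stats import binom

def p_accept(p, n1, n2, c1, c2):
    """Probability of acceptance for a double sampling plan at lot quality p."""
    pa = binom.cdf(c1, n1, p)                      # accepted on the first sample
    for d1 in range(c1 + 1, c2 + 1):               # undecided: take second sample
        pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return pa

print(p_accept(p=0.05, n1=20, n2=20, c1=0, c2=2))  # one point on the OC curve
```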

Keywords: double sampling plan, single sampling plan, producer’s risk, consumer’s risk, exponentiated transmuted Weibull distribution, time truncated experiment, Marshall-Olkin

Procedia PDF Downloads 470
30221 An Intelligent Thermal-Aware Task Scheduler in Multiprocessor System on a Chip

Authors: Sina Saadati

Abstract:

Multiprocessor Systems-on-Chip (MPSoCs) are widely used in modern computers to execute sophisticated software and applications. These systems include different processors for distinct aims. Most of the proposed task schedulers attempt to improve energy consumption. In some schedulers, the processor's temperature is considered in order to increase the system's reliability and performance. In this research, we propose a new method for thermal-aware task scheduling based on an artificial neural network (ANN). This method enables us to consider a variety of factors in the scheduling process. Factors like ambient temperature, season (which is important for some embedded systems), processor speed, and the computational type of the tasks have a complex relationship with the final temperature of the system. This issue can be solved using a machine learning algorithm. Another point is that our solution makes the system intelligent, so that it can be adaptive. We also show that the computational complexity of the proposed method is low. As a consequence, it is also suitable for battery-powered systems.
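
A conceptual sketch of the idea, not the paper's model: an ANN predicts the temperature a task would cause on each core, and the scheduler assigns the task to the coolest prediction. Features, sizes, and the untrained network are all assumptions; training is omitted.

```python
import torch
import torch.nn as nn

# Predicts final temperature from: [ambient_temp, season, task_type, core_speed, core_load]
temp_model = nn.Sequential(nn.Linear(5, 16), nn.ReLU(), nn.Linear(16, 1))

def schedule(task_feats, cores):
    preds = []
    for core in cores:   # predict the resulting temperature per candidate core
        x = torch.tensor(task_feats + core, dtype=torch.float32)
        preds.append(temp_model(x).item())
    return min(range(len(cores)), key=lambda i: preds[i])   # coolest core wins

cores = [[2.4, 0.3], [1.8, 0.7], [3.2, 0.1]]     # (speed GHz, load) per core
print("assign task to core", schedule([35.0, 1.0, 0.0], cores))
```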

Keywords: task scheduling, MPSoC, artificial neural network, machine learning, computer architecture, artificial intelligence

Procedia PDF Downloads 89
30220 Computer Self-Efficacy, Study Behaviour and Use of Electronic Information Resources in Selected Polytechnics in Ogun State, Nigeria

Authors: Fredrick Olatunji Ajegbomogun, Bello Modinat Morenikeji, Okorie Nancy Chituru

Abstract:

Electronic information resources are highly relevant to students' academic and research needs but are grossly underutilized, despite the institutional commitment to making them available. The under-utilisation of these resources could be attributed to a low level of study behaviour coupled with a low level of computer self-efficacy. This study assessed computer self-efficacy, study behaviour, and the use of electronic information resources by students in selected polytechnics in Ogun State. A simple random sampling technique using Krejcie and Morgan's (1970) table was used to select 370 respondents for the study. A structured questionnaire was used to collect data from respondents. Data were analysed using frequency counts, percentages, means, standard deviations, Pearson Product Moment Correlation (PPMC) and multiple regression analysis. Results reveal that the internet (mean = 1.94), YouTube (mean = 1.74), and search engines (mean = 1.72) were the common information resources available to the students, while the internet (mean = 4.22) is the most utilized resource. Major reasons for using electronic information resources were to source materials and information (mean = 3.30), for research (mean = 3.25), and to augment class notes (mean = 2.90). The majority (91.0%) of the respondents have a high level of computer self-efficacy in the use of electronic information resources through selecting from screen menus (mean = 3.12), using data files (mean = 3.10), and efficient use of computers (mean = 3.06). Good preparation for tests (mean = 3.27), examinations (mean = 3.26), and organization of tutorials (mean = 3.11) are the common study behaviours of the respondents. Overall, 93.8% have good study behaviour. Inadequate computer facilities to access information (mean = 3.23) and poor internet access (mean = 2.87) were the major challenges confronting students' use of electronic information resources. According to the PPMC results, study behaviour (r = 0.280) and computer self-efficacy (r = 0.304) have significant (p < 0.05) relationships with the use of electronic information resources. Regression results reveal that self-efficacy (β = 0.214) and study behaviour (β = 0.122) positively (p < 0.05) influenced students' use of electronic information resources. The study concluded that students' use of electronic information resources depends on the purpose, their computer self-efficacy, and their study behaviour. Therefore, the study recommends that management encourage students to improve their study habits and computer skills, as this will enhance their continuous and more effective utilization of electronic information resources.

Keywords: computer self-efficacy, study behaviour, electronic information resources, polytechnics, Nigeria

Procedia PDF Downloads 100
30219 Innovations and Challenges: Multimodal Learning in Cybersecurity

Authors: Tarek Saadawi, Rosario Gennaro, Jonathan Akeley

Abstract:

There is rapidly growing demand for professionals to fill positions in Cybersecurity. This is recognized as a national priority both by government agencies and the private sector. Cybersecurity is a very wide technical area which encompasses all measures that can be taken in an electronic system to prevent criminal or unauthorized use of data and resources. This requires defending computers, servers, networks, and their users from any kind of malicious attack. The need to address this challenge has been recognized globally but is particularly acute in the New York metropolitan area, home to some of the largest financial institutions in the world, which are prime targets of cyberattacks. In New York State alone, there are currently around 57,000 jobs in the Cybersecurity industry, with more than 23,000 unfilled positions. The Cybersecurity Program at City College is a collaboration between the Departments of Computer Science and Electrical Engineering. In Fall 2020, The City College of New York matriculated its first students in the Cybersecurity Master of Science program. The program was designed to fill gaps in the previous offerings and evolved out of an established partnership with Facebook on Cybersecurity Education. City College has designed a program where courses, curricula, syllabi, materials, labs, etc., are developed in cooperation and coordination with industry whenever possible, ensuring that students graduating from the program will have the necessary background to seamlessly segue into industry jobs. The Cybersecurity Program has created multiple pathways for prospective students to obtain the necessary prerequisites to apply, in order to build a more diverse student population. The program can also be pursued on a part-time basis, which makes it available to working professionals. Since City College's Cybersecurity M.S. program was established to equip students with the advanced technical skills needed to thrive in a high-demand, rapidly evolving field, it incorporates a range of pedagogical formats. From its outset, the Cybersecurity program has sought to provide both the theoretical foundations necessary for meaningful work in the field and labs and applied learning projects aligned with the skillsets required by industry. The efforts have involved collaboration with outside organizations and with visiting professors designing new courses on topics such as Adversarial AI, Data Privacy, Secure Cloud Computing, and blockchain. Although the program was initially designed with a single asynchronous course in the curriculum, with the rest of the classes designed to be offered in person, the advent of the COVID-19 pandemic necessitated a move to fully online learning. The shift to online learning has provided lessons for future development by providing examples of some inherent advantages of the medium in addition to its drawbacks. This talk will address the structure of the newly implemented Cybersecurity Master's Program and discuss the innovations, challenges, and possible future directions.

Keywords: cybersecurity, New York, City College, graduate degree, Master of Science

Procedia PDF Downloads 126
30218 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publically Available Collective Data to Decipher Immune Regulation in Early Life

Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr

Abstract:

Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded in a custom, interactive web application called the Gene Expression browser (GXB), designed for visualization and query of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently be used to graphically present data in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof-of-principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses which can be tested in future mechanistic studies.
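
As a hedged illustration of the re-analysis idea, per-gene differential expression between two groups via a t-test with Benjamini-Hochberg correction. The matrix and group labels below are random stand-ins, not GSE25087, and the actual re-analysis may have used different statistics.

```python
import numpy as np
from scipy import stats

expr = np.random.randn(5000, 12)          # genes x samples (stand-in matrix)
groups = np.array([0] * 6 + [1] * 6)      # e.g. regulatory vs conventional T cells

t, p = stats.ttest_ind(expr[:, groups == 0], expr[:, groups == 1], axis=1)
order = np.argsort(p)                     # rank p-values ascending
bh = p[order] * len(p) / (np.arange(len(p)) + 1)        # BH adjustment
bh = np.minimum.accumulate(bh[::-1])[::-1]              # enforce monotonicity
print("genes at FDR < 0.05:", int(np.sum(bh < 0.05)))
```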

Keywords: early-life, GEO datasets, PMCH, interactive query, systems biology

Procedia PDF Downloads 278
30217 High-Throughput, Purification-Free, Multiplexed Profiling of Circulating miRNA for Discovery, Validation, and Diagnostics

Authors: J. Hidalgo de Quintana, I. Stoner, M. Tackett, G. Doran, C. Rafferty, A. Windemuth, J. Tytell, D. Pregibon

Abstract:

We have developed the Multiplexed Circulating microRNA assay that allows the detection of up to 68 microRNA targets per sample. The assay combines particle-based multiplexing, using patented Firefly hydrogel particles, with single-step RT-PCR signal generation. Thus, the Circulating microRNA assay leverages PCR sensitivity while eliminating the need for separate reverse transcription reactions and mitigating amplification biases introduced by target-specific qPCR. Furthermore, the ability to multiplex targets in each well eliminates the need to split valuable samples into multiple reactions. Results from the Circulating microRNA assay are interpreted using the Firefly Analysis Workbench, which allows visualization, normalization, and export of experimental data. To aid discovery and validation of biomarkers, we have generated fixed panels for Oncology, Cardiology, Neurology, Immunology, and Liver Toxicology. Here we present the data from several studies investigating circulating and tumor microRNA, showcasing the ability of the technology to sensitively and specifically detect microRNA biomarker signatures from fluid specimens.

Keywords: biomarkers, biofluids, miRNA, photolithography, flow cytometry

Procedia PDF Downloads 346
30216 Spatially Encoded Hyperspectral Compressive Microscope for Broadband VIS/NIR Imaging

Authors: Lukáš Klein, Karel Žídek

Abstract:

Hyperspectral imaging counts among the most frequently used multidimensional sensing methods. While there are many approaches to capturing a hyperspectral data cube, optical compression is emerging as a valuable tool to reduce the setup complexity and the amount of data storage needed. Hyperspectral compressive imagers have been created in the past; however, they have primarily focused on relatively narrow sections of the electromagnetic spectrum. A broader spectral study of samples can provide helpful information, especially for applications involving the harmonic generation and advanced material characterizations. We demonstrate a broadband hyperspectral microscope based on the single-pixel camera principle. Captured spatially encoded data are processed to reconstruct a hyperspectral cube in a combined visible and near-infrared spectrum (from 400 to 2500 nm). Hyperspectral cubes can be reconstructed with a spectral resolution of up to 3 nm and spatial resolution of up to 7 µm (subject to diffraction) with a high compressive ratio.
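
A toy single-pixel acquisition and reconstruction sketch: random binary masks yield compressive measurements y = Ax, and a sparse scene is recovered by an l1-regularized solve. Sizes, the binary patterns, and the Lasso solver are illustrative choices, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import Lasso

n, m = 16 * 16, 80                       # 256-pixel scene, 80 measurements
x = np.zeros(n)
x[np.random.choice(n, 10, replace=False)] = 1.0     # sparse stand-in scene
A = np.random.choice([0.0, 1.0], size=(m, n))       # DMD-style binary masks
y = A @ x                                # bucket-detector readings, one per mask

# l1-regularized recovery exploits scene sparsity despite m << n.
rec = Lasso(alpha=0.01, max_iter=10000).fit(A, y).coef_
print("relative reconstruction error:",
      np.linalg.norm(rec - x) / np.linalg.norm(x))
```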

Keywords: compressive imaging, hyperspectral imaging, near-infrared spectrum, single-pixel camera, visible spectrum

Procedia PDF Downloads 78
30215 A Game-Based Methodology to Discriminate Executive Function – a Pilot Study With Institutionalized Elderly People

Authors: Marlene Rosa, Susana Lopes

Abstract:

There are few studies that explore the potential of board games as a performance measure, despite the fact that they can be an interesting strategy in the context of frail populations. In fact, board games are immersive strategies that can inhibit the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in the elderly population. Sixteen elderly participants were included: 10 with affected executive functions (G1 – 85.30±6.00 yrs old; 10 male) and 6 with executive functions showing no clinically important changes (G2 – 76.30±5.19 yrs old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), a quickly applicable cognitive screening test (score < 12 means impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. This game features 1 table grid, 1 set of Single Game cards (to play with one hand), Double Game cards (to play simultaneously with two hands), 1 die to plan Single Game mode, cards to plan the Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in time during board game challenges (SD); (ii) number of errors; (iii) execution time (sec). G1 demonstrated high variability in execution time during board game challenges (G1 – 13.0s vs G2 – 0.5s), a higher number of errors (1.40 vs 0.67), and longer execution time (607.80s vs 281.83s). These results demonstrate the potential of implementing board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive functions during performance in the TATI game.

Keywords: board game, aging, executive function, evaluation

Procedia PDF Downloads 127
30214 3D Printing of Dual Tablets: Modified Multiple Release Profiles for Personalized Medicine

Authors: Veronika Lesáková, Silvia Slezáková, František Štěpánek

Abstract:

Additive manufacturing technologies for producing drug dosage forms aimed at personalized medicine applications are promising strategies with several advantages over conventional production methods. One of the emerging technologies is 3D printing, which reduces manufacturing steps and thus allows a significant drop in expenses. A decrease in material consumption is also a highly impactful benefit, as the tested drugs are frequently expensive substances. In addition, 3D printed dosage forms enable increased patient compliance and prevent misdosing, as the dosage forms are carefully designed according to the patient's needs. The incorporation of multiple drugs into a single dosage form further increases the degree of personalization. Our research focuses on the development of 3D printed tablets incorporating multiple drugs (candesartan, losartan) and thermoplastic polymers (e.g., Klucel™ HPC EF). The filaments, an essential feed material for 3D printing, were produced via hot-melt extrusion. Subsequently, the extruded filaments of various formulations were 3D printed into tablets using an FDM 3D printer. We then assessed the influence of the internal structure of the 3D printed tablets and the formulation on dissolution behaviour by obtaining the dissolution profiles of the drugs present in the 3D printed tablets. In conclusion, we have developed tablets containing multiple drugs that provide modified release profiles. The 3D printing experiments demonstrate the high tunability of 3D printing, as each tablet compartment is constructed with a different formulation. Overall, the results suggest that 3D printing technology is a promising manufacturing approach to dual tablet preparation for personalized medicine.

Keywords: 3D printing, drug delivery, hot-melt extrusion, dissolution kinetics

Procedia PDF Downloads 153
30213 Interactive Multiple Functions User Interface

Authors: Manjit Singh Sidhu, Waleed Maqableh, Jee Geak Ying

Abstract:

Tangible user interfaces (TUIs) that employ markers in the augmented reality (AR) environment have hampered the interactivity between the user and the software application. This is because the user lacks focus on visualizing the contents due to the interaction mechanisms, whereby multiple markers may need to be used to perform a particular function. In this research, we have designed a novel TUI where multiple functions can be triggered similarly to a natural keyboard, thus allowing the user to focus more on the digital contents such as 2D/3D graphics, text input, animation and sound. Test results of the user interface with potential users and HCI experts revealed that the multiple-functions user interface was novel, preferred and appreciated more compared to the marker-based user interface.

Keywords: multimedia, augmented reality, engineering, user interface, visualization

Procedia PDF Downloads 431
30212 Hyper-Immunoglobulin E (Hyper-IgE) Syndrome in Skin of Color: A Retrospective Single-Centre Observational Study

Authors: Rohit Kothari, Muneer Mohamed, Vivekanandh K., Sunmeet Sandhu, Preema Sinha, Anuj Bhatnagar

Abstract:

Introduction: Hyper-IgE syndrome is a rare primary immunodeficiency syndrome characterised by a triad of severe atopic dermatitis, recurrent pulmonary infections, and recurrent staphylococcal skin infections. The diagnosis requires a high degree of suspicion and typical clinical features, not a mere rise in serum IgE levels, which may be seen in multiple conditions. Genetic studies are not always possible in a resource-poor setting. This study highlights various presentations of Hyper-IgE syndrome in children with skin of color. Case series: Our study had six children with Hyper-IgE syndrome aged two months to ten years. All had onset in the first ten months of life except one with a late onset at two years. All had a recurrent eczematoid rash which responded poorly to conventional treatment, secondary infection, multiple episodes of hospitalisation for pulmonary infection, and raised serum IgE levels. One case had occasional vesicles, bullae, and crusted plaques over both extremities. Genetic study was possible in only one of them, who was found to have pathogenic homozygous deletions of exons 15 to 18 in the DOCK8 gene, following which he underwent bone marrow transplant (BMT); however, he succumbed to a lower respiratory tract infection two months after BMT. The rest received multiple courses of antibiotics, oral/topical steroids, and cyclosporine intermittently, with variable response. Discussion: Our study highlights the various characteristics, presentations, and management of this rare syndrome in children. Knowledge of these manifestations in skin of color will facilitate early identification and contribute to optimal care of patients, as representative data on this topic are limited in the literature.

Keywords: absolute eosinophil count, atopic dermatitis, eczematous rash, hyper-immunoglobulin E syndrome, pulmonary infection, serum IgE, skin of color

Procedia PDF Downloads 119
30211 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze big data, which is increasing exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop technology. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of actual data. Experimental results showed that our RHadoop system was much faster as the number of data nodes increased. We also compared the performance of our RHadoop with R's lm function and with the biglm package based on big memory. The results showed that our RHadoop was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the data size increases.
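
The reason linear regression parallelizes so well under MapReduce is that each mapper can compute partial X'X and X'y sums on its own data chunk, leaving a single small solve for the reducer. A concept sketch (in Python rather than R/RHadoop; the chunking stands in for Hadoop data nodes):

```python
import numpy as np

# Synthetic data: 1M rows, 5 predictors, known coefficients plus noise.
X = np.random.randn(1_000_000, 5)
y = X @ np.array([1.0, -2.0, 0.5, 3.0, 0.0]) + np.random.randn(1_000_000)

def mapper(X_chunk, y_chunk):                 # runs independently on each node
    return X_chunk.T @ X_chunk, X_chunk.T @ y_chunk

parts = [mapper(Xc, yc) for Xc, yc in
         zip(np.array_split(X, 10), np.array_split(y, 10))]
xtx = sum(p[0] for p in parts)                # reduce: sum the partial matrices
xty = sum(p[1] for p in parts)
print(np.linalg.solve(xtx, xty))              # matches lm()'s coefficients
```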

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 416