Search results for: portable document format
1496 A Biomechanical Perfusion System for Microfluidic 3D Bioprinted Structure
Authors: M. Dimitri, M. Ricci, F. Bigi, M. Romiti, A. Corvi
Abstract:
Tissue engineering has reached a significant milestone with the integration of 3D printing for the creation of complex bioconstructs equipped with vascular networks, crucial for cell maintenance and growth. This study aims to demonstrate the effectiveness of a portable microperfusion system designed to adapt dynamically to the evolving conditions of cell growth within 3D-printed bioconstructs. The microperfusion system was developed to provide a constant and controlled flow of nutrients and oxygen through the vessels integrated into the bioconstruct, replicating in vivo physiological conditions. Through a series of preliminary experiments, we evaluated the system's ability to maintain a favorable environment for cell proliferation and differentiation. Measurements of cell density and viability were performed to monitor the health and functionality of the tissue over time. Preliminary results indicate that the portable microperfusion system not only supports but optimizes cell growth, effectively adapting to changing metabolic needs during bioconstruct maturation. This research opens new perspectives in tissue engineering, demonstrating that a portable microperfusion system can be successfully integrated into 3D-printed bioconstructs, promoting sustainable and uniform cell growth. The implications of this study are far-reaching, with potential applications in regenerative medicine and pharmacological research, providing a platform for the development of functional and complex tissues.
Keywords: biofabrication, microfluidic perfusion system, 4D bioprinting
Procedia PDF Downloads 33
1495 Possibility of Creating Polygon Layers from Raster Layers Obtained by Using Classic Image Processing Software: Case of Geological Map of Rwanda
Authors: Louis Nahimana
Abstract:
Most maps are published in raster or PDF format, and it is not easy to obtain vector layers of published maps. Faced with producing a simplified geological map of the northern Lake Tanganyika countries without geological information in vector format, I tried a method of obtaining vector layers from raster layers created from geological maps of Rwanda and DR Congo in PDF and JPG format. The procedure was as follows. The original raster maps were georeferenced using ArcGIS 10.2. Under Adobe Photoshop, map areas with the same color, corresponding to a lithostratigraphic unit, were selected across the whole map and saved in a dedicated raster layer. Using the same image processing software, each RGB raster layer was converted to grayscale and enhanced before importation into ArcGIS 10. After georeferencing, each lithostratigraphic raster layer was transformed into a multitude of polygons with the tool "Raster to Polygon (Conversion)". Thereafter, the tool "Aggregate Polygons (Cartography)" produced a single polygon layer. By repeating the same steps for each color corresponding to a homogeneous rock unit, it was possible to reconstruct the simplified geological constitution of Rwanda and the Democratic Republic of Congo in vector format. Using the tool "Append (Management)", the vector layers obtained were combined with those from Burundi to produce vector layers of the geology of the northern Lake Tanganyika countries.
Keywords: creating raster layer under image processing software, raster to polygon, aggregate polygons, adobe photoshop
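The per-colour selection step described above, done manually in Photoshop, can be sketched in a few lines. This is an illustrative stand-in rather than the author's workflow; the colours and tolerance below are assumed values:

```python
import numpy as np

def extract_unit_mask(rgb_image, unit_color, tolerance=10):
    """Select all pixels whose colour matches one lithostratigraphic
    unit, mimicking the per-colour selection done in Photoshop.
    rgb_image: H x W x 3 uint8 array; unit_color: (r, g, b) tuple."""
    diff = np.abs(rgb_image.astype(int) - np.array(unit_color, dtype=int))
    return np.all(diff <= tolerance, axis=-1)  # H x W boolean mask

# toy 2x2 "map": two pixels close to an assumed granite colour
img = np.array([[[200, 30, 30], [0, 0, 0]],
                [[10, 10, 10], [201, 29, 31]]], dtype=np.uint8)
mask = extract_unit_mask(img, (200, 30, 30), tolerance=5)
print(mask.sum())  # number of pixels assigned to this unit
```

Each such mask would then go through the "Raster to Polygon" and "Aggregate Polygons" steps in ArcGIS, as in the abstract.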
Procedia PDF Downloads 444
1494 Research and Design on a Portable Intravehicular Ultrasonic Leak Detector for Manned Spacecraft
Authors: Yan Rongxin, Sun Wei, Li Weidan
Abstract:
Based on cascade sound theory in acoustics, the mechanisms of air leak sound production, transmission and signal detection have been analyzed. A formula relating sound power, leak size and air pressure in the spacecraft has been established, and the relationship between leak sound pressure and receiving direction and distance has been studied. The center frequency for a millimeter-diameter leak is above 20 kHz. Air leaking from the spacecraft to space has been simulated, and experiments with different leak sizes, testing distances and directions have been carried out. The sound pressure is directly proportional to the cosine of the angle between the leak and the sensor. The portable ultrasonic leak detector has been developed; its minimum detectable leak rate is 10⁻¹ Pa·m³/s, its testing radius is longer than 20 mm, its mass is less than 1.0 kg, and its electric power consumption is less than 2.2 W.
Keywords: leak testing, manned spacecraft, sound transmitting, ultrasonic
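The reported cosine dependence can be illustrated with a toy calculation. The 1/r spreading term and the reference pressure are assumptions, since the abstract does not give the full formula:

```python
import math

def leak_sound_pressure(p0, angle_deg, distance_m):
    """Received sound pressure from a leak, using the cosine
    directivity reported in the abstract; the simple spherical
    (1/r) spreading term is an added assumption.
    p0: pressure at 1 m on the leak axis (Pa)."""
    return p0 * math.cos(math.radians(angle_deg)) / distance_m

on_axis = leak_sound_pressure(1.0, 0, 2.0)    # directly facing the leak
off_axis = leak_sound_pressure(1.0, 60, 2.0)  # 60 degrees off axis
print(on_axis, off_axis)
```

At 60° off axis the received pressure halves (cos 60° = 0.5), which is why sensor orientation matters when localizing a leak.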
Procedia PDF Downloads 329
1493 A Lost Tradition: Reflections towards Select Tribal Songs of Odisha
Authors: Akshaya K. Rath, Manjit Mahanta
Abstract:
The paper aims at examining the oral tradition of the Kondh and Oroan people of Odisha. Highlighting translated versions of Kondh and Oroan songs, chiefly those addressing agriculture, we argue that the relevance of these songs has faded in recent decades with the advancement of modern knowledge and thinking. What remains instead is a faint voice in the oral tradition that sings the past indigenous knowledge in the form of oral literature. Though there have been a few attempts by individuals to document this rich cultural tradition (Sitakant Mahapatra's can be cited as an example), the need to document the tradition remains ever pressing. In short, the paper examines Kondh and Oroan songs and argues for the need to document the tradition. It also offers a comparative study of the two tribes' agricultural songs, which reveals their cultural identities, the differences between the two tribes, and how both groups are bound to nature and its cycles.
Keywords: oral tradition, Meriah, folklore, karma, Oroan
Procedia PDF Downloads 470
1492 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
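As the abstract does not give the exact scoring formula, a minimal sketch of a three-case, feature-wise similarity might look like this; the per-case scores are illustrative assumptions, not the proposed measure:

```python
def feature_similarity(a, b):
    """Per-feature similarity illustrating the three cases in the
    abstract. The exact scoring is assumed for illustration:
    both present -> graded agreement, one present -> 0,
    both absent -> 1 (absence treated as agreement)."""
    if a > 0 and b > 0:
        return 1 - abs(a - b) / max(a, b)   # case 1: appears in both
    if a == 0 and b == 0:
        return 1.0                           # case 3: appears in neither
    return 0.0                               # case 2: appears in one only

def document_similarity(u, v):
    """Average the per-feature scores over a shared vocabulary."""
    assert len(u) == len(v)
    return sum(feature_similarity(a, b) for a, b in zip(u, v)) / len(u)

d1 = [3, 0, 2, 0]  # term-frequency vectors over a 4-word vocabulary
d2 = [3, 0, 1, 4]
print(document_similarity(d1, d2))
```

The point of distinguishing case 3 from case 2 is that shared absence of a feature can carry information about similarity, which plain cosine similarity ignores.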
Procedia PDF Downloads 518
1491 One-Class Support Vector Machine for Sentiment Analysis of Movie Review Documents
Authors: Chothmal, Basant Agarwal
Abstract:
Sentiment analysis classifies a given review document as having positive or negative polarity. Sentiment analysis research has increased tremendously in recent times due to its large number of applications in industry and academia. Sentiment analysis models can be used to determine the opinion of a user towards any entity or product. E-commerce companies can use sentiment analysis models to improve their products on the basis of users' opinions. In this paper, we propose a new one-class support vector machine (one-class SVM) based sentiment analysis model for movie review documents. In the proposed approach, we initially extract features from one class of documents and then use the one-class SVM model to test whether a given new document lies within the modeled class or is an outlier. Experimental results show the effectiveness of the proposed sentiment analysis model.
Keywords: feature selection methods, machine learning, NB, one-class SVM, sentiment analysis, support vector machine
Procedia PDF Downloads 519
1490 The Appropriate Number of Test Items That a Classroom-Based Reading Assessment Should Include: A Generalizability Analysis
Authors: Jui-Teng Liao
Abstract:
The selected-response (SR) format has been commonly adopted to assess academic reading in both formal and informal testing (i.e., standardized assessment and classroom assessment) because of its strengths in content validity, construct validity, and scoring objectivity and efficiency. When developing a second language (L2) reading test, researchers indicate that the longer the test (i.e., the more test items), the higher the reliability and validity it is likely to produce. However, previous studies have not provided specific guidelines regarding the optimal length of a test or the most suitable number of test items or reading passages. Additionally, reading tests often include different question types (e.g., factual, vocabulary, inferential) that require varying degrees of reading comprehension and cognitive processing. Therefore, it is important to investigate the impact of question types on the number of items in relation to the score reliability of L2 reading tests. Given the popularity of the SR question format and the impact of assessment results on teaching and learning, it is necessary to investigate the degree to which this question format can reliably measure learners' L2 reading comprehension. The present study therefore adopted generalizability (G) theory to investigate the score reliability of the SR format in L2 reading tests, focusing on how many test items a reading test should include. Specifically, this study investigated the interaction between question types and the number of items, providing insights into the appropriate item count for different types of questions. G theory is a comprehensive statistical framework for estimating the score reliability of tests and validating their results. Data were collected from 108 English as a second language students who completed an English reading test comprising factual, vocabulary, and inferential questions in the SR format.
The computer program mGENOVA was utilized to analyze the data using multivariate designs (i.e., scenarios). Based on the results of the G theory analyses, the findings indicated that the number of test items has a critical impact on the score reliability of an L2 reading test. Furthermore, the findings revealed that different types of reading questions require varying numbers of test items for reliable assessment of learners' L2 reading proficiency. Further implications for teaching practice and classroom-based assessments are discussed.
Keywords: second language reading assessment, validity and reliability, generalizability theory, academic reading, question format
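The core intuition, that score reliability grows with the number of items, can be sketched with the relative G coefficient for a persons-by-items design; the variance components below are hypothetical, not the study's estimates:

```python
def g_coefficient(var_person, var_interaction, n_items):
    """Relative generalizability (G) coefficient for a persons x items
    design: true-score (person) variance divided by itself plus the
    person-by-item error variance shrunk by the number of items.
    Shows why adding items raises score reliability."""
    return var_person / (var_person + var_interaction / n_items)

# hypothetical variance components for illustration only
vp, vpi = 0.25, 0.50
for n in (5, 10, 20, 40):
    print(n, round(g_coefficient(vp, vpi, n), 3))
```

Doubling the item count shrinks the error term by half, so the gain in reliability flattens out; this is the trade-off behind choosing an "appropriate" item count per question type.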
Procedia PDF Downloads 91
1489 A Short Study on the Effects of Public Service Advertisement on Gender Bias in Accessible and Non-Accessible Format
Authors: Amrin Moger, Sagar Bhalerao, Martin Mathew
Abstract:
Advertisements play a vital role in the dissemination of information regarding products and services. Advertisements as a mass media tool are not only a source of entertainment but also a source of information and education. They provide information about the outside world and expose us to other ways of life and culture. Public service advertisements (PSAs) are generally aimed at public well-being. The aim of a PSA is not to make a profit but rather to change public opinion and raise awareness in society about a social issue. 'Start with the boys' is one such PSA that aims to create awareness about gender bias, which is taught and prevalent in society. Persons with disabilities (PWDs) are also consumers of PSAs, and people with disabilities face gender bias and discrimination as well; it is a double discrimination. The advertisement selected for the study delivers a strong message on gender bias and therefore must be accessible to everyone, including PWDs. Accessibility of a PSA in digital format can be achieved with the help of Universal Design (UD) in digital media applications. UD features are inclusive in nature and focus on eliminating established barriers through initial design, considering the needs of diverse people, whether or not they have a disability. In this research, two aspects of UD in digital media are used: captioning and Indian Sign Language (ISL). Hence, a short survey study was undertaken to determine the effects of multimedia on gender bias, in accessible format, on persons with and without disability. The results demonstrated a significant difference in opinion on the use of accessible versus non-accessible formats by persons with and without disability, and in their understanding of the message in the selected PSA.
Keywords: public service advertisements, gender, disability, accessibility
Procedia PDF Downloads 355
1488 Design, Construction, Validation and Use of a Novel Portable Fire Effluent Sampling Analyser
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
Current large-scale fire tests focus on flammability and heat release measurements. Smoke toxicity isn't considered, despite it being a leading cause of death and injury in unwanted fires. A key reason could be that the practical difficulties associated with quantifying the individual toxic components present in a fire effluent often require specialist equipment and expertise. Fire effluent contains a mixture of unreactive and reactive gases, water, organic vapours and particulate matter, which interact with each other. This interferes with the operation of the analytical instrumentation and must be removed without changing the concentration of the target analyte. To mitigate the need for expensive equipment and time-consuming analysis, a portable gas analysis system was designed, constructed and tested for use in large-scale fire tests as a simpler and more robust alternative to online FTIR measurements. The novel equipment aimed to be easily portable and able to run on battery or mains electricity; to be calibratable at the test site; to quantify CO, CO2, O2, HCN, HBr, HCl, NOx and SO2 accurately and reliably; to log data independently; to switch automatically between 7 bubblers; to withstand fire effluents; to be simple to operate; to allow individual bubbler times to be pre-set; and to be controllable remotely. To test the analyser's functionality, it was used alongside the ISO/TS 19700 Steady State Tube Furnace (SSTF). A series of tests using PMMA and PA 6.6 was conducted to assess the validity of the box analyser measurements and the data-logging abilities of the apparatus. The data obtained from the bench-scale assessments showed excellent agreement. Following this, the portable analyser was used to monitor gas concentrations during large-scale testing using the ISO 9705 room corner test.
The analyser was set up, calibrated and set to record smoke toxicity measurements in the doorway of the test room. It operated without manual interference and successfully recorded data for all 12 tests conducted in the ISO room tests. At the end of each test, the analyser created a data file (formatted as .csv) containing the gas concentrations measured throughout the test, which does not require specialist knowledge to interpret. This validated the portable analyser's ability to monitor fire effluent without operator intervention at both bench and large scale. The portable analyser is a validated and significantly more practical alternative to FTIR, proven to work in large-scale fire testing for the quantification of smoke toxicity. It is a cheaper, more accessible option for assessing smoke toxicity, mitigating the need for expensive equipment and specialist operators.
Keywords: smoke toxicity, large-scale tests, ISO 9705, analyser, novel equipment
Procedia PDF Downloads 78
1487 Efficient Storage and Intelligent Retrieval of Multimedia Streams Using H.265
Authors: S. Sarumathi, C. Deepadharani, Garimella Archana, S. Dakshayani, D. Logeshwaran, D. Jayakumar, Vijayarangan Natarajan
Abstract:
The need of the hour for customers who use a dial-up or low-broadband connection for their internet services is access to HD video data. This can be achieved by developing a new video format using H.265, the latest video codec standard developed by the ISO/IEC Moving Picture Experts Group (MPEG) and the ITU-T Video Coding Experts Group (VCEG) in April 2013. This new standard for video compression has the potential to deliver higher performance than earlier standards such as H.264/AVC. In comparison with H.264, HEVC offers a clearer, higher-quality image at half the original bitrate. At this lower bitrate, it is possible to transmit high-definition video over low bandwidth. It doubles the data compression ratio, supporting 8K Ultra HD and resolutions up to 8192×4320. In the proposed model, we design a new video format which supports the H.265 standard. The major application areas in the coming future would be enhancements in the performance of digital television services such as Tata Sky and Sun Direct, Blu-ray Discs, mobile video, video conferencing, and internet and live video streaming.
Keywords: access HD video, H.265 video standard, high performance, high quality image, low bandwidth, new video format, video streaming applications
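The bandwidth argument can be made concrete with the rule of thumb the abstract cites, roughly half the H.264 bitrate at comparable quality; the figures below are illustrative assumptions, not measured values:

```python
def hevc_bitrate_mbps(h264_bitrate_mbps):
    """HEVC's reported ~50% bitrate saving over H.264 at comparable
    quality (a rule of thumb from the abstract, not a guarantee)."""
    return h264_bitrate_mbps / 2.0

h264_hd = 8.0                        # assumed H.264 1080p bitrate, Mbps
hevc_hd = hevc_bitrate_mbps(h264_hd)
fits_low_broadband = hevc_hd <= 5.0  # e.g. a 5 Mbps connection
print(hevc_hd, fits_low_broadband)
```

Under these assumed figures, an HD stream that needed 8 Mbps in H.264 fits a 5 Mbps low-broadband link once re-encoded in HEVC.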
Procedia PDF Downloads 355
1486 Algorithm for Information Retrieval Optimization
Authors: Kehinde K. Agbele, Kehinde Daniel Aruleba, Eniafe F. Ayetiran
Abstract:
When using Information Retrieval Systems (IRS), users often present search queries made of ad-hoc keywords. It is then up to the IRS to obtain a precise representation of the user's information need and the context of the information. This paper investigates optimization of IRS to individual information needs in order of relevance. The study addressed the development of algorithms that optimize the ranking of documents retrieved from an IRS. It discusses and describes a Document Ranking Optimization (DROPT) algorithm for information retrieval (IR) in an Internet-based or designated-databases environment. As the volume of information available online and in designated databases grows continuously, ranking algorithms can play a major role in the context of search results. In this paper, a DROPT technique for documents retrieved from a corpus is developed with respect to document index keywords and the query vectors. This is based on calculating the weight (…).
Keywords: information retrieval, document relevance, performance measures, personalization
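Since the weight formula is truncated in this excerpt, a standard tf-idf keyword weight can stand in to illustrate ranking by index-keyword weight; this is not the DROPT-specific weighting:

```python
import math

def tfidf_weight(term_count, doc_length, n_docs, docs_with_term):
    """Classic tf-idf weight of an index keyword in one document."""
    tf = term_count / doc_length
    idf = math.log(n_docs / docs_with_term)
    return tf * idf

def rank(docs, term, n_docs, docs_with_term):
    """Order token-list documents by descending weight of `term`."""
    return sorted(docs,
                  key=lambda d: tfidf_weight(d.count(term), len(d),
                                             n_docs, docs_with_term),
                  reverse=True)

corpus = [["ranking", "search", "web"],
          ["search", "engine", "search", "search"]]
ranked = rank(corpus, "search", n_docs=10, docs_with_term=5)
print(ranked[0])  # document with the highest weight for "search"
```

A personalized scheme like DROPT would adjust such weights per user; the ranking skeleton stays the same.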
Procedia PDF Downloads 242
1485 Revolutionizing Geographic Data: CADmapper's Automated Precision in CAD Drawing Transformation
Authors: Toleen Alaqqad, Kadi Alshabramiy, Suad Zaafarany, Basma Musallam
Abstract:
CADmapper is a significant software tool for transforming geographic data into realistic CAD drawings. It speeds up and simplifies the conversion process by automating it, allowing architects, urban planners, engineers, and geographic information system (GIS) experts to concentrate solely on the imaginative and scientific parts of their projects. While the future incorporation of AI has the potential for further improvements, CADmapper's current capabilities make it an indispensable asset in the business. It covers a combination of 2D and 3D city and urban-area models. The user selects a specific square section of the map to view, and the fee is based on the dimensions of the area being viewed. The procedure is straightforward: you choose the area you want, pick whether or not to include topography and 3D architectural data (if available), and then select the design program or CAD style in which to publish the document. More than 200 broad town plans are available free in DXF format, and a bespoke area is free up to 1 km².
Keywords: CADmapper, geodata, 2D and 3D data conversion, automated CAD drawing, urban planning software
Procedia PDF Downloads 68
1484 Designing, Manufacturing and Testing a Portable Tractor Unit Biocoal Harvester Combine of Agriculture and Animal Wastes
Authors: Ali Moharrek, Hosein Mobli, Ali Jafari, Ahmad Tabataee Far
Abstract:
Biomass is material generally produced by plants living on soil or water and their derivatives. The remains of agricultural and forest products contain biomass which is convertible into fuel; in addition, biogas and ethanol can be obtained from the charcoal produced from biomass through specific processes. This technology was designed as a useful native fuel and technology for energy disaster management, i.e., for the sudden interruption of the flow of heat energy. One of the problems mankind will confront in the future is the limitation of fossil energy, which necessitates the production of new energies such as biomass. In order to produce biomass from the remains of plants, different methods shall be applied considering factors like cost of production, production technology, area of requirement, speed of work, ease of use, etc. In this article we focus on designing a portable biomass briquetting machine for installation on an MF 258 tractor. A screw press is used in the design. The power needed to run the machine, estimated at 17.4 kW, is provided by the power take-off (PTO) of the tractor. The pressing speed of the machine is 375 RPM. Finally, the physical and mechanical properties of the product were compared with those of the input material, with appropriate outcomes. The machine gathers raw materials from the ground with its head section; while being delivered to the briquetting section, they are crushed, milled and pre-heated in the transmission section. It is a combine portable tractor-unit machine and can use all types of agricultural, forest and livestock residues as raw material to make biofuel. The briquetting section was manufactured and successfully made biofuel from sawdust; the machine also made a biofuel with ethanol from sugarcane wastes.
This machine uses the PTO power source for briquetting and a hydraulic power source for pre-processing of raw materials.
Keywords: biomass, briquette, screw press, sawdust, animal wastes, portable, tractors
Procedia PDF Downloads 316
1483 Current Status and Future Trends of Mechanized Fruit Thinning Devices and Sensor Technology
Authors: Marco Lopes, Pedro D. Gaspar, Maria P. Simões
Abstract:
This paper reviews the different concepts that have been investigated concerning the mechanization of fruit thinning, as well as multiple working principles and solutions that have been developed for feature extraction of horticultural products, both in the field and in industrial environments. Research should be directed towards selective methods, which inevitably need to incorporate some kind of sensor technology. Computer vision often comes out as an obvious solution for unstructured detection problems, although leaves frequently occlude fruits regardless of the chosen point of view. Further research on non-traditional sensors that are capable of object differentiation is needed. Ultrasonic and near-infrared (NIR) technologies have been investigated for applications related to horticultural produce and show potential to satisfy this need while simultaneously providing spatial information as time-of-flight sensors. Light Detection and Ranging (LIDAR) technology also shows huge potential, but it implies much greater costs and the related equipment is usually much larger, making it less suitable for portable devices, which may serve a purpose on smaller unstructured orchards. Concerning sensor methods for on-tree fruit detection, the major challenge is to overcome the occlusion of fruits by leaves and branches. Hence, non-traditional sensors capable of providing some type of differentiation should be investigated.
Keywords: fruit thinning, horticultural field, portable devices, sensor technologies
Procedia PDF Downloads 140
1482 Portable Water Treatment for Flood Resilience
Authors: Alireza Abbassi Monjezi, Mohammad Hasan Shaheed
Abstract:
Flooding, caused by excessive rainfall, monsoons, cyclones and tsunamis, is a common disaster in many countries of the world, especially low-lying countries connected to the sea. A stand-alone, self-powered water filtration module for decontamination of floodwater has been designed and modeled. A combined forward osmosis – low-pressure reverse osmosis (FO-LPRO) system powered by solar photovoltaic-thermal (PVT) energy is investigated, which could overcome the main barriers to water supply for remote areas and ensure off-grid filtration. The proposed system is designed to be small-scale and portable so as to provide on-site potable water to communities that are no longer mobile themselves and cannot be reached quickly by aid agencies. FO is an osmotically driven process that uses osmotic pressure gradients to drive water across a controlled-pore membrane from a feed solution (low osmotic pressure) to a draw solution (high osmotic pressure). This reduces the demand for high hydraulic pressures and therefore the energy demand. There is also a tendency for lower fouling, easier fouling-layer removal and higher water recovery. In addition, the efficiency of the PVT unit is maximized through freshwater cooling, which is integrated into the system. A filtration module with a capacity of 5 m³/day is modeled to treat floodwater and provide drinking water. The module can be used as a tool for disaster relief, particularly in the aftermath of flood and tsunami events.
Keywords: flood resilience, membrane desalination, portable water treatment, solar energy
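The osmotic driving force that FO relies on can be estimated with the van 't Hoff relation, π = iMRT; the draw and feed concentrations below are illustrative assumptions, not the paper's values:

```python
def osmotic_pressure_bar(molarity, temp_c, ions_per_formula=2):
    """Van 't Hoff estimate of osmotic pressure, pi = i * M * R * T,
    the gradient FO exploits between draw and feed solutions.
    ions_per_formula=2 assumes a fully dissociating 1:1 salt."""
    R = 0.083145  # gas constant, L*bar/(mol*K)
    return ions_per_formula * molarity * R * (temp_c + 273.15)

draw = osmotic_pressure_bar(1.0, 25)   # e.g. a 1 M NaCl draw solution
feed = osmotic_pressure_bar(0.05, 25)  # dilute contaminated floodwater
print(round(draw - feed, 1))  # net osmotic driving force, bar
```

The resulting tens of bar of osmotic pressure do the work that a high-pressure pump would otherwise provide, which is why the FO stage cuts the energy demand.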
Procedia PDF Downloads 289
1481 Design of an Acoustic Imaging Sensor Array for Mobile Robots
Authors: Dibyendu Roy, V. Ramu Reddy, Parijat Deshpande, Ranjan Dasgupta
Abstract:
Imaging of underwater objects is primarily conducted by acoustic imagery due to the severe attenuation of electromagnetic waves in water. Underwater acoustic imagery has a varied range of significant applications, such as side-scan sonar and mine-hunting sonar. It also finds utility in other domains, such as imaging of body tissues via ultrasonography and non-destructive testing of objects. In this paper, we explore the feasibility of using active acoustic imagery in air and simulate phased-array beamforming techniques available in the literature for various array designs, in order to achieve a suitable acoustic sensor array design for a portable mobile robot which can detect the presence or absence of anomalous objects in a room. Multi-path reflection effects, especially in enclosed rooms, and environmental noise factors are currently not simulated and will be dealt with during the experimental phase. The related hardware is designed with the same feasibility criterion: the developed system needs to be deployable on a portable mobile robot. There is a trade-off between image resolution and range on the one hand and the array size, number of elements and imaging frequency on the other, which has to be simulated iteratively to achieve the desired acoustic sensor array design. The designed acoustic imaging array system is to be mounted on a portable mobile robot and targeted for use in surveillance missions for intruder alerts and for imaging objects during dark and smoky scenarios where conventional optics-based systems do not function well.
Keywords: acoustic sensor array, acoustic imagery, anomaly detection, phased array beamforming
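Delay-and-sum beamforming, the simplest of the phased-array techniques referred to above, can be sketched as follows; the linear geometry, integer-sample delays, and rates are assumed values for illustration:

```python
import math

def delay_and_sum(signals, mic_positions, angle_deg, c=343.0, fs=48000):
    """Minimal delay-and-sum beamformer for a linear array in air:
    delay each element by its plane-wave arrival offset for the
    steering angle, then sum. Integer-sample delays keep the sketch
    short; a real imager would interpolate fractionally."""
    sin_t = math.sin(math.radians(angle_deg))
    out = [0.0] * len(signals[0])
    for sig, x in zip(signals, mic_positions):
        d = int(round(fs * x * sin_t / c))  # per-element delay, samples
        for n in range(len(out)):
            if 0 <= n - d < len(sig):
                out[n] += sig[n - d]
    return out

# two elements 1 cm apart receive the same broadside echo
signals = [[0.0] * 10, [0.0] * 10]
signals[0][5] = signals[1][5] = 1.0
broadside = delay_and_sum(signals, [0.0, 0.01], 0)
steered = delay_and_sum(signals, [0.0, 0.01], 60)
```

Steering at the true arrival direction (broadside here) adds the echoes coherently, doubling the peak; steering elsewhere smears them, which is what lets the array localize an object.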
Procedia PDF Downloads 409
1480 Data Gathering and Analysis for Arabic Historical Documents
Authors: Ali Dulla
Abstract:
This paper introduces a new dataset (and the methodology used to generate it) based on a wide range of historical Arabic documents containing clean data with simple and homogeneous page layouts. The experiments are implemented on printed and handwritten documents obtained from important libraries such as the Qatar Digital Library, the British Library and the Library of Congress. We have gathered and annotated 150 archival document images from different locations and time periods, based on documents from the 17th to 19th centuries. The dataset comprises differing page layouts and degradations that challenge text line segmentation methods. Ground truth is produced using the Aletheia tool by PRImA and stored in an XML representation, in the PAGE (Page Analysis and Ground truth Elements) format. The dataset will be made easily available to researchers worldwide for research into the obstacles facing various historical Arabic documents, such as geometric correction of historical Arabic documents.
Keywords: dataset production, ground truth production, historical documents, arbitrary warping, geometric correction
Procedia PDF Downloads 169
1479 Cepstrum Analysis of Human Walking Signal
Authors: Koichi Kurita
Abstract:
In this study, we propose a real-time data collection technique for the detection of human walking motion from the charge generated on the human body. This technique is based on the detection of a sub-picoampere electrostatic induction current, generated by the motion, flowing through the electrode of a wireless portable sensor attached to the subject. An FFT analysis of the waveforms of the electrostatic induction currents generated by the walking motions showed that the currents generated under normal and restricted walking conditions differ. Moreover, we carried out a cepstrum analysis to detect differences in walking style. The results suggest that a slight difference in motion, whether due to the individual's gait or a splinted leg, is directly reflected in the electrostatic induction current generated by the walking motion. The proposed wireless portable sensor enables the detection of even subtle differences in walking motion.
Keywords: human walking motion, motion measurement, current measurement, electrostatic induction
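The cepstrum used here is, in essence, the inverse transform of the log magnitude spectrum; a dependency-free sketch (a real implementation would use numpy.fft rather than this naive DFT):

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform, O(N^2), for illustration."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def real_cepstrum(x, eps=1e-12):
    """Real cepstrum: inverse DFT of the log magnitude spectrum.
    Periodicities in the spectrum (e.g. a gait's stride rhythm)
    show up as peaks at the corresponding quefrency."""
    log_mag = [math.log(abs(X) + eps) for X in dft(x)]
    N = len(log_mag)
    return [sum(log_mag[k] * cmath.exp(2j * math.pi * k * n / N)
                for k in range(N)).real / N for n in range(N)]

impulse_ceps = real_cepstrum([1.0, 0.0, 0.0, 0.0])  # flat spectrum
```

A flat spectrum (an impulse) yields a near-zero cepstrum, so any structure that appears for a measured walking current reflects genuine periodicity in its spectrum.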
Procedia PDF Downloads 344
1478 Portable Hands-Free Process Assistant for Gas Turbine Maintenance
Authors: Elisabeth Brandenburg, Robert Woll, Rainer Stark
Abstract:
This paper presents how smart glasses and voice commands can be used to improve the maintenance process of industrial gas turbines. It presents the process of inspecting a gas turbine's combustion chamber and how it is currently performed using a set of paper-based documents. In order to improve this process, a portable hands-free process assistance system has been conceived. In the following, it is shown how the approach of user-centered design and the method of paper prototyping were successfully applied to design a user interface and a corresponding workflow model that describes the possible interaction patterns between the user and the interface. The presented evaluation of these results suggests that the assistance system could help the user by rendering multiple manual activities obsolete, thus allowing the user to work hands-free and to save time in generating protocols.
Keywords: paper prototyping, smart glasses, turbine maintenance, user centered design
Procedia PDF Downloads 324
1477 Human-Computer Interaction: Strategies for Ensuring the Design of User-Centered Web Interfaces for Smartphones
Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay
Abstract:
The widespread adoption and increasing proliferation of smartphones that started during the first decade of the twenty-first century have enabled their users to communicate and access information in ways that were merely thought of as possibilities in the few years before the smartphone revolution. A product of the convergence of the cellular phone and the portable computer, the smartphone provides an important function that used to be the exclusive domain of desktop-bound and portable computers: web browsing. For increasing numbers of users, the smartphone and allied devices such as tablet computers have become their first, and often their only, means of accessing the World Wide Web. This has led to the development of websites that cater to the needs of the new breed of smartphone-carrying web users. The smaller size of smartphones compared with conventional computers has posed unique challenges to web interface designers: the smaller screen size and the touchscreen interface have made it much more difficult to read and navigate web pages that were for the most part designed for traditional desktop and portable computers. Although increasing numbers of websites now provide an alternate version formatted for smartphones, problems with ease of use, reliability and usability still remain. This study focuses on the identification of the problems associated with smartphone web interfaces, compliance with accepted standards of user-oriented web interface design, the strategies that could be utilized to ensure the design of user-centric web interfaces for smartphones, and the identification of current trends and developments related to user-centric web interface design intended for the consumption of smartphone users.
Keywords: human-computer interaction, user-centered design, web interface, mobile, smartphone
Procedia PDF Downloads 357
1476 Mobile Microscope for the Detection of Pathogenic Cells Using Image Processing
Authors: P. S. Surya Meghana, K. Lingeshwaran, C. Kannan, V. Raghavendran, C. Priya
Abstract:
One of the most basic and powerful tools in science and medicine is the light microscope, a fundamental device for both laboratory and research work. As technology improves, the demand for portable, economical, and user-friendly instruments grows, and the conventional microscope fails to live up to this emerging trend. Adequate access to healthcare is also not widely available, especially in developing countries, and the most basic step toward curing a disease is diagnosing it. The main aim of this paper is to diagnose malaria with one of the most common devices available, the cell phone, which has become an immediate solution to many modern needs as wireless infrastructure allows users to compute and communicate on the move. This opens the opportunity to develop novel imaging, sensing, and diagnostic platforms that use the mobile phone as an underlying platform, addressing the global demand for accurate, sensitive, cost-effective, and field-portable measurement devices for use in remote and resource-limited settings around the world.
Keywords: cellular, hand-held, health care, image processing, malarial parasites, microscope
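The abstract does not describe its detection algorithm, but the core of any such pipeline is segmenting stained objects in a microscope field of view. A minimal, hedged sketch (not the authors' implementation; the image grid and threshold are invented) of counting dark stained objects with thresholding and connected-component labelling:

```python
def count_dark_objects(image, threshold=80):
    """Count 4-connected components of pixels darker than `threshold`."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] < threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]          # flood-fill this component
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] < threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Two dark blobs (value 30) on a bright background (value 255).
field = [
    [255, 30, 255, 255, 255],
    [255, 30, 255, 255, 255],
    [255, 255, 255, 30, 30],
    [255, 255, 255, 30, 255],
]
print(count_dark_objects(field))  # → 2
```

A real phone-based system would run this kind of pass on camera frames after calibration and color normalization; the blob count (and blob shapes) would then feed the actual parasite classifier.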
Procedia PDF Downloads 267
1475 Hindi Speech Synthesis by Concatenation of Recognized Hand Written Devnagri Script Using Support Vector Machines Classifier
Authors: Saurabh Farkya, Govinda Surampudi
Abstract:
Optical character recognition (OCR) is a major current research area. This paper focuses on the recognition of Devanagari script and the generation of its sound, and it consists of two parts: first, optical character recognition of handwritten Devanagari script; second, speech synthesis from the recognized text. The paper presents an implementation of support vector machines (SVMs) for Devanagari script recognition. The SVM was trained with multi-domain features: transform-domain and spatial- (or structural-) domain features. The transform domain includes the wavelet features of the character; the structural domain consists of a distance-profile feature and a gradient feature. Segmentation of the text document was done at three levels: line segmentation, word segmentation, and character segmentation. Pre-processing of the characters was done with Otsu's thresholding algorithm and various morphological operations: erosion, dilation, filtering, and thinning. The algorithm was tested on a self-prepared database, a collection of various handwriting samples. Unicode was then used to convert the recognized Devanagari text into a computer-readable document. The document so obtained is an array of codes, which was used to generate digitized text and to synthesize Hindi speech. Phonemes from the self-prepared database were used to generate speech for the scanned document using a concatenation technique.
Keywords: optical character recognition (OCR), text to speech (TTS), support vector machines (SVM), library of support vector machines (LIBSVM)
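The pre-processing step the abstract names first, Otsu's algorithm, picks a global binarization threshold by maximizing the between-class variance of the ink and paper pixel populations. A pure-Python sketch (not the authors' code; the sample pixel values are invented):

```python
def otsu_threshold(pixels):
    """Return the intensity threshold (0-255) maximizing between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0                       # running weight and sum of class 0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0                 # mean of the dark (ink) class
        mu1 = (sum_all - sum0) / w1     # mean of the bright (paper) class
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal sample: ink pixels near 20, paper pixels near 200.
sample = [18, 20, 22, 25, 19, 198, 200, 202, 205, 199, 201]
t = otsu_threshold(sample)
assert 25 <= t < 198                    # lands between the two clusters
```

Pixels at or below the returned threshold are treated as ink; the binarized character images then go on to the segmentation and feature-extraction stages the abstract describes.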
Procedia PDF Downloads 500
1474 Diagnosis and Management of Obesity Among South Asians: A Paradigm
Authors: Deepa Vasudevan, Thomas Northrup, Angela Stotts, Michelle Klawans
Abstract:
To date, we have conducted three studies on this subject. The initial study documented that modified criteria independently identify higher numbers of overweight/obese South Asian Indians. The second study documented physicians' knowledge of the appropriate diagnosis of obesity among South Asian Indians. The final study was an intervention evaluating the efficacy of a training module for improving physician diagnosis and counseling of overweight/obese Asian patients.
Keywords: South Asian Indians, obesity, physicians, BMI and waist circumference
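The abstract does not spell out its "modified criteria", so as an illustrative stand-in the sketch below uses the WHO-proposed lower BMI cut-points for Asian populations (overweight ≥ 23, obese ≥ 27.5 kg/m², versus the standard 25/30); the study's actual criteria may differ, and also involve waist circumference:

```python
def bmi(weight_kg, height_m):
    """Body-mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def classify(b, asian_criteria=False):
    """Classify a BMI value. The Asian cut-points (23/27.5) follow the
    WHO-proposed lower thresholds, used here only as an illustration."""
    overweight, obese = (23.0, 27.5) if asian_criteria else (25.0, 30.0)
    if b >= obese:
        return "obese"
    if b >= overweight:
        return "overweight"
    return "normal"

b = bmi(70, 1.70)  # about 24.2 kg/m^2
print(classify(b), classify(b, asian_criteria=True))  # → normal overweight
```

The example shows why modified criteria matter: the same patient is "normal" under the standard cutoffs but "overweight" under the lowered ones, which is exactly the under-diagnosis gap the first study documented.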
Procedia PDF Downloads 408
1473 Developing a Relational Database Management System (RDBMS) Supporting Product Life Cycle Applications
Authors: Yusri Yusof, Chen Wong Keong
Abstract:
This paper presents the implementation details of a relational database management system for a STEP-technology product model repository. It is able to support the implementation of any EXPRESS-language schema, although it has primarily been implemented to support mechanical product life cycle applications. The database supports the input of STEP Part 21 files from CAD, carrying geometrical and topological data, and supports a range of queries for mechanical product life cycle applications. The proposed relational database management system uses the entity-to-table method (R1) rather than the type-to-table method (R4); the two mapping methods have their own strengths and drawbacks.
Keywords: RDBMS, CAD, ISO 10303, part-21 file
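A minimal sketch of the entity-to-table (R1) idea the paper adopts: each EXPRESS entity gets its own relational table, keyed by the STEP Part 21 instance id (#10, #20, ...). The two entities below (CARTESIAN_POINT, VERTEX_POINT) are standard ISO 10303-42 entities, but this two-table schema is only an illustration, not the paper's actual repository design:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One table per entity (R1), rather than one table per underlying type (R4).
cur.execute("CREATE TABLE cartesian_point "
            "(id INTEGER PRIMARY KEY, x REAL, y REAL, z REAL)")
cur.execute("CREATE TABLE vertex_point "
            "(id INTEGER PRIMARY KEY, "
            " point_id INTEGER REFERENCES cartesian_point(id))")

# Rows as they might be parsed from a Part 21 file:
#   #10 = CARTESIAN_POINT('', (0.0, 0.0, 5.0));
#   #20 = VERTEX_POINT('', #10);
cur.execute("INSERT INTO cartesian_point VALUES (10, 0.0, 0.0, 5.0)")
cur.execute("INSERT INTO vertex_point VALUES (20, 10)")

# A life-cycle query: resolve a vertex down to its coordinates.
cur.execute("""SELECT p.x, p.y, p.z FROM vertex_point v
               JOIN cartesian_point p ON p.id = v.point_id
               WHERE v.id = 20""")
print(cur.fetchone())  # → (0.0, 0.0, 5.0)
```

With entity-to-table, entity references in the Part 21 file become foreign keys, so topology-to-geometry traversals are plain joins; the trade-off is a large number of tables for a large EXPRESS schema.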
Procedia PDF Downloads 537
1472 Portable Glove Controlled Video Game for Hand Rehabilitation
Authors: Vinesh Janarthanan, Mohammad H. Rahman
Abstract:
Numerous neurological conditions may result in a loss of motor function, including cerebral palsy, Parkinson's disease, stroke, and multiple sclerosis. With impaired motor function, specifically in the hand and arm, living independently becomes tremendously more difficult. Rehabilitation programs are the main treatment for such disabilities, but they require a long-term commitment from clinicians and therapists, demand one-on-one care, and typically run for a very long duration. Beyond the treatment received from the therapist, continuing to drive neuroplasticity at home is essential to maximizing recovery and restoring biological function. To contribute to this area, we have researched and developed a portable, comfortable hand glove for fine-motor-skills rehabilitation. The glove provides interactive home-based therapy that engages the patient with simple games. The key to this treatment is repetition: moving the hand and positioning it in various ways.
Keywords: home based, wearable sensors, glove, rehabilitation, motor function, video games
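The abstract does not detail how glove readings drive the games, so the following is only a hedged sketch of one plausible mapping (all finger names, bend values, and thresholds are invented): each flex sensor reports a bend value, and crossing a per-finger calibrated threshold fires a game input event, so therapeutic repetitions directly drive play:

```python
def to_game_events(bend_by_finger, thresholds, default=60):
    """Return the game actions triggered by the current hand posture.
    `bend_by_finger` maps finger name -> sensor bend reading;
    `thresholds` holds per-finger calibration (fallback: `default`)."""
    return [f"press_{finger}"
            for finger, bend in bend_by_finger.items()
            if bend >= thresholds.get(finger, default)]

# One sampled posture: index and ring fingers are flexed past threshold.
reading = {"index": 72, "middle": 15, "ring": 64, "pinky": 8}
print(to_game_events(reading, {"index": 60, "ring": 60}))
# → ['press_index', 'press_ring']
```

Per-finger thresholds matter in this setting because each patient's range of motion differs; calibrating them per session lets the same game stay achievable as motor function improves.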
Procedia PDF Downloads 148
1471 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to clean and format all the data. First, a basic spelling and grammar check was applied, along with a Python script for normalized formatting that used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, in which words frequently mistranscribed were recorded and replaced throughout all other documents. Then, to remove banter and side comments, the transcripts were spliced into paragraphs (separated by changes in speaker) and all paragraphs with fewer than 300 characters were removed. Second, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview, and every proper noun was put into a data structure corresponding to its interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the material to review had been reduced from about 60 hours' worth of data to 20. The data was further processed through light manual observation: any summaries that fit the criteria of the proposed deliverable were selected, along with their locations within the document. This narrowed the data down to about 5 hours' worth of processing.
The qualitative researchers were then able to find 8 more connections in addition to the previous 4, exceeding the minimum quota of 3 required to satisfy the grant. The major findings of the study, and the subsequent curation of this methodology, raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity: if the model is too general, the user risks leaving out important data; if it is too specific, it has not seen enough data to be useful. This methodology proposes a solution to that trade-off. The data is never altered beyond grammar and spelling checks; instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. The data is also chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords, allowing generalization over the whole document. In this way, no data is harmed, and qualitative experts can review the raw data instead of highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
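The cleaning steps described above can be sketched in a few lines: split a transcript into speaker turns, drop short turns (banter), and collect candidate proper nouns. The 300-character cutoff is the paper's; the speaker-marker format, the toy transcript, and the capitalized-word heuristic for proper nouns are assumptions (the real pipeline also ran grammar normalization and a B.A.R.T. summarizer, omitted here):

```python
import re

def speaker_turns(transcript, min_chars=300):
    """Split on 'NAME:' markers at line starts; keep substantive turns."""
    turns = re.split(r"\n(?=[A-Z ]+:)", transcript)
    return [t.strip() for t in turns if len(t.strip()) >= min_chars]

def proper_nouns(text):
    """Capitalized words that do not open a sentence (a crude heuristic)."""
    found = set()
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for w in sentence.split()[1:]:        # skip the sentence-initial word
            w = w.strip(",.;:")
            if re.fullmatch(r"[A-Z][a-z]+", w):
                found.add(w)
    return sorted(found)

transcript = (
    "INTERVIEWER: Right, of course.\n"
    "ARTIST: I began collaborating with Bell Labs when computing was still "
    "considered an engineering tool rather than an artistic medium, and that "
    "partnership shaped everything that followed over the next several "
    "decades, including the exhibitions we staged, the archives we assembled, "
    "and the students we trained to carry the work forward into new fields."
)
kept = speaker_turns(transcript)
print(len(kept), proper_nouns(kept[0]))  # → 1 ['Bell', 'Labs']
```

The short interviewer aside is filtered out, and the surviving turn yields the proper nouns that would gate the downstream summarization step, marking where the significant data is without altering it.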
Procedia PDF Downloads 31
1470 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa
Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua
Abstract:
Purpose: The main aim of creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to use in algorithm development, in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retinal images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retinal images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1). Images were stored in a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics (PNG) format from 1,988 participants. Images from patients with human immunodeficiency virus made up 18.9% of the total, 18.2% of images were from hypertensive patients, 12.8% from diabetic patients, and the rest from 'normal' participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. There is therefore a need for the expansion of MoDRIA to provide larger datasets that are more representative of Sub-Saharan data.
Keywords: retina images, MoDRIA, image repository, African database
Procedia PDF Downloads 129
1469 Towards Law Data Labelling Using Topic Modelling
Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran
Abstract:
The Courts of Accounts are institutions responsible for overseeing Public Administration expenses and pointing out irregularities in them. They face a high demand for processes to be analyzed, whose decisions must be grounded in the applicable laws. Within this large volume of processes, several cases report similar subjects, so previous decisions on already analyzed processes can serve as precedents for current processes that refer to similar topics. Identifying similar topics is an open yet essential task for identifying similarities between processes. Since the actual number of topics is considerably large, identifying them with a purely manual approach is tedious and error-prone. This paper presents a tool based on machine learning and natural language processing to assist in building a labeled dataset. The tool relies on topic modeling with Latent Dirichlet Allocation (LDA) to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between document pairs. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. The combination of topic modeling and a distance metric calculated over the documents' representations among the generated topics proved useful in helping to construct a labeled base of similar and non-similar document pairs.
Keywords: courts of accounts, data labelling, document similarity, topic modeling
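The similarity step works on LDA's output: each document becomes a probability distribution over topics, and the Jensen-Shannon distance between two such distributions scores how alike the documents are (0 for identical distributions, at most 1 with log base 2). A self-contained sketch with invented three-topic vectors (the real tool would obtain them from a fitted LDA model):

```python
from math import log2

def kl(p, q):
    """Kullback-Leibler divergence in bits (terms with p_i = 0 contribute 0)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon_distance(p, q):
    """Square root of the Jensen-Shannon divergence of two distributions."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return (0.5 * kl(p, m) + 0.5 * kl(q, m)) ** 0.5

doc_a = [0.70, 0.20, 0.10]   # mostly topic 0
doc_b = [0.60, 0.30, 0.10]   # a similar mix
doc_c = [0.05, 0.05, 0.90]   # mostly topic 2
print(jensen_shannon_distance(doc_a, doc_b)
      < jensen_shannon_distance(doc_a, doc_c))  # → True
```

Thresholding this distance over all document pairs yields exactly the labeled base of similar and non-similar pairs the abstract describes.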
Procedia PDF Downloads 180
1468 The Rigor and Relevance of the Mathematics Component of the Teacher Education Programmes in Jamaica: An Evaluative Approach
Authors: Avalloy McCarthy-Curvin
Abstract:
For over fifty years, there has been widespread dissatisfaction with the teaching of mathematics in Jamaica. Studies done in the Jamaican context highlight that teachers at the end of training do not have a deep understanding of the mathematics content they teach, yet little research in the Jamaican context has targeted advancing contextual knowledge of the problem in order to provide a solution. The aim of this study is to identify what influences this outcome of teacher education in Jamaica so as to remedy the problem. The study formatively evaluated the curriculum documents, assessments, and curriculum delivery used in teacher training institutions in Jamaica to determine their rigor (the extent to which the written documents, instruction, and assessments focus on enabling pre-service teachers to develop a deep understanding of mathematics) and relevance (the extent to which they focus on developing the requisite knowledge for teaching mathematics). The findings show that neither the curriculum documents, the instruction, nor the assessments ensure rigor or enable pre-service teachers to develop the knowledge and skills they need to teach mathematics effectively.
Keywords: relevance, rigor, deep understanding, formative evaluation
Procedia PDF Downloads 240
1467 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
National Statistical Institutes hold large volumes of data, generally in formats that constrain how the information they contain can be published. Each household or business data collection project includes its own dissemination platform, and these previously used dissemination methods neither promote rapid access to information nor offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data so as to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistic
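Publishing for the Semantic Web means expressing each statistical observation as RDF triples rather than rows in a project-specific format. A minimal sketch (the URIs, the `employmentRate` indicator, and the figure are all invented for illustration) serializing one observation as N-Triples; a real pipeline would follow a vocabulary such as the W3C RDF Data Cube:

```python
def to_ntriples(region, year, indicator, value):
    """Serialize one statistical observation as three N-Triples statements."""
    s = f"<http://example.org/obs/{region}-{year}>"
    return "\n".join([
        f'{s} <http://example.org/def/region> "{region}" .',
        f'{s} <http://example.org/def/year> "{year}" .',
        f'{s} <http://example.org/def/{indicator}> "{value}" .',
    ])

print(to_ntriples("Dakar", 2013, "employmentRate", 41.2))
```

Because every subject and predicate is a URI, observations from the employment survey, the poverty survey, and the census can all live on one platform and be linked to external sources that reuse or map to the same URIs, which is exactly the option the current dissemination platforms lack.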
Procedia PDF Downloads 176