Search results for: procedure entry array
2772 A Study of Influence of Freezing on Mechanical Properties of Tendon Fascicles
Authors: Martyna Ekiert, Andrzej Mlyniec
Abstract:
Tendons are biological structures whose primary function is to transfer the force generated by muscles to the bones. Unfortunately, tendon damage is also one of the most common injuries of the human musculoskeletal system. For the most severe cases of tendon rupture, such as a tear of the calcaneus tendon or of the anterior cruciate ligament of the knee, a surgical procedure is the only possible route to full recovery. Tendons used as biological grafts are usually subjected to a process of deep freezing and subsequent thawing. This, in particular for multiple freezing/thawing cycles, may change the internal structure of the tendon and thereby deteriorate the mechanical properties of the tissue. Therefore, studies on the influence of freezing on tendon biomechanics, including the internal water content of the soft tissue, seem to be greatly needed. An experimental study of the influence of freezing on the mechanical properties of the tendon was performed on fascicle samples dissected from bovine flexor tendons. The preparation procedure was performed in the presence of 0.9% saline solution in order to prevent excessive tissue drying. All prepared samples were subjected to different numbers of freezing/thawing cycles. For the freezing part of the protocol we used a temperature of -80°C, while for slow thawing we used refrigerator temperature (4°C) followed by equilibration at standard conditions (25°C). After the final thawing, the mechanical properties of each sample were examined using a cyclic loading test. Our results may contribute to a better understanding of the negative effects of freezing soft tissues, which result from the abnormal thermal expansion of water. This may also help to determine the limit on freezing/thawing cycles beyond which a tissue is disqualified for surgical purposes, and thus help optimize tissue storage conditions.
Keywords: freezing, soft tissue, tendon, bovine fascicles
Procedia PDF Downloads 218
2771 Antecedents of Spinouts: Technology Relatedness, Intellectual Property Rights, and Venture Capital
Authors: Sepideh Yeganegi, Andre Laplume, Parshotam Dass, Cam-Loi Huynh
Abstract:
This paper empirically examines organizational and institutional antecedents of entrepreneurial entry. We employ multi-level logistic regression modelling methods on a sub-sample of the Global Entrepreneurship Monitor’s 2011 survey covering 30 countries. The results reveal that employees who have experience with activities unrelated to the core technology of their organizations are more likely to spin out entrepreneurial ventures, whereas those with experiences related to the core technology are less likely to do so. In support of recent theory, we find that the strength of intellectual property rights and the availability of venture capital have negative and positive effects, respectively, on the likelihood that employees turn into entrepreneurs. These institutional factors also moderate the effect of relatedness to the core technology, such that entrepreneurial entries by employees with experiences related to the core technology are curbed more severely by stronger intellectual property rights protection regimes and by a lack of venture capital.
Keywords: spinouts, intellectual property rights, venture capital, entrepreneurship, organizational experiences, core technology
Procedia PDF Downloads 356
2770 Polypropylene Fibres Dyeable with Acid Dyes
Authors: H. M. Wang, C. J. Chang
Abstract:
As the threat of global climate change grows more serious, "net zero emissions by 2050" has become a common global goal. In order to reduce the consumption of petrochemical raw materials and reduce carbon emissions, low-carbon fiber materials have become key materials in the future global textile supply chain. This project modifies polyolefin raw materials through synthesis and amination to develop low-temperature dyeable polypropylene fibers, endowing them with low-temperature dyeability and high color fastness when combined with acid dyes and improving their otherwise low coloring strength. The color fastness to washing reaches the commercial requirement of level 3.5 or above. Polypropylene fiber can therefore enter the clothing textile supply chain, replace existing fiber raw materials, resolve the domestic chemical fiber, textile, and clothing industry's lack of low-carbon alternative material sources, and provide the textile industry with a solution for achieving the goal of net zero emissions by 2050.
Keywords: acid dyes, dyeing, low-temperature, polypropylene fiber
Procedia PDF Downloads 86
2769 Effect of Air Gap Distance on the Structure of PVDF Hollow Fiber Membrane Contactors for Physical CO2 Absorption
Authors: J. Shiri, A. Mansourizadeh, F. Faghih, H. Vaez
Abstract:
In this study, porous polyvinylidene fluoride (PVDF) hollow fiber membranes are fabricated via a wet phase-inversion process and used in a gas–liquid membrane contactor for physical CO2 absorption. The effect of different air gaps on the structure and CO2 flux of the membrane was investigated. The hollow fibers were prepared by wet spinning from a dope solution containing PVDF/NMP/LiCl (18%, 78%, 4%) at an extrusion rate of 4.5 ml/min and air gaps of 0, 7, and 15 cm. Water was used as the internal and external coagulant. Membranes were characterized using various techniques such as field emission scanning electron microscopy (FESEM), gas permeation tests, and critical water entry pressure (CEPw) to select the best membrane structure for CO2 absorption. The characterization results showed that the membrane prepared with an air gap possesses a small pore size with high surface porosity and wetting resistance, which are favorable for gas absorption applications. As the air gap increased, the CEPw and the N2 permeation decreased, while the surface porosity and the CO2 absorption increased.
Keywords: porous PVDF hollow fiber membrane, CO2 absorption, phase inversion, air gap
Procedia PDF Downloads 389
2768 In Situ Volume Imaging of Cleared Mice Seminiferous Tubules Opens New Window to Study Spermatogenic Process in 3D
Authors: Lukas Ded
Abstract:
Studying tissue structure and histogenesis in the natural, 3D context is a challenging but highly beneficial process. In contrast to the classical approach of physical tissue sectioning and subsequent imaging, it enables the study of the relationships of individual cellular and histological structures in their native context. Recent developments in tissue clearing approaches and microscopic volume imaging/data processing enable the application of these methods also in the areas of developmental and reproductive biology. Here, using the CLARITY tissue clearing procedure and 3D confocal volume imaging, we optimized the protocol for clearing, staining, and imaging of mouse seminiferous tubules isolated from the testes without a cardiac perfusion procedure. Our approach enables high-magnification and fine-resolution axial imaging of the whole diameter of the seminiferous tubules with virtually unlimited lateral imaging length. Hence, large continuous pieces of the seminiferous tubule can be scanned and digitally reconstructed for the study of single-tubule seminiferous stages using nuclear dyes. Furthermore, antibodies and various molecular dyes can be applied for the molecular labeling of individual cellular and subcellular structures, and the resulting 3D images can greatly increase our understanding of the spatiotemporal aspects of seminiferous tubule development and sperm ultrastructure formation. Finally, our newly developed algorithms for 3D data processing enable massive parallel processing of a large number of individual cell and tissue fluorescent signatures and the building of robust spermatogenic models under physiological and pathological conditions.
Keywords: CLARITY, spermatogenesis, testis, tissue clearing, volume imaging
Procedia PDF Downloads 135
2767 Characterisation of Pasteurella multocida from Asymptomatic Animals
Authors: Rajeev Manhas, M. A. Bhat, A. K. Taku, Dalip Singh, Deep Shikha, Gulzar Bader
Abstract:
The study aimed to understand the distribution of various serogroups of Pasteurella multocida in bovines, small ruminants, pigs, rabbits, and poultry from Jammu, Jammu and Kashmir, and to characterize the isolates with respect to LPS-synthesizing genes, the dermonecrotic toxin gene (toxA), and antibiotic resistance. For isolation, the nasopharyngeal swab procedure appeared to be better than the nasal swab procedure, particularly in ovines and swine. Out of 200 samples from different animals, isolation of P. multocida could be achieved only from pig and sheep (5 each) and from poultry and buffalo (2 each) samples, which accounted for 14 isolates. Upon molecular serogrouping, 3 isolates from sheep and 2 isolates from poultry were found to be serogroup A, 2 isolates from buffalo were confirmed as serogroup B, and 5 isolates from pig were found to belong to serogroup D. However, 2 isolates from sheep could not be typed and were hence untypable. All 14 isolates were subjected to mPCR genotyping. A total of 10 isolates, 5 each from pig and sheep, generated an amplicon specific to genotype L6, which indicates Heddleston serovars 10, 11, 12 and 15. Similarly, 2 isolates from bovines generated an amplicon of genotype L2, which indicates Heddleston serovar 2/5. However, 2 isolates from poultry generated a specific amplicon with L1, signifying Heddleston serovar 1, but these isolates also produced multiple bands with primer L5. Only one isolate of capsular type A from sheep possessed the structural gene toxA for dermonecrotoxin. There was variability in the antimicrobial susceptibility pattern of the sheep isolates, but overall the rate of tetracycline resistance was relatively high (64.28%) in our strains, while all the isolates were sensitive to streptomycin. Except for the swine isolates and one toxigenic sheep isolate, the P. multocida isolates from this study were sensitive to quinolones. Although the level of resistance to commercial antibiotics was generally low, the use of tetracycline and erythromycin was not recommended.
Keywords: antibiogram, genotyping, Pasteurella multocida, serogrouping, toxA
Procedia PDF Downloads 452
2766 Research on the Landscape of Xi'an Ancient City Based on the Poetry Text of Tang Dynasty
Authors: Zou Yihui
Abstract:
The integration of the traditional landscape of the ancient city with the poets' emotions and symbolism in ancient poetry is a unique cultural gene and spiritual core of the historical city, and re-reading the historical landscape pattern through poetry helps to continue the historical city context and to counter the gradual decline of the poetic character of the modern historical urban landscape. Starting from Tang poetry and using semantic analysis methods combined with text mining technology, entry mining, word frequency analysis, and cluster analysis of the landscape information of Tang Chang'an City were carried out, and a methodological framework for analyzing urban landscape form based on poetry texts was constructed. Nearly 160 poems describing the landscape of Tang Chang'an City were screened, and the poetic landscape characteristics of Tang Chang'an City were sorted out locally in order to combine them with modern urban spatial development and continue the urban spatial context.
Keywords: Tang Chang'an City, poetic texts, semantic analysis, historical landscape
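The word-frequency and cluster-analysis steps of such a framework can be sketched as follows; this is a minimal Python sketch in which the landscape entries are toy English placeholders rather than the study's Chinese corpus, and the vectorization and clustering choices are assumptions, not the authors' method.

```python
# Illustrative sketch of entry mining -> word frequency -> cluster analysis
from collections import Counter
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical landscape entries extracted from Tang poems about Chang'an
entries = [
    "qujiang pool willow spring water",
    "zhongnan mountain snow distant peaks",
    "palace tower moonlight autumn",
    "qujiang pool lotus summer banquet",
    "zhongnan mountain pine cloud hermit",
    "city wall gate tower sunset drums",
]

# Word-frequency analysis of the landscape vocabulary
freq = Counter(word for entry in entries for word in entry.split())
print(freq.most_common(5))

# Cluster analysis: group entries into landscape themes (k chosen arbitrarily)
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(entries)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for entry, label in zip(entries, labels):
    print(label, entry)
```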
Procedia PDF Downloads 61
2765 Grid Tied Photovoltaic Power on School Roof
Authors: Yeong-cheng Wang, Jin-Yinn Wang, Ming-Shan Lin, Jian-Li Dong
Abstract:
To universalize the adoption of sustainable energy, the R.O.C. government encourages public buildings to introduce PV power stations on the building roof, whereas most old buildings did not include considerations for photovoltaic (PV) power facilities in the design phase. Several factors affect the PV electricity output; temperature is the key one, and different PV technologies have different temperature coefficients. Other factors, like PV panel azimuth, panel inclination from the horizontal plane, and row-to-row distance of PV arrays, interact at the beginning of system design. The goal of this work is to maximize the annual energy output of a roof-mounted PV system. Tables to simplify the design work are developed; the results can be used directly for engineering project quotations.
Keywords: optimal inclination, array azimuth, annual output
Procedia PDF Downloads 675
2764 Understanding of the Impact of Technology in Collaborative Programming for Children
Authors: Nadia Selene Molina-Moreno, Maria Susana Avila-Garcia, Marco Bianchetti, Marcelina Pantoja-Flores
Abstract:
The visual programming tools available are a great way of introducing children to programming and developing a skill set for algorithmic thinking. On the other hand, collaborative learning and pair programming within the context of programming activities have demonstrated social and learning benefits. However, some of the online tools available for programming for children are not designed to allow simultaneous and equitable participation of the team members, since they allow only a single control point. In this paper, a report on the work conducted with children playing a user role is presented. A preliminary study to cull ideas, insights, and design considerations for a formal programming course for children aged 8-10, using collaborative learning as a pedagogical approach, was conducted. Three setups were provided: 1) a lo-fi prototype, 2) a PC, 3) a 46'' multi-touch single-display groupware limited by the application to a single touch entry. Children were interviewed at the end of the sessions in order to learn their opinions about teamwork and the different setups. Results are mixed regarding the setup, but the children agree that they like teamwork.
Keywords: children, collaborative programming, visual programming, multi-touch tabletop, lo-fi prototype
Procedia PDF Downloads 306
2763 On the Bias and Predictability of Asylum Cases
Authors: Panagiota Katsikouli, William Hamilton Byrne, Thomas Gammeltoft-Hansen, Tijs Slaats
Abstract:
An individual who demonstrates a well-founded fear of persecution or faces a real risk of being subjected to torture is eligible for asylum. In Danish law, the exact legal thresholds reflect those established by international conventions, notably the 1951 Refugee Convention and the 1950 European Convention on Human Rights. These international treaties, however, remain largely silent when it comes to how states should assess asylum claims. As a result, national authorities are typically left to determine an individual’s legal eligibility on a narrow basis consisting of an oral testimony, which may itself be hampered by several factors, including imprecise language interpretation, insecurity, or lacking trust towards the authorities among applicants. The shaky ground on which authorities must base their subjective perceptions of asylum applicants' credibility raises the question of whether, in all cases, adjudicators make the correct decision. Moreover, the subjective element in these assessments raises questions about whether individual asylum cases could be afflicted by implicit biases or stereotyping amongst adjudicators. In fact, recent studies have uncovered significant correlations between decision outcomes and the experience and gender of the assigned judge, as well as correlations between asylum outcomes and entirely external events such as weather and political elections. In this study, we analyze a publicly available dataset containing approximately 8,000 summaries of asylum cases that were initially rejected and re-tried by the Refugee Appeals Board (RAB) in Denmark. First, we look for variations in the recognition rates with regard to a number of applicants’ features: their country of origin/nationality, their identified gender, their identified religion, their ethnicity, whether torture was mentioned in their case and, if so, whether it was supported or not, and the year the applicant entered Denmark. In order to extract those features from the text summaries, as well as the final decision of the RAB, we applied natural language processing and regular expressions, adjusting for the Danish language. We observed interesting variations in recognition rates related to the applicants’ country of origin, ethnicity, year of entry, and the support or not of torture claims, whenever those were made in the case. The appearance (or not) of significant variations in the recognition rates does not necessarily imply (or rule out) bias in the decision-making process. None of the considered features, with the possible exception of the torture claims, should be decisive factors for an asylum seeker’s fate. We therefore investigate whether the decision can be predicted on the basis of these features and, consequently, whether biases are likely to exist in the decision-making process. We employed a number of machine learning classifiers and found that when using the applicant’s country of origin, religion, ethnicity, and year of entry with a random forest classifier or a decision tree, the prediction accuracy is as high as 82% and 85%, respectively, indicating that these features have potentially predictive properties with regard to the outcome of an asylum case. Our analysis and findings call for further investigation of the predictability of the outcome on a larger dataset of 17,000 cases, which is ongoing.
Keywords: asylum adjudications, automated decision-making, machine learning, text mining
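A minimal sketch of the kind of classifier experiment described above is given below. It is not the authors' code; the feature columns mirror the features named in the abstract, while the toy rows and model settings are assumptions for illustration.

```python
# Sketch: predicting case outcomes from categorical applicant features
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical extract of the case summaries after NLP/regex feature extraction
cases = pd.DataFrame({
    "country_of_origin": ["A", "B", "A", "C", "B", "C", "A", "B"],
    "religion":          ["r1", "r2", "r1", "r3", "r2", "r1", "r3", "r2"],
    "ethnicity":         ["e1", "e1", "e2", "e2", "e3", "e1", "e2", "e3"],
    "year_of_entry":     [2012, 2015, 2013, 2016, 2014, 2015, 2012, 2016],
    "recognised":        [1, 0, 1, 0, 0, 1, 1, 0],   # RAB decision (label)
})

X, y = cases.drop(columns="recognised"), cases["recognised"]

pre = make_column_transformer(
    (OneHotEncoder(handle_unknown="ignore"),
     ["country_of_origin", "religion", "ethnicity"]),
    remainder="passthrough",       # keep year_of_entry as a numeric feature
)
model = make_pipeline(pre, RandomForestClassifier(n_estimators=200, random_state=0))

# Cross-validated accuracy; the study reports ~82% on the real 8,000-case dataset
print(cross_val_score(model, X, y, cv=2).mean())
```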
Procedia PDF Downloads 92
2762 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python
Authors: Konstantinos Pikos, Dimitrios Kaimaris
Abstract:
Although designing building blocks is a procedure used by many planners around the world, there isn’t an automated tool that will help planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines, while it is mandatory not only to maintain the desired offset but also to minimize the impact on the existing building stock. In this paper, an automated tool integrated into ArcGIS Pro, built using Geographical Information Systems (GIS) and the Python programming language, is presented. Despite its simplistic environment and the lack of specialized building legislation due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approach to the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI’s ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of a modification of building edge coordinates, using the NumPy library, in an effort to draw the line of best fit, so the user can get optimal results for each side of the block. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Regardless of the early stage of the tool’s development, it is a solid base in which potential planners with programming skills could invest, so they can adapt the tool to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
Keywords: ArcPy, GIS, Python, building blocks
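The "line of best fit" step can be illustrated with a short NumPy sketch. This is an assumed implementation, not the tool's actual ArcPy code: it fits each block side by orthogonal least squares (so facades of any orientation, including vertical ones, are handled) and then snaps the edge coordinates onto the fitted line.

```python
# Sketch: fit a best-fit line through building edge coordinates and project onto it
import numpy as np

def best_fit_line(points):
    """points: (n, 2) array of x/y building-edge coordinates for one block side.
    Returns a point on the line (the centroid) and a unit direction vector."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # First right-singular vector of the centred coordinates = dominant direction
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def project_onto_line(points, centroid, direction):
    """Snap edge coordinates onto the fitted line (the 'modification' step)."""
    pts = np.asarray(points, dtype=float)
    t = (pts - centroid) @ direction
    return centroid + np.outer(t, direction)

edges = [(0.0, 0.1), (5.0, -0.2), (10.0, 0.15), (15.0, -0.05)]  # toy facade points
c, d = best_fit_line(edges)
print(project_onto_line(edges, c, d))
```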
Procedia PDF Downloads 177
2761 STML: Service Type-Checking Markup Language for Services of Web Components
Authors: Saqib Rasool, Adnan N. Mian
Abstract:
Web components are introduced as the latest HTML5 standard for writing modular web interfaces, ensuring maintainability through the isolated scope of web components. Reusability can also be achieved by sharing plug-and-play web components that can be used as off-the-shelf components by other developers. A web component encapsulates all the required HTML, CSS, and JavaScript code as a standalone package which must be imported for integrating the web component within an existing web interface. This is then followed by the integration of the web component with web services for dynamically populating its content. Since web components are reusable as off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service behavior can be verified through type-checking. This is one of the popular solutions for improving the quality of code in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is to introduce a new extension of HTML called Service Type-checking Markup Language (STML) for adding support for type checking in HTML for JSON-based REST services. STML can be used for defining the expected data types of responses from JSON-based REST services, which will be used for populating the content within the HTML elements of a web component. Although JSON has five data types, viz. string, number, boolean, object, and array, STML is made to support only string, number, and boolean. This is because both object and array are treated as strings when populated in HTML elements. In order to define the data type of any HTML element, the developer just needs to add the custom STML attributes st-string, st-number, or st-boolean for string, number, and boolean, respectively. All these STML annotations are used by the developer who is writing a web component, and they enable other developers to use automated type-checking for ensuring the proper integration of their REST services with the same web component. Two utilities have been written for developers who are using STML-based web components. One of these utilities is used for automated type-checking during the development phase. It uses the browser console to show an error description if an integrated web service does not return a response with the expected data type. The other utility is a Gulp-based command-line utility for removing the STML attributes before going into production. This ensures the delivery of STML-free web pages in the production environment. Both of these utilities have been tested to perform type checking of REST services through STML-based web components, and the results have confirmed the feasibility of evaluating service behavior through HTML only. Currently, STML is designed for automated type-checking of integrated REST services, but it can be extended to introduce a complete service testing suite based on HTML only, which would transform STML from a Service Type-checking Markup Language to a Service Testing Markup Language.
Keywords: REST, STML, type checking, web component
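For illustration only, the type-checking idea behind the st-* attributes can be sketched in Python; the actual STML utilities run in the browser console and in Gulp, and the element ids and sample response below are assumptions. Only the st-string / st-number / st-boolean names come from the text.

```python
# Sketch: check a JSON REST response against STML-style type declarations
import json

# Declared expectations, analogous to st-* attributes on a web component's elements
declared_types = {
    "user-name":  "st-string",
    "user-age":   "st-number",
    "is-premium": "st-boolean",
}

python_types = {"st-string": str, "st-number": (int, float), "st-boolean": bool}

def check_response(raw_json, declarations):
    """Return a list of type errors between a JSON response and the declarations."""
    data, errors = json.loads(raw_json), []
    for element_id, st_type in declarations.items():
        value = data.get(element_id)
        ok = isinstance(value, python_types[st_type])
        if st_type != "st-boolean" and isinstance(value, bool):
            ok = False   # bool is an int subclass in Python; reject it for st-number
        if not ok:
            errors.append(f"{element_id}: expected {st_type}, got {type(value).__name__}")
    return errors

sample = '{"user-name": "Ada", "user-age": "37", "is-premium": true}'
print(check_response(sample, declared_types))   # flags user-age as a string
```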
Procedia PDF Downloads 251
2760 Evaluating Alternative Structures for Prefix Trees
Authors: Feras Hanandeh, Izzat Alsmadi, Muhammad M. Kwafha
Abstract:
Prefix trees, or tries, are data structures that are used to store data or an index of data. The goal is to be able to store and retrieve data by executing queries in a quick and reliable manner. In principle, the structure of the trie depends on having letters in nodes at the different levels to point to the actual words in the leaves. However, the exact structure of the trie may vary based on several aspects. In this paper, we evaluated different structures for building tries. Using datasets of words of different sizes, we evaluated the different forms of trie structures. Results showed that some characteristics may significantly impact, positively or negatively, the size and the performance of the trie. We investigated different forms and structures for the trie. Results showed that using an array of pointers in each level to represent the different alphabet letters is the best choice.
Keywords: data structures, indexing, tree structure, trie, information retrieval
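A minimal Python sketch of the structure the results favour, a node holding an array of child pointers indexed by letter, is shown below; the 26-letter lowercase alphabet is an assumption for illustration, not a detail from the paper.

```python
# Sketch: trie node with an array of child pointers, one slot per letter a-z
class TrieNode:
    __slots__ = ("children", "is_word")
    def __init__(self):
        self.children = [None] * 26   # array of pointers, indexed by letter
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            i = ord(ch) - ord("a")
            if node.children[i] is None:
                node.children[i] = TrieNode()
            node = node.children[i]
        node.is_word = True

    def contains(self, word):
        node = self.root
        for ch in word:
            node = node.children[ord(ch) - ord("a")]
            if node is None:
                return False
        return node.is_word

t = Trie()
for w in ("tree", "trie", "trip"):
    t.insert(w)
print(t.contains("trie"), t.contains("tri"))   # True False
```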
Procedia PDF Downloads 450
2759 Understanding ICT Behaviors among Health Workers in Sub-Saharan Africa: A Cross-Sectional Study for Laboratory Persons in Uganda
Authors: M. Kasusse, M. Rosette, E. Burke, C. Mwangi, R. Batamwita, N. Tumwesigye, S. Aisu
Abstract:
A cross-sectional survey to ascertain the capacity of laboratory persons in using ICTs was conducted in 15 Ugandan districts (July-August 2013). A self-administered questionnaire served as the data collection tool, alongside an interview guide and an observation checklist. 69 questionnaires were filled in, 12 interviews were conducted, and 45 HCs were observed. SPSS Statistics 17.0 and SAS 9.2 software were used for data entry and analyses. 69.35% of participants find it difficult to access a computer at work. Of the 30.65% who find it easy to access a computer at work, a significant 21.05% spend 0 hours on a computer daily. 60% of the participants cannot access the internet at work. Of the 40% who have internet at work, a significant 20% lack an email address, but 20% read emails weekly and 48% daily. It is feasible to pilot informatics projects as strategies to build bridges and develop skills for the e-health landscape in laboratory services, given a bigger financial muscle.
Keywords: ICT behavior, clinical laboratory persons, Sub-Saharan Africa, Uganda
Procedia PDF Downloads 229
2758 Lessons Learned from Ransomware-as-a-Service (RaaS) Organized Campaigns
Authors: Vitali Kremez
Abstract:
The researcher monitored an organized ransomware campaign in order to gain significant visibility into the tactics, techniques, and procedures employed by a campaign boss operating a ransomware scheme out of Russia. As the Russian hacking community lowered the access requirements for unsophisticated Russian cybercriminals to engage in ransomware campaigns, corporations and individuals face a commensurately greater challenge of effectively protecting their data and operations from being held ransom. This report discusses two notorious ransomware campaigns. Though the loss of data can be devastating, the findings demonstrate that sending ransom payments does not always help obtain data. Key learnings: 1. From the ransomware affiliate perspective, such campaigns have significantly lowered the barriers to entry for low-tier cybercriminals. 2. Ransomware revenue amounts are not as glamorous and fruitful as they are often publicly reported; ransomware crime bosses make only about $90K per year on average. 3. Data gathered indicates that sending ransom payments does not always help obtain data. 4. The talk details the complete payout structure and Bitcoin laundering operation related to the ransomware-as-a-service campaign.
Keywords: bitcoin, cybercrime, ransomware, Russia
Procedia PDF Downloads 195
2757 Optimisation of Photovoltaic Array with DC-DC Converter Groups
Authors: Fatma Soltani
Abstract:
In power electronics, DC-DC converters, or choppers, are now employed in many areas, particularly in the field of electricity generation by wind and solar energy conversion. Photovoltaic generators (GPV) deliver maximum power at a point on the characteristic P = f(Vpv) called the maximum power point (MPP); climatic variations, however, cause this operating point to fluctuate. To remedy this problem, a DC-DC converter is interposed between the generator and the receiver. The converter usually used is a simple MOSFET chopper. However, the MOSFET is suited to low-power applications that require a high switching frequency, but it becomes highly dissipative when it must block large voltages. For medium- and high-power PV generators, the use of an IGBT chopper is by far the most recommended. To reduce the stress on the semiconductor components, several choppers connected in parallel, known as an interleaved chopper, are used. These choppers lead to rotas.
Keywords: interleaved DC-DC converter, photovoltaic generators, IGBT, optimisation
Procedia PDF Downloads 537
2756 Model and Neural Control of the Depth of Anesthesia during Surgery
Authors: Javier Fernandez, Mayte Medina, Rafael Fernandez de Canete, Nuria Alcain, Juan Carlos Ramos-Diaz
Abstract:
At present, the experimentation of anesthetic drugs on patients requires a regulation protocol, and the response of each patient to several doses of the input drug must be well known. Therefore, the development of pharmacological dose control systems is a promising field of research in anesthesiology. In this paper, a non-linear compartmental pharmacokinetic-pharmacodynamical model has been developed which describes the depth-of-anesthesia effect in a sufficiently reliable way over a set of patients, with the depth effect quantified by the Bi-Spectral Index. Afterwards, an Artificial Neural Network (ANN) predictive controller has been designed based on the depth-of-anesthesia model so as to keep the patient in the optimum condition while undergoing surgical treatment. For the purpose of quantifying the efficiency of the neural predictive controller, a classical proportional-integral-derivative controller has also been developed to compare both strategies. Results show the superior performance of the neural predictive controller during Bi-Spectral Index reference tracking.
Keywords: anesthesia, bi-spectral index, neural network control, pharmacokinetic-pharmacodynamical model
Procedia PDF Downloads 334
2755 Hong Kong Artists Public Communication of Mental Health Disorders and Coping Techniques - Analysis
Authors: Patricia Portugal Marques de Carvalho Lourenco
Abstract:
Money, status, beauty, popularity, widespread public adulation, glitz, and glamour portray a perfumed, stress-free existence, yet not every rock that glitters is a gold nugget, and mental disorders are not an exclusivity of the middle and lower societal classes. Mental illnesses do not discriminate, and behind the superficial visual wealth of the upper class, there are human beings who experience the ups and downs of life like any other person, except that they do so publicly rather than privately and with an array of fingers pointing at them instead of a mere few. Sammi Cheung, Carina Lau, Fiona Sit, Kara Hui, and Louis Cheung are a number of Hong Kong artists who have battled mental disorders, overcome them, and used the process to openly discuss the still-existing taboo.
Keywords: mental disorders, mental health, public communication, depression, Hong Kong artists
Procedia PDF Downloads 216
2754 Conceptual Design of Unmanned Aerial Targets
Authors: M. Adamski, J. Cwiklak
Abstract:
The contemporary battlefield creates a demand for more costly and highly advanced munitions. Training personnel responsible for operations, as well as the immediate execution of combat tasks, with real assets is unrealistic and economically unfeasible. Owing to a wide array of available simulators and various types of imitators, it is possible to reduce the costs. One of the effective elements of training, which can be applied in the training of all service branches, is imitators of aerial targets. This research serves as an introduction to the commencement of design analysis of a real aerial target imitator. Within the project, basic aerodynamic calculations were made, which enabled the determination of its geometry, design layout, and performance, as well as the mass balance of individual components. The calculated flight-characteristic parameters come close to the real performance of such unmanned aerial vehicles.
Keywords: aerial target, aerodynamics, imitator, performance
Procedia PDF Downloads 395
2753 A Real-time Classification of Lying Bodies for Care Application of Elderly Patients
Authors: E. Vazquez-Santacruz, M. Gamboa-Zuniga
Abstract:
In this paper, we present a methodology for the real-time classification of bodies in a lying state using HOG descriptors and pressure sensors positioned in matrix form (14 x 32 sensors) on the surface where bodies lie down. Our system is embedded in a care robot that can assist the elderly patient and the medical staff around them, helping to achieve a better quality of life in and out of hospitals. Due to current technology, a limited number of sensors is used, which results in a low-resolution data array that is used as an image of 14 x 32 pixels. Our work considers the problem of human posture classification with little information (few sensors), applying digital processing to expand the original sensor data and so obtain more significant data for the classification; however, this is done with low-cost algorithms to ensure real-time execution.
Keywords: real-time classification, sensors, robots, health care, elderly patients, artificial intelligence
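A rough sketch of this pipeline is shown below using synthetic pressure frames: the low-resolution 14 x 32 map is expanded, a HOG descriptor is computed, and a classifier is trained. The upsampling factor, HOG parameters, and the SVM classifier are assumptions, not the paper's exact choices.

```python
# Sketch: expand a 14x32 pressure map, extract HOG features, classify posture
import numpy as np
from scipy.ndimage import zoom
from skimage.feature import hog
from sklearn.svm import SVC

def pressure_frame_to_features(frame_14x32):
    img = zoom(frame_14x32, 4, order=1)          # expand to a 56 x 128 "image"
    # HOG descriptor over the expanded pressure image
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Toy training data: random frames with made-up posture labels
rng = np.random.default_rng(0)
X = np.array([pressure_frame_to_features(rng.random((14, 32))) for _ in range(40)])
y = rng.integers(0, 3, size=40)                  # e.g. supine / left / right

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict(X[:5]))
```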
Procedia PDF Downloads 864
2752 An Experimental Analysis of Squeeze Casting Parameters for 2017 a Wrought Al Alloy
Authors: Mohamed Ben Amar, Najib Souissi, Chedly Bradai
Abstract:
A Taguchi design investigation has been made into the relationship between ductility and process variables in a squeeze-cast 2017A wrought aluminium alloy. The process parameters considered were squeeze pressure, melt temperature, and die preheating temperature. An orthogonal array (OA), main effect analysis, the signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) are employed to analyze the effect of the casting parameters. The results have shown that the selected parameters significantly affect the ductility of 2017A wrought Al alloy castings. Optimal squeeze casting process parameters were provided to illustrate the proposed approach, and the results were proven to be trustworthy through practical experiments.
Keywords: Taguchi method, squeeze casting, process parameters, ductility, microstructure
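A rough sketch of how an L9 Taguchi analysis computes larger-the-better S/N ratios and main effects for the three casting parameters is given below; the ductility values are invented for illustration and are not the paper's data.

```python
# Sketch: L9(3^3) design, larger-the-better S/N ratio, and factor main effects
import numpy as np

# Standard L9 orthogonal array: factor levels (0, 1, 2) for each of 9 runs
L9 = np.array([[0,0,0],[0,1,1],[0,2,2],
               [1,0,1],[1,1,2],[1,2,0],
               [2,0,2],[2,1,0],[2,2,1]])
factors = ["squeeze pressure", "melt temperature", "die preheating temperature"]

# Hypothetical measured ductility (% elongation) for each run, two repetitions
y = np.array([[6.1, 6.4], [7.0, 6.8], [7.6, 7.9],
              [6.9, 7.1], [8.2, 8.0], [7.4, 7.2],
              [7.8, 8.1], [8.5, 8.8], [8.0, 7.7]])

# Larger-the-better signal-to-noise ratio for each run
sn = -10.0 * np.log10(np.mean(1.0 / y**2, axis=1))

# Main effect: mean S/N at each level of each factor; the widest range marks the
# most influential parameter, and the best level is the one with the highest mean
for j, name in enumerate(factors):
    level_means = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    print(f"{name}: level means = {np.round(level_means, 2)}, "
          f"range = {max(level_means) - min(level_means):.2f}")
```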
Procedia PDF Downloads 398
2751 Incidence and Risk Factors of Traumatic Lumbar Puncture in Newborns in a Tertiary Care Hospital
Authors: Heena Dabas, Anju Paul, Suman Chaurasia, Ramesh Agarwal, M. Jeeva Sankar, Anurag Bajpai, Manju Saksena
Abstract:
Background: Traumatic lumbar puncture (LP) is a common occurrence and causes substantial diagnostic ambiguity. There is a paucity of data regarding its epidemiology. Objective: To assess the incidence and risk factors of traumatic LP in newborns. Design/Methods: In a prospective cohort study, all inborn neonates admitted to the NICU and planned to undergo LP for a clinical indication of sepsis were included. Neonates with diagnosed intraventricular hemorrhage (IVH) of grade III or IV were excluded. The LP was done by an operator, often a fellow or resident, assisted by a bedside nurse. The unit has a policy of not routinely using any sedation/analgesia during the procedure. LP is done with a 26 G, 0.5-inch-long hypodermic needle inserted into the third or fourth lumbar space while the infant is in the lateral position. The infants were monitored clinically and by continuous measurement of vital parameters using a multipara monitor during the procedure. The occurrence of a traumatic tap, along with CSF parameters and other operator and assistant characteristics, was recorded at the time of the procedure. A traumatic tap was defined as the presence of visible blood or more than 500 red blood cells on microscopic examination. Microscopic trauma was defined as CSF without visible blood but with numerous RBCs. The institutional ethics committee approved the study protocol. Written informed consent was obtained from the parents and the health care providers involved. Neonates were followed up till discharge/death, and the final diagnosis was assigned together with the treating team. Results: A total of 362 (21%) neonates out of 1726 born at the hospital were admitted during the study period (July 2016 to January 2017). Among these neonates, 97 (26.7%) were suspected of sepsis. A total of 54 neonates who met the eligibility criteria and whose parents consented to participate were enrolled in the study. The mean (SD) birthweight was 1536 (732) grams and gestational age 32.0 (4.0) weeks. All LPs were indicated for late-onset sepsis at a median (IQR) age of 12 (5-39) days. A traumatic LP occurred in 19 neonates (35.1%; 95% C.I. 22.6% to 49.3%). Frank blood was observed in 7 (36.8%), and in the remaining 12 (63.1%), the CSF was found to have microscopic trauma. The preliminary risk factor analysis, including birth weight, gestational age, operator/assistant and other characteristics, did not demonstrate clinically relevant predictors. Conclusion: A significant proportion of neonates requiring lumbar puncture in our study had a traumatic tap. We were not able to identify modifiable risk factors. There is a need to understand the reasons and further reduce this problem to improve management in NICUs.
Keywords: incidence, newborn, traumatic, lumbar puncture
Procedia PDF Downloads 294
2750 The Impact of the Number of Neurons in the Hidden Layer on the Performance of MLP Neural Network: Application to the Fast Identification of Toxics Gases
Authors: Slimane Ouhmad, Abdellah Halimi
Abstract:
In this work, we have applied an MLP-type neural network method to a database from an array of six sensors for the detection of three toxic gases. As the choice of the number of hidden-layer neurons and of the weight values has a great influence on the convergence of the learning algorithm, we propose, in this article, a mathematical formulation to determine the optimal number of hidden-layer neurons and good weight values based on the method of back-propagation of errors. The results of this modeling have improved the discrimination of these gases on the one hand and optimized the computation time on the other hand, in comparison to other results achieved in this case.
Keywords: MLP neural network, back-propagation, number of neurons in the hidden layer, identification, computing time
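In the same spirit, the sketch below sweeps the number of hidden-layer neurons of an MLP on synthetic six-sensor, three-gas data; the data generator, size range, and model settings are assumptions for illustration, not the study's.

```python
# Sketch: compare MLP accuracy as the hidden-layer size varies
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the sensor database: 6 features (sensors), 3 classes (gases)
X, y = make_classification(n_samples=600, n_features=6, n_informative=5,
                           n_redundant=0, n_classes=3, random_state=0)

for n_hidden in (2, 5, 10, 20, 40):
    mlp = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(n_hidden,), max_iter=2000,
                      random_state=0),   # trained by back-propagation
    )
    acc = cross_val_score(mlp, X, y, cv=3).mean()
    print(f"{n_hidden:>3} hidden neurons -> accuracy {acc:.3f}")
```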
Procedia PDF Downloads 346
2749 Renewable Energy Interfaced Shunt Active Filter Using a Virtual Flux Direct Power Control
Authors: M. R. Bengourina, M. Rahli, L. Hassaine, S. Saadi
Abstract:
In this study, we present a control method, entitled virtual flux direct power control, for a grid-connected photovoltaic system associated with an active power filter. The virtual flux direct power control (VF-DPC) is employed for the generation of the reference currents. In this technique, the inverter switch states are selected from a switching table based on the instantaneous errors between the active and reactive powers and their reference values. The objectives of this paper are the reduction of the total harmonic distortion (THD) of the source current, the compensation of reactive power, and the injection of the maximum active power available from the PV array into the load and/or grid. MATLAB/SIMULINK simulations are provided to demonstrate the performance of the proposed approach.
Keywords: shunt active power filter, VF-DPC, photovoltaic, MPPT
Procedia PDF Downloads 321
2748 Evaluation of Tensile Strength of Natural Fibres Reinforced Epoxy Composites Using Fly Ash as Filler Material
Authors: Balwinder Singh, Veerpaul Kaur Mann
Abstract:
A composite material is formed by the combination of two or more phases or materials. Basalt fiber, derived from natural minerals, is a kind of fiber being introduced into the polymer composite industry due to its good mechanical properties, similar to those of synthetic fibers, its low cost, and its environmental friendliness. There is also a rising trend towards the use of industrial wastes as fillers in polymer composites with the aim of improving the properties of the composites. The mechanical properties of fiber-reinforced polymer composites are influenced by various factors like fiber length, fiber weight %, filler weight %, filler size, etc. Thus, a detailed study has been done on the characterization of short-chopped basalt fiber-reinforced polymer matrix composites using fly ash as filler. Taguchi's L9 orthogonal array has been used to develop the composites by considering fiber length (6, 9 and 12 mm), fiber weight % (25, 30 and 35%) and filler weight % (0, 5 and 10%) as input parameters with their respective levels, and a thorough analysis of the mechanical characteristics (tensile strength and impact strength) has been done using ANOVA with the help of MINITAB14 software. The investigation revealed that fiber weight % is the most significant parameter affecting tensile strength, followed by fiber length and filler weight %, respectively, while the impact characterization showed that fiber length is the most significant factor, followed by fly ash weight. The introduction of fly ash proved to be beneficial in both characterizations, with enhanced values up to 5% fly ash weight. The present study also evaluates natural fibre reinforced epoxy composites using fly ash as filler material, studying the effect of the input parameters on tensile strength in order to maximize the tensile strength of the composites. Composites were fabricated based on a Taguchi L9 orthogonal array design of experiments using three factors, fibre type, fibre weight % and fly ash %, with three levels of each factor. The optimization of the composition of the natural fibre reinforced composites using ANOVA for obtaining maximum tensile strength revealed that natural fibres along with fly ash can be successfully used with epoxy resin to prepare polymer matrix composites with good mechanical properties. Paddy: paddy fibre gives high elasticity to the fibre composite due to the approximately hexagonal structure of the cellulose present in paddy fibre. Coir: coir fibre gives less tensile strength than paddy fibre, as coir fibre is brittle in nature; when it is pulled, breakage occurs, showing less tensile strength. Banana: banana fibre has the least tensile strength in comparison to paddy and coir fibre due to lower cellulose content. Higher fibre weight leads to a reduction in tensile strength due to an increased number of air-pocket nuclei. Increasing the fly ash content reduces tensile strength due to non-bonding of the fly ash particles with the natural fibre; fly ash is also not very strong compared to the epoxy resin, leading to a reduction in tensile strength.
Keywords: tensile strength, epoxy resin, basalt fiber, Taguchi, polymer matrix, natural fiber
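The main-effect reading of such an L9 design can be sketched as follows; the tensile values are invented, and only the factors (fibre type, fibre weight %, fly ash %) and their levels come from the text above.

```python
# Sketch: main-effect tables for a three-factor, three-level L9 design
import pandas as pd

runs = pd.DataFrame({
    "fibre_type":   ["paddy", "paddy", "paddy", "coir", "coir", "coir",
                     "banana", "banana", "banana"],
    "fibre_wt_pct": [25, 30, 35, 25, 30, 35, 25, 30, 35],
    "fly_ash_pct":  [0, 5, 10, 5, 10, 0, 10, 0, 5],
    "tensile_MPa":  [34.2, 35.8, 31.9, 30.1, 27.6, 29.4, 26.8, 27.9, 25.3],
})

# Mean tensile strength at each level of each factor; a larger spread between the
# level means indicates a more influential factor in the Taguchi sense
for factor in ("fibre_type", "fibre_wt_pct", "fly_ash_pct"):
    means = runs.groupby(factor)["tensile_MPa"].mean()
    print(f"{factor}:\n{means.round(2)}\n  range = {means.max() - means.min():.2f}\n")
```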
Procedia PDF Downloads 47
2747 The Omani Learner of English Corpus: Source and Tools
Authors: Anood Al-Shibli
Abstract:
Designing a learner corpus is not an easy task to accomplish, because dealing with learners’ language involves many variables which might affect the results of any study based on learners’ language production (spoken and written). Also, it is essential to design a learner corpus systematically, especially when it is intended to be a reference for language research. Therefore, the design of the Omani Learner Corpus (OLEC) has undergone many explicit and systematic considerations. These criteria can be regarded as the foundation for designing any learner corpus to be exploited effectively in language use and language learning studies. In addition, OLEC is a manually error-annotated corpus. Error annotation in learner corpora is essential; however, it is time-consuming and prone to errors. Consequently, a navigating tool was designed to help the annotators insert error codes, in order to make the error-annotation process more efficient and consistent. To assure accuracy, an error-annotation procedure is followed to annotate OLEC, and some preliminary findings are noted. One of the main results of this procedure is the creation of an error-annotation system based on Omani learners' English language production. Because OLEC is still in its first stages, the primary findings relate to only one level of proficiency and one error type, namely verb-related errors. It is found that Omani learners in OLEC tend to make more errors in forming the verb, followed by problems in verb agreement. Comparing the results to other error-based studies indicates that Omani learners tend to make basic verb errors, which can be found at lower levels of proficiency. To this end, it is essential to acknowledge that examining learners' errors can give insights into language acquisition and language learning, and that most errors do not happen randomly but occur systematically among language learners.
Keywords: error-annotation system, error-annotation manual, learner corpora, verbs related errors
Procedia PDF Downloads 139
2746 High Fidelity Interactive Video Segmentation Using Tensor Decomposition, Boundary Loss, Convolutional Tessellations, and Context-Aware Skip Connections
Authors: Anthony D. Rhodes, Manan Goel
Abstract:
We provide a high-fidelity deep learning algorithm (HyperSeg) for interactive video segmentation tasks using a dense convolutional network with context-aware skip connections and compressed, 'hypercolumn' image features combined with a convolutional tessellation procedure. In order to maintain high output fidelity, our model crucially processes and renders all image features in high resolution, without utilizing downsampling or pooling procedures. We maintain this consistent, high-grade fidelity efficiently in our model chiefly through two means: (1) we use a statistically principled tensor decomposition procedure to modulate the number of hypercolumn features, and (2) we render these features in their native resolution using a convolutional tessellation technique. For improved pixel-level segmentation results, we introduce a boundary loss function; for improved temporal coherence in video data, we include temporal image information in our model. Through experiments, we demonstrate the improved accuracy of our model against baseline models for interactive segmentation tasks using high-resolution video data. We also introduce a benchmark video segmentation dataset, the VFX Segmentation Dataset, which contains over 27,046 high-resolution video frames, including green screen and various composited scenes with corresponding, hand-crafted, pixel-level segmentations. Our work improves the state of the art in segmentation fidelity with high-resolution data and can be used across a broad range of application domains, including VFX pipelines and medical imaging disciplines.
Keywords: computer vision, object segmentation, interactive segmentation, model compression
Procedia PDF Downloads 119
2745 Architectural Wind Data Maps Using an Array of Wireless Connected Anemometers
Authors: D. Serero, L. Couton, J. D. Parisse, R. Leroy
Abstract:
In urban planning, an increasing number of cities require wind analyses to verify the comfort of public spaces and the areas around buildings. These studies are made using computational fluid dynamics (CFD) simulation. However, this technique is often based on wind information taken from meteorological stations located several kilometers from the spot of analysis. The approximate input data on the project surroundings produce imprecise results for this type of analysis. They can only be used to get the general behavior of wind in a zone, not to evaluate precise wind speeds. This paper presents another approach to this problem, based on collecting wind data and generating an urban wind cartography using connected ultrasonic anemometers. These are wireless devices that send immediate data on wind to a remote server. Assembled in an array, these devices generate geo-localized data on wind, such as speed, temperature, and pressure, and allow us to compare wind behavior on a specific site or building. These Netatmo-type anemometers communicate by Wi-Fi with central equipment, which shares the data acquired by a wide variety of devices, such as wind speed, indoor and outdoor temperature, rainfall, and sunshine. Besides its precision, this method extracts geo-localized data on any type of site, which can be fed back into the architectural design of a building or a public place. Furthermore, this method allows a precise calibration of a virtual wind tunnel using numerical aeraulic simulations (such as STAR-CCM+ software) and then the development of a complete volumetric model of wind behavior over a roof area or an entire city block. The paper showcases connected ultrasonic anemometers, which were installed for an 18-month survey on four study sites in the Grand Paris region. This case study focuses on Paris as an urban environment with multiple historical layers, whose diversity of typology and buildings allows different ways of capturing wind energy to be considered. The objective of this approach is to categorize the different types of wind in urban areas. This, particularly the identification of the minimum and maximum wind spectrum, helps define the choice and performance of wind energy capturing devices that could be installed there: the location on the roof of a building, the type of wind, the altimetry of the device in relation to the levels of the roofs, and the potential nuisances generated. The method allows the characteristics of wind turbines to be identified in order to maximize their performance on an urban site with turbulent wind.
Keywords: computer fluid dynamic simulation in urban environment, wind energy harvesting devices, net-zero energy building, urban wind behavior simulation, advanced building skin design methodology
Procedia PDF Downloads 100
2744 Acute Cartilage Defects of the Knee Treated With Chondral Restoration Procedures and Patellofemoral Stabilisation
Authors: John Scanlon, Antony Raymond, Randeep Aujla, Peter D’Alessandro, Satyen Gohil
Abstract:
Background: The incidence of significant acute chondral injuries with patella dislocation is around 10-15%. It is accepted that chondral procedures should only be performed in the presence of joint stability. Methods: Patients were identified from surgeon/hospital logs. Patient demographics, lesion size and location, surgical procedure, patient-reported outcome measures, post-operative MR imaging, and complications were recorded. PROMs and patient satisfaction were obtained. Results: 20 knees (18 patients) were included. The mean age was 18.6 years (range; 11-39), and the mean follow-up was 16.6 months (range; 2-70). The defect locations were the lateral femoral condyle (9/20; 45%), patella (9/20; 45%), medial femoral condyle (1/20; 5%), and the trochlea (1/20; 5%). The mean defect size was 2.6 cm2. Twelve knees were treated with cartilage fixation, 5 with microfracture, and 3 with OATS. At follow-up, the overall mean Lysholm score was 77.4 (± 17.1), with no chondral regenerative procedure being statistically superior. There was no difference in Lysholm scores between patients having acute medial patellofemoral ligament reconstruction and those having medial soft tissue plication (p=0.59). Five (25%) knees required re-operation (one arthroscopic arthrolysis; one patella chondroplasty; two removals of loose bodies; one implant adjustment). Overall, 90% responded as being satisfied with the surgery. Conclusion: Our aggressive pathway to identify and treat acute cartilage defects with early operative intervention and patella stabilisation has shown high rates of satisfaction and high Lysholm scores. The full range of chondral restoration options should be considered by surgeons managing these patients.
Keywords: patella dislocation, chondral restoration, knee, patella stabilisation
Procedia PDF Downloads 125
2743 Ultraviolet Lasing from Vertically-Aligned ZnO Nanowall Array
Authors: Masahiro Takahashi, Kosuke Harada, Shihomi Nakao, Mitsuhiro Higashihata, Hiroshi Ikenoue, Daisuke Nakamura, Tatsuo Okada
Abstract:
Zinc oxide (ZnO) is one of the light-emitting materials in the ultraviolet (UV) region. In addition, ZnO nanostructures are attracting increasing research interest as building blocks for UV optoelectronic applications. We have succeeded in synthesizing vertically-aligned ZnO nanostructures by laser interference patterning, which is a catalyst-free and non-contact technique. In this study, vertically-aligned ZnO nanowall arrays were synthesized using two-beam interference. The maximum height and average thickness of the ZnO nanowalls were about 4.5 µm and 200 nm, respectively. UV lasing from a piece of the ZnO nanowall was obtained under excitation by the third harmonic of a Q-switched Nd:YAG laser, and the estimated threshold power density for lasing was about 150 kW/cm2. Furthermore, UV lasing from the vertically-aligned ZnO nanowall array was also achieved. The results indicate that ZnO nanowalls can be applied to random lasers.
Keywords: zinc oxide, nanowall, interference laser, UV lasing
Procedia PDF Downloads 502