Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38

Search results for: Marcus Cheetham

38 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments

Authors: Ana Londral, Burcu Demiray, Marcus Cheetham

Abstract:

Speech recording is a methodology used in many studies in cognitive and behaviour research. Modern advances in digital equipment have made it possible to continuously record hours of speech in naturalistic environments and to build rich sets of sound files. Speech analysis can then extract from these files multiple features for different scopes of research in language and communication. However, tools for analysing a large set of sound files and automatically extracting relevant features are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, at a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e. detecting and labelling the segments that contain speech. We present a comprehensive methodology to support researchers in voice segmentation as the first step in the analysis of a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a folder structure containing a large number of sound files. We validated our methodology on a set of 5,000 sound files collected in the daily life of a group of voluntary participants aged over 65. A smartphone was used to collect sound with the Electronically Activated Recorder (EAR), an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than a manual analysis performed by two independent coders. Furthermore, the methodology allows manual adjustment of voiced segments with visualisation of the sound signal, as well as the automatic extraction of quantitative information on speech.
In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
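The batch pipeline described above runs in Praat; as a rough illustration of the underlying voice-detection step, a minimal energy-threshold segmenter can be sketched in Python. The frame length and threshold are illustrative assumptions, not the paper's parameters, and this simplified rule is not Praat's actual voice detection algorithm.

```python
# Minimal energy-based voice segmentation sketch (illustrative; the paper
# uses Praat's voice detection rather than this simplified threshold rule).

def segment_voiced(samples, rate, frame_ms=30, threshold=0.02):
    """Return (start_s, end_s) intervals whose frame RMS energy exceeds threshold."""
    n = int(rate * frame_ms / 1000)
    segments, start = [], None
    for i in range(0, len(samples) - n + 1, n):
        frame = samples[i:i + n]
        rms = (sum(x * x for x in frame) / n) ** 0.5
        voiced = rms > threshold
        t = i / rate
        if voiced and start is None:
            start = t                       # open a voiced segment
        elif not voiced and start is not None:
            segments.append((start, t))     # close it at the first silent frame
            start = None
    if start is not None:
        segments.append((start, len(samples) / rate))
    return segments
```

In a batch setting, such a function would be applied to every file in the folder tree and its intervals written out as segment labels.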

Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation

Procedia PDF Downloads 213
37 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images

Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire

Abstract:

In this paper, we present a new segmentation approach for focal liver lesions in contrast enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.), and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
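As a sketch of the clustering step, a plain (type-I) two-cluster Fuzzy C-Means threshold on grey levels can be written as follows; the paper's type-II uncertainty handling and recursive boundary merging are not reproduced here, and the fuzzifier and iteration count are illustrative assumptions.

```python
# Two-cluster fuzzy C-means on grey levels, a simplified (type-I) sketch of
# the thresholding step; the type-II extension is not reproduced here.

def fcm_threshold(pixels, m=2.0, iters=50):
    """Return an inter-cluster threshold midway between two FCM centroids."""
    c = [min(pixels), max(pixels)]                      # initial centroids
    for _ in range(iters):
        # fuzzy membership of each pixel in each cluster
        u = []
        for x in pixels:
            d = [abs(x - ck) + 1e-12 for ck in c]       # avoid division by zero
            u.append([1.0 / sum((d[k] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(2))
                      for k in range(2)])
        # centroid update weighted by fuzzified memberships
        for k in range(2):
            num = sum((u[i][k] ** m) * x for i, x in enumerate(pixels))
            den = sum(u[i][k] ** m for i in range(len(pixels)))
            c[k] = num / den
    return (c[0] + c[1]) / 2.0
```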

Keywords: defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets

Procedia PDF Downloads 382
36 Coping with the Stress and Negative Emotions of Care-Giving by Using Techniques from Seneca, Epictetus, and Marcus Aurelius

Authors: Arsalan Memon

Abstract:

There are many challenges that a caregiver faces in everyday life. One such challenge is coping with the stress and negative emotions of caregiving. The Stoics (i.e., Lucius Annaeus Seneca [4 B.C.E.-65 C.E.], Epictetus [50-135 C.E.], and Marcus Aurelius [121-180 C.E.]) provided coping techniques that are useful for dealing with stress and negative emotions. This paper lists and explains some of the fundamental coping techniques provided by the Stoics. For instance, some Stoic coping techniques are as follows (the list is far from exhaustive): a) mindfulness: to the best of your ability, constantly being aware of your thoughts, habits, desires, norms, memories, likes/dislikes, beliefs, values, and of everything outside of you in the world; b) constantly adjusting one's expectations in accordance with reality; c) memento mori: constantly reminding oneself that death is inevitable and is not to be seen as evil; and d) praemeditatio malorum: constantly detaching oneself from everything that is dear to one, so that the least amount of suffering follows from the loss, damage, or ceasing to be of such entities. All coping techniques are extracted from the following original texts by the Stoics: Seneca's Letters to Lucilius, Epictetus' Discourses and the Encheiridion, and Marcus Aurelius' Meditations. One major finding is that the usefulness of each Stoic coping technique can be empirically tested by anyone, in the sense of applying it to one's own life, especially when one is facing real-life challenges. Another major finding is that all of the Stoic coping techniques are predicated upon, and follow from, one fundamental principle: constantly differentiate between what is and what is not in one's control. After differentiating them, one should constantly habituate oneself to not trying to control things that are beyond one's control.
For example, the following things are beyond one's control (all things being equal): death, certain illnesses, being born into a particular socio-economic family, etc. The conclusion is that if one habituates oneself by practising, to the best of one's ability, both the fundamental Stoic principle and the Stoic coping techniques, then such habitual practice can eventually decrease the stress and negative emotions that one experiences as a caregiver.

Keywords: care-giving, coping techniques, negative emotions, stoicism, stress

Procedia PDF Downloads 75
35 Conductive and Stretchable Graphene Nanoribbon Coated Textiles

Authors: Lu Gan, Songmin Shang, Marcus Chun Wah Yuen

Abstract:

A conductive and stretchable cotton fabric was prepared in this study by coating graphene nanoribbons onto the cotton fabric. The mechanical and electrical properties of the prepared fabric were then investigated. As shown in the results, the graphene nanoribbon-coated cotton fabric had improved mechanical strength and electrical conductivity. Moreover, the resistance of the fabric had a linear dependence on the strain applied to it. The prepared graphene nanoribbon-coated cotton fabric has great application potential in the smart textile industry.

Keywords: conductive fabric, graphene nanoribbon, coating, enhanced properties

Procedia PDF Downloads 260
34 Evaluating the Effects of Weather and Climate Change to Risks in Crop Production

Authors: Marcus Bellett-Travers

Abstract:

Different modelling approaches have been used to determine or predict the yield of crops in different geographies. Central to these methodologies is the presumption that it is the absolute yield of the crop in a given location that is of the highest priority to those requiring information on crop productivity. However, most individuals, companies and organisations within the agri-food sector need to be able to balance the supply of crops with the demand for them. The growing need to ensure certainty of supply and stability of prices requires an approach that describes the risk in producing a crop. A review of current methodologies to evaluate the risk to food production from changes in weather and climate is presented.

Keywords: crop production, risk, climate, modelling

Procedia PDF Downloads 304
33 Friction Estimation and Compensation for Steering Angle Control for Highly Automated Driving

Authors: Marcus Walter, Norbert Nitzsche, Dirk Odenthal, Steffen Müller

Abstract:

This contribution presents a friction estimator for industrial purposes which identifies Coulomb friction in a steering system. The estimator needs only a few, usually known, steering system parameters. Friction occurs in almost every mechanical system and has a negative influence on high-precision position control. This is demonstrated on a steering angle controller for highly automated driving. In this steering system, the friction induces limit cycles which cause oscillating vehicle movement when the vehicle follows a given reference trajectory. When the friction is compensated with the introduced estimator, the limit cycles can be suppressed. This is demonstrated by measurements in a series-production vehicle.
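The compensation scheme described above can be sketched as a feed-forward term that cancels the estimated Coulomb friction torque. The function and parameter names below are illustrative, not taken from the paper, and a real controller would handle the zero-velocity region more carefully than this simple deadband.

```python
# Sketch of Coulomb friction compensation in a steering angle controller:
# the estimated friction torque f_coulomb (from an estimator such as the one
# described above) is fed forward with the sign of the steering rate.

def compensated_torque(controller_torque, steering_rate, f_coulomb, deadband=1e-3):
    """Add a feed-forward term cancelling the estimated Coulomb friction."""
    if abs(steering_rate) < deadband:
        return controller_torque            # avoid chattering near zero velocity
    sign = 1.0 if steering_rate > 0 else -1.0
    return controller_torque + sign * f_coulomb
```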

Keywords: friction estimation, friction compensation, steering system, lateral vehicle guidance

Procedia PDF Downloads 420
32 Global Direct Search Optimization of a Tuned Liquid Column Damper Subject to Stochastic Load

Authors: Mansour H. Alkmim, Adriano T. Fabro, Marcus V. G. De Morais

Abstract:

In this paper, a global direct search optimization algorithm to reduce the vibration of a tuned liquid column damper (TLCD), a class of passive structural control device, is presented. The objective is to find optimized parameters for the TLCD under stochastic loads from different wind power spectral densities. A verification is made considering the analytical solution of an undamped primary system under white noise excitation. Finally, a numerical example considering a simplified wind turbine model is given to illustrate the efficacy of the TLCD. Results from the random vibration analysis are shown for four types of random wind excitation models, where the response PSDs obtained showed good vibration attenuation.
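The global direct search named in the keywords (generalized pattern search) can be sketched in a minimal compass-search form. The quadratic objective in the test is an illustrative stand-in for the TLCD response measure; mesh-contraction details vary between implementations.

```python
# Minimal generalized pattern search (compass search) sketch: poll each
# coordinate direction, accept improvements, contract the mesh when stuck.

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Derivative-free minimization of f starting from x0."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:                 # accept any improving poll point
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5                     # contract the mesh
            if step < tol:
                break
    return x, fx
```

In the setting above, `f` would evaluate the response variance of the TLCD-equipped structure for a candidate parameter set.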

Keywords: generalized pattern search, parameter optimization, random vibration analysis, vibration suppression

Procedia PDF Downloads 149
31 National Strategy for Swedish Wildlife Management

Authors: Maria Hornell, Marcus Ohman

Abstract:

Nature, and the society it is a part of, is under constant change. The landscape, climate and game populations vary over time, as do society's priorities and the way it uses the land where wildlife may proliferate. Sweden currently has historically large wildlife populations, which are a resource for the benefit and joy of many people. Wildlife may also be seen as a problem, as it may cause damage in conflict with other human interests. The Swedish Environmental Protection Agency is introducing a new long-term strategy for national wildlife management. The strategy envisions a wildlife management in balance. It focuses on wildlife values in a broad sense, including outdoor recreation and tourism as well as the conservation of biodiversity. It is fundamental that these values should be open and accessible to the major part of the population. For that to be possible, new ways to manage, mitigate and prevent damage and other problems that wildlife causes need to be developed. The strategy describes a roadmap for the development and strengthening of Sweden's wildlife management until 2020. It aims to be applicable for authorities and stakeholders with an interest in wildlife management, serving as a guide for their own strategies, goals, and activities.

Keywords: wildlife management, strategy, Sweden, SEPA

Procedia PDF Downloads 145
30 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs

Authors: Josef Slapal

Abstract:

Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, it is required that such a structure allows for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove digital Jordan curve theorems for them, thus showing that these graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.
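For orientation, the classical 4- and 8-adjacency graphs on Z2, simpler relatives of the graphs introduced in the paper, can be sketched as neighbourhood functions; the paper's own adjacency graphs are different, richer structures.

```python
# Classical adjacencies on the digital plane Z^2: a point's 4-neighbours share
# an edge with it, its 8-neighbours additionally share a corner.

def neighbours(p, adjacency=4):
    """Return the 4- or 8-adjacent points of p = (x, y) in Z^2."""
    x, y = p
    n4 = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    if adjacency == 4:
        return n4
    diag = [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]
    return n4 + diag
```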

Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency

Procedia PDF Downloads 281
29 Laser Micro-Welding of an Isomorphous System with Different Geometries: An Investigation on the Mechanical Properties and Microstructure of the Joint

Authors: Mahdi Amne Elahi, Marcus Koch, Peter Plapper

Abstract:

Due to the demand for miniaturization in the automotive industry, the application of laser welding is quite promising. The current study focused on laser micro-welding of CuSn6 bronze and nickel wire for a miniature electromechanical hybrid component. Due to the advantages of laser welding, the weld can be tailored specifically to the requirements of the part. Scanning electron and optical microscopy were used to study the microstructure, and the tensile-shear test was selected to represent the mechanical properties. Different welding sides, beam oscillations, and speeds have been investigated to optimize the tensile-shear load and microstructure. The results show that the mechanical properties and microstructure of the joint are strongly influenced by the mentioned parameters. In the absence of intermetallic compounds, the soundness of the joint is achievable by manipulating the geometry of the weld seam and minimizing weld defects.

Keywords: bronze, laser micro-welding, microstructure, nickel, tensile shear test

Procedia PDF Downloads 63
28 Use of Natural Fibers in Landfill Leachate Treatment

Authors: Araujo J. F. Marina, Araujo F. Marcus Vinicius, Mulinari R. Daniella

Abstract:

Because the leachate resulting from waste decomposition in landfills has a pollution potential a hundred times greater than that of domestic sewage, it is considered a problem related to the degradation of the environment, requiring treatment before disposal. Seeking to improve this situation, this project proposes the treatment of landfill leachate using natural fibers combined with advanced oxidation processes. The selected natural fibers were palm, coconut and banana fibers. These materials give sustainability to the project because, besides having adsorbent capacity, they are often part of discarded waste. The study was conducted at laboratory scale. In the trials, the effluents were characterized by Chemical Oxygen Demand (COD), turbidity and color. The results indicate that the approach is technically promising: under strongly oxidative conditions, the use of certain natural fibers for the reduction of pollutants in leachate yielded COD removals between 67.9% and 90.9%, turbidity removals between 88.0% and 99.7%, and color removals between 67.4% and 90.4%. The expectation is to continue evaluating the efficiency of combining other natural fibers with other landfill leachate treatment processes.

Keywords: landfill leachate, chemical treatment, natural fibers, advanced oxidation processes

Procedia PDF Downloads 276
27 Shotcrete Performance Optimisation and Audit Using 3D Laser Scanning

Authors: Carlos Gonzalez, Neil Slatcher, Marcus Properzi, Kan Seah

Abstract:

In many underground mining operations, shotcrete is used for permanent rock support. Shotcrete thickness is a critical measure of the success of this process. 3D Laser Mapping, in conjunction with Jetcrete, has developed a 3D laser scanning system specifically for measuring the thickness of shotcrete. The system is mounted on the shotcrete spraying machine and measures the rock faces before and after spraying. The calculated difference between the two 3D surface models is measured as the thickness of the sprayed concrete. Typical work patterns for the shotcrete process required a rapid and automatic system. The scanning takes place immediately before and after the application of the shotcrete so no convergence takes place in the interval between scans. Automatic alignment of scans without targets was implemented which allows for the possibility of movement of the spraying machine between scans. Case studies are presented where accuracy tests are undertaken and automatic audit reports are calculated. The use of 3D imaging data for the calculation of shotcrete thickness is an important tool for geotechnical engineers and contract managers, and this could become the new state-of-the-art methodology for the mining industry.
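The thickness computation described above reduces, per surface point, to the difference between the aligned pre- and post-spray scans. A grid-based sketch follows; real systems operate on aligned 3D point clouds, so the depth-grid representation here is an illustrative simplification.

```python
# Shotcrete thickness as the cell-wise difference of two aligned depth grids:
# 'before' is the scanned rock face, 'after' the face once sprayed.

def thickness_map(before, after):
    """Per-cell sprayed-concrete thickness (after minus before, in metres)."""
    return [[b - a for a, b in zip(row_before, row_after)]
            for row_before, row_after in zip(before, after)]
```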

Keywords: 3D imaging, shotcrete, surface model, tunnel stability

Procedia PDF Downloads 220
26 An International Comparison of Forensic Identification Evidence Legislation: Balancing Community Interests and Individual Rights

Authors: Marcus Smith

Abstract:

DNA profiling has made a valuable contribution to criminal investigations over the past thirty years. Direct matching of DNA profiles from a crime scene and a suspect, or between a suspect and a database, remains of great importance to crimes such as murder, assault, and property theft. As scientific and technological advancement continues, a wide range of new DNA profiling applications has been developed. The application of new techniques involves an interesting balancing act between admitting probative evidence in a criminal trial, evaluating its degree of relevance and validity, and limiting its prejudicial impact. The impact of new DNA profiling applications that have significant implications for law enforcement and the legal system can be evaluated through a review of relevant case law, legislation and the latest empirical evidence from jurisdictions around the world, including the United States, the United Kingdom, and Australia. There are benefits in further examining the implications of these new developments, including how the criminal law can best be adapted to ensure that new technology is used to enhance criminal investigation and prosecution, while ensuring it is applied in a measured way that respects individual rights and maintains the principles of fairness enshrined in the legal system.

Keywords: criminal procedure, forensic evidence, DNA profiling, familial searching, phenotyping

Procedia PDF Downloads 65
25 Shoring System Selection for Deep Excavation

Authors: Faouzi Ahtchi-Ali, Marcus Vitiello

Abstract:

A study was conducted in the eastern region of the Middle East to assess the constructability of a shoring system for a 12-meter-deep excavation. Several shoring systems were considered in this study, including secant concrete piling, contiguous concrete piling, and sheet-piling. The excavation was carried out in very dense sand with the groundwater level located 3 meters below the ground surface. The study included conducting a pilot test for each shoring system listed above. The secant concrete piling consisted of overlapping concrete piles installed to a depth of 16 meters. A drilling method with full steel casing was utilized to install the concrete piles. The verticality of the piles was a concern for the overlap. The contiguous concrete piling required the installation of micro-piles to seal the gaps between the concrete piles. This method revealed that the gaps between the piles were not fully sealed, as shown by groundwater penetration into the excavation. The sheet-piling method required pre-drilling due to the high blow count of the penetrated layer of saturated sand. The study concluded that the sheet-piling method with pre-drilling was the most cost-effective and was the recommended shoring system.

Keywords: excavation, shoring system, Middle East, drilling method

Procedia PDF Downloads 371
24 Seasonal Influence on Environmental Indicators of Beach Waste

Authors: Marcus C. Garcia, Giselle C. Guimarães, Luciana H. Yamane, Renato R. Siman

Abstract:

The environmental indicators and the classification of beach waste are essential tools to diagnose the current situation and to indicate ways to improve the quality of this environment. The purpose of this paper was to perform a quali-quantitative analysis of the beach waste on Curva da Jurema Beach (Espírito Santo, Brazil). Three transects with equidistant positioning over the total length of the beach were used for solid waste collection. The solid wastes were later classified according to their use and primary raw material, for both the low and high summer seasons. During the low season, average values of 7.10 items.m-1, 18.22 g.m-1 and 0.91 g.m-2 were found for the whole beach, and transect 3 contributed the most waste, with a total of 999 items (49%), a total mass of 5.62 kg and a total volume of 21.31 L. During the high summer season, average values of 8.22 items.m-1, 54.40 g.m-1 and 2.72 g.m-2 were found, with transect 2 contributing the most, with 1,212 items (53%), a total mass of 10.76 kg and a total volume of 51.99 L. Of the total collected, plastic materials represented 51.4% of the total number of items, 35.9% of the total mass and 68% of the total volume. The implementation of reactive and proactive measures is necessary so that the management of solid waste on Curva da Jurema Beach complies with principles of sustainability.
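The per-metre and per-area indicators reported above are simple ratios of item counts and mass to the sampled beach length and area; the inputs in the test below are illustrative, not the paper's survey data.

```python
# Environmental indicators for beach waste: items per metre of beach,
# grams per metre, and grams per square metre of sampled area.

def beach_indicators(items, mass_g, length_m, area_m2):
    """Return the three density indicators used for beach waste surveys."""
    return {
        "items_per_m": items / length_m,
        "g_per_m": mass_g / length_m,
        "g_per_m2": mass_g / area_m2,
    }
```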

Keywords: beach solid waste, environmental indicators, quali-quantitative analysis, waste management

Procedia PDF Downloads 229
23 Child Rights in the Context of Psychiatric Power

Authors: Dmytro D. Buiadzhy

Abstract:

The modern psychiatric discourse asserts direct ties between children's mental health and their success in life as adults. Unresolved mental health problems in childhood are likely to lead individuals to poverty, isolation, and social exclusion, as stated by Marcus Richards. Such an approach justifies placing children under the supervision and control of power. The discourse related to the mental health of children enables a tight impact of the family, educational institutions and medical authorities on the child through any manifestation of his or her psyche that shows signs of 'abnormality'. Throughout adult life, the individual continues to feel the pressure of power through legal, political, and economic institutions that also appeal to the regulation of mental health. Juvenile law declares the equality of a child and an adult, but in fact simply delegates the powers of parents to the impersonal social institutions of guardianship, education, and social protection. Psychiatric power in this study is considered in accordance with Michel Foucault's concept of power, as a manifestation of the 'positive' technologies of power, which place various manifestations of subjectivity, in particular children's subjectivity, under the supervision and control of state power. The main issue disclosed in this paper is how the weakening of parental authority, in the context of the legislative ratification of children's rights, strengthens other forms of power over children, especially psychiatric power, which justifies and affects children's emancipation.

Keywords: child rights, psychiatric power, discourse, parental authority

Procedia PDF Downloads 273
22 The Challenge of Characterising Drought Risk in Data Scarce Regions: The Case of the South of Angola

Authors: Natalia Limones, Javier Marzo, Marcus Wijnen, Aleix Serrat-Capdevila

Abstract:

In this research, we developed a structured approach for the detection of the areas under the highest levels of drought risk that is suitable for data-scarce environments. The methodology is based on recent scientific outcomes and methods and can be easily adapted to different contexts in successive exercises. The research reviews the history of drought in the south of Angola and characterizes the hazard experienced in the episode beginning in 2012, focusing on the meteorological and hydrological drought types. Only global open data from modeling or remote sensing were used for the description of the hydroclimatological variables, since there is almost no ground data in this part of the country. The study also portrays the socioeconomic vulnerabilities and the exposure to the phenomenon in the region in order to fully understand the risk. As a result, a map of the areas under the highest risk in the south of the country is produced, which is one of the main outputs of this work. It was also possible to confirm that the set of indicators used revealed different drought vulnerability profiles in the south of Angola and, as a result, several priority areas prone to distinct impacts were recognized. The results demonstrated that most of the region experienced a severe multi-year meteorological drought that triggered an unprecedented exhaustion of surface water resources, and that the majority of the socioeconomic impacts started soon after the identified onset of these processes.

Keywords: drought risk, exposure, hazard, vulnerability

Procedia PDF Downloads 114
21 Accelerated Molecular Simulation: A Convolution Approach

Authors: Jannes Quer, Amir Niknejad, Marcus Weber

Abstract:

Computational drug design is often based on molecular dynamics simulations of molecular systems. Molecular dynamics can be used to simulate, e.g., the binding and unbinding of a small drug-like molecule with regard to the active site of an enzyme or a receptor. However, the time-scale of the overall binding event is many orders of magnitude longer than the time-scale of the simulation. Thus, there is a need to speed up molecular simulations. In order to speed up simulations, the molecular dynamics trajectories have to be 'steered' out of the local minimizers of the potential energy surface, the so-called metastabilities, of the molecular system. Increasing the kinetic energy (temperature) is one possibility to accelerate the simulated processes. However, with temperature the entropy of the molecular system increases, too, and this kind of 'steering' is not directed enough to steer the molecule out of the minimum toward the saddle point. In this article, we present a new mathematical idea of how a potential energy surface can be changed in such a way that entropy is kept under control while the trajectories are still steered out of the metastabilities. In order to compute the unsteered transition behaviour on the basis of a steered simulation, we propose the use of extrapolation methods. In the end, we show mathematically that our method accelerates the simulations along the direction in which the curvature of the potential energy surface changes the most, i.e., from local minimizers towards saddle points.
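The effect of convolving a potential with a Gaussian, the smoothing operation suggested by the paper's title, can be illustrated on a one-dimensional double well. The potential, smoothing width and Monte Carlo estimator below are illustrative assumptions, not the paper's molecular system or method: the point is only that the smoothed energy difference between a well and the barrier top shrinks, so trajectories escape the metastability more easily.

```python
import random

def smoothed(V, x, sigma=0.5, samples=2000, seed=0):
    """Monte Carlo estimate of the Gaussian convolution (V * g_sigma)(x)."""
    rng = random.Random(seed)
    return sum(V(x + rng.gauss(0.0, sigma)) for _ in range(samples)) / samples

# Illustrative double-well potential with metastable minima at x = +/-1
# and a barrier of height 1 at x = 0 (not the paper's molecular potential).
double_well = lambda x: (x * x - 1.0) ** 2
```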

Keywords: extrapolation, Eyring-Kramers, metastability, multilevel sampling

Procedia PDF Downloads 249
20 Challenges and Insights by Electrical Characterization of Large Area Graphene Layers

Authors: Marcus Klein, Martina GrießBach, Richard Kupke

Abstract:

The current advances in the research and manufacturing of large-area graphene layers are promising for the introduction of this exciting material into the display industry and other applications that benefit from its excellent electrical and optical characteristics. New production technologies in the fabrication of flexible displays, touch screens or printed electronics apply graphene layers on non-metal substrates and bring new challenges to the required metrology. Traditional measurement concepts for layer thickness, sheet resistance and layer uniformity are difficult to apply to graphene production processes and are often harmful to the product layer. New non-contact sensor concepts are required to meet these challenges and the foreseeable inline production of large-area graphene. Dedicated non-contact measurement sensors are a pioneering method for addressing these issues in a large variety of applications, while significantly lowering the costs of development and process setup. Transferred and printed graphene layers can be characterized with high accuracy over a huge measurement range at very high resolution. Large-area graphene mappings are applied for process optimization and for efficient quality control of transfer, doping, annealing and stacking processes. Examples of doped, defective and excellent graphene are presented as quality images, and the implications for manufacturers are explained.

Keywords: graphene, doping and defect testing, non-contact sheet resistance measurement, inline metrology

Procedia PDF Downloads 225
19 Engineering Ligand-Free Biodegradable-Based Nanoparticles for Cell Attachment and Growth

Authors: Simone F. Medeiros, Isabela F. Santos, Rodolfo M. Moraes, Jaspreet K. Kular, Marcus A. Johns, Ram Sharma, Amilton M. Santos

Abstract:

Tissue engineering aims to develop alternatives to treat damaged tissues by promoting their regeneration. Its basic principle is to place cells on a scaffold capable of promoting cell functions, and for this purpose, polymeric nanoparticles have been successfully used due to the ability of some macromolecular chains to mimic the extracellular matrix and influence cell functions. In general, nanoparticles require surface chemical modification to achieve cell adhesion, and recent advances in their synthesis include methods for modifying the ligand density and distribution on the nanoparticle surface. However, this work reports the development of biodegradable polymeric nanoparticles capable of promoting cellular adhesion without any surface chemical modification by ligands. Biocompatible and biodegradable nanoparticles based on poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBHV) were synthesized by the solvent evaporation method. The produced nanoparticles were small in size (85 and 125 nm) and colloidally stable over time in aqueous solution. Morphological evaluation showed their spherical shape and small polydispersity. Human osteoblast-like cells (MG63) were cultured in the presence of PHBHV nanoparticles, and growth kinetics were compared to those of cells grown on tissue culture polystyrene (TCPS). Cell attachment on non-tissue-culture polystyrene (non-TCPS) pre-coated with nanoparticles was assessed and compared to attachment on TCPS. These findings reveal the potential of PHBHV nanoparticles, which support cell adhesion and growth without requiring a matrix ligand, to be used as scaffolds in tissue engineering applications.

Keywords: tissue engineering, PHBHV, stem cells, cellular attachment

Procedia PDF Downloads 139
18 The Effectiveness of Cash Flow Management by SMEs in the Mafikeng Local Municipality of South Africa

Authors: Ateba Benedict Belobo, Faan Pelser, Ambe Marcus

Abstract:

Aims: This study arises from repeated complaints, received by electronic mail, about the underperformance of Mafikeng small and medium-sized enterprises after the global financial crisis. The authors were of the view that this poor performance could be a result of negative effects on the cash flow of these businesses due to volatilities in the business environment in general prior to the global crisis. Thus, the paper was mainly aimed at determining the shortcomings experienced by these SMEs with regard to cash flow management. It was also aimed at suggesting possible measures to improve the cash flow management of these SMEs in this difficult time. Methods: A case study was conducted on 3 beverage suppliers, 27 bottle stores, the 3 largest fast-moving consumer goods supermarkets and 7 automobile enterprises in the Mafikeng local municipality. A mixed-method research design was employed, and purposive sampling was used in selecting the SMEs that participated. The views and experiences of the participants were captured through in-depth interviews. Data from the empirical investigation were interpreted using open coding and a simple percentage formula. Results: Findings from the empirical research reflected that the majority of Mafikeng SMEs suffered poor operational performance prior to the global financial crisis, primarily as a result of poor cash flow management. However, the empirical outcome also indicated other secondary factors contributing to this poor operational performance. Conclusion: Finally, the authors proposed possible measures that could be used to improve cash flow management and to address the other factors affecting the operational performance of SMEs in the Mafikeng local municipality, in order to achieve better business performance.

Keywords: cash flow, business performance, global financial crisis, SMEs

Procedia PDF Downloads 364
17 The Media and Reportage of Boko Haram Insurgency in Nigeria

Authors: Priscilla Marcus

Abstract:

The mass media were a force to reckon with in the struggle for and attainment of Nigeria’s independence in 1960, and since then the Nigerian media has carved a niche for itself in performing its traditional roles of educating, informing and entertaining, and of shaping opinions and swaying the views of society on knotty national issues. The Boko Haram insurgency in Nigeria, which emerged from an unnoticed, negligible and quiet beginning, has turned out daring, monstrous and seemingly unstoppable. This paper examines the media's reportage of the Boko Haram insurgency in Nigeria and suggests strategies the mass media could adopt in combating this form of terrorism. Data for the study were collected from a variety of sources, including the print and electronic media. The major observation of this study is that the mass media have an enormous role to play if Boko Haram’s activities are to be combated. It argued that even though the media houses are just doing their job, reporting incidents as they occur and thus keeping citizens abreast of the facts, the rate at which news keeps coming regarding the activities of the sect has cast the media as both a channel of information dissemination and a vehicle for spreading the terror campaign. It also argued that the ceaseless reporting has not translated into a decrease in the activities of the sect or an increase in the level of government action to check the insurgency. Moreover, while the information being disseminated is enlightening the populace, it is also creating an atmosphere of panic and insecurity. It further argued that the media should move beyond mere recitation of events to providing the public with the knowledge needed to make things better. This is because the sect has been accorded too much undeserved and unnecessary publicity, while the government, on the other hand, has been portrayed, albeit indirectly, as a weak organization incapable of handling the 'more organized' Boko Haram.
The study concluded that, to effectively address this form of terrorism in Nigeria, the media have to brace up to the task of uncovering the activities of the sect in fulfilment of their watchdog role.

Keywords: Boko Haram, insurgency, mass media, Nigeria

Procedia PDF Downloads 204
16 Linearly Polarized Single Photon Emission from Nonpolar, Semipolar and Polar Quantum Dots in GaN/InGaN Nanowires

Authors: Snezana Lazic, Zarko Gacevic, Mark Holmes, Ekaterina Chernysheva, Marcus Müller, Peter Veit, Frank Bertram, Juergen Christen, Yasuhiko Arakawa, Enrique Calleja

Abstract:

The study reports how the pencil-like morphology of a homoepitaxially grown GaN nanowire can be exploited for the fabrication of a thin conformal InGaN nanoshell, hosting nonpolar, semipolar and polar single photon sources (SPSs). All three SPS types exhibit narrow emission lines (FWHM ~0.35 - 2 meV) and high degrees of linear optical polarization (P > 70%) in the low-temperature micro-photoluminescence (µ-PL) experiments, and are characterized by a pronounced antibunching in the photon correlation measurements (corrected g(2)(0) < 0.3). The quantum-dot-like exciton localization centers induced by compositional fluctuations within the InGaN nanoshell are identified as the driving mechanism for the single photon emission. As confirmed by the low-temperature transmission electron microscopy combined with cathodoluminescence (TEM-CL) study, the crystal region (i.e. non-polar m-, semi-polar r- and polar c-facets) hosting the single photon emitters strongly affects their emission wavelength, which ranges from ultra-violet for the non-polar to visible for the polar SPSs. The photon emission lifetime is also found to be facet-dependent and varies from sub-nanosecond time scales for the non- and semi-polar SPSs to a few nanoseconds for the polar ones. These differences are mainly attributed to facet-dependent indium content and electric field distribution across the hosting InGaN nanoshell. The hereby reported pencil-like InGaN nanoshell is the first single nanostructure able to host all three types of single photon emitters and is thus a promising building block for tunable quantum light devices integrated into future photonic and optoelectronic circuits.
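The antibunching criterion g(2)(0) < 0.3 quoted above can be illustrated with a toy estimate of the second-order correlation from a coincidence histogram; the counts below are synthetic, not measurement data.

```python
# Toy estimate of g2(0): coincidences at zero delay divided by the mean
# of the side peaks. Counts are synthetic, for illustration only.
def g2_zero(histogram, zero_bin):
    """Estimate g2(0) from a list of coincidence counts per delay bin."""
    side_peaks = [n for i, n in enumerate(histogram) if i != zero_bin]
    return histogram[zero_bin] / (sum(side_peaks) / len(side_peaks))

counts = [100, 100, 100, 25, 100, 100, 100]  # pronounced dip at zero delay
print(g2_zero(counts, zero_bin=3))  # a value well below 0.5 indicates single-photon emission
```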

Keywords: GaN nanowire, InGaN nanoshell, linear polarization, nonpolar, semipolar, polar quantum dots, single-photon sources

Procedia PDF Downloads 301
15 Profile of Serological Response of Equids Naturally Infected with Burkholderia mallei

Authors: Iahtasham Khan, Vania Lucia De Assis Santana, Marcilia Maria Alves De Souza, Mabel Hanna Vance Harrop, Fernando Leandro Dos Santos, Cecília Maria Souza Leão E. Silva, Pedro Paulo Silveira, Marcelo Brasil, Marcus Vinícius, Hélio Cordeiro Manso Filho, Muhammad Younus, Aman Ullah Khan

Abstract:

Glanders ranks high on clinical lists in some regions of Brazil as a cause of respiratory and lymphatic disease in equids. Glanders is caused by Burkholderia mallei (B. mallei), a Gram-negative bacterium. B. mallei was the first biological agent used in World War I, in the 20th century. The complement fixation test (CFT) is a serodiagnostic tool prescribed by the World Organisation for Animal Health (OIE) for the diagnosis of glanders in the international trade of equids. The aim of the present study was to monitor the serological responses of equids naturally infected with B. mallei using the CFT. A total of 574 equids were tested with the CFT, 30 days apart, in a total of 12 samplings. One hundred and thirty-four sera tested negative in all samplings; 192 sera tested positive in one sampling, and 125 sera tested positive in two or more samplings. The remaining 123 samples showed uncertain results. Thus, CFT results can vary over a period of time. These variations could be a consequence of the natural immune response in each animal. The findings of the present study demonstrate the difficulties of simultaneously implementing the CFT and test-and-slaughter policies to eradicate glanders. Another constraint to controlling this disease is the presence of carrier or transitory CFT-negative animals, which are a potential source of disease in glanders-free areas. Serodiagnostic tests of higher sensitivity and specificity, such as immunoblot, should be implemented to achieve success in the eradication of glanders.
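The grouping of animals by their CFT histories described above can be sketched as follows; the precedence given to uncertain results is our assumption, since the abstract does not state how mixed outcomes were resolved.

```python
# Sketch of grouping equids by CFT outcomes across repeated samplings.
# Assumption: any uncertain result places the animal in the "uncertain" group.
def classify(results_per_animal):
    """results_per_animal: list of per-animal result lists ('pos'/'neg'/'unc')."""
    groups = {"all negative": 0, "positive once": 0, "positive 2+": 0, "uncertain": 0}
    for results in results_per_animal:
        if "unc" in results:
            groups["uncertain"] += 1
        elif results.count("pos") == 0:
            groups["all negative"] += 1
        elif results.count("pos") == 1:
            groups["positive once"] += 1
        else:
            groups["positive 2+"] += 1
    return groups

herd = [["neg"] * 12, ["pos"] + ["neg"] * 11, ["pos", "pos"] + ["neg"] * 10]
print(classify(herd))
```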

Keywords: glanders, equids, horses, immunological, mules

Procedia PDF Downloads 313
14 Constraint-Based Computational Modelling of Bioenergetic Pathway Switching in Synaptic Mitochondria from Parkinson's Disease Patients

Authors: Diana C. El Assal, Fatima Monteiro, Caroline May, Peter Barbuti, Silvia Bolognin, Averina Nicolae, Hulda Haraldsdottir, Lemmer R. P. El Assal, Swagatika Sahoo, Longfei Mao, Jens Schwamborn, Rejko Kruger, Ines Thiele, Kathrin Marcus, Ronan M. T. Fleming

Abstract:

Degeneration of substantia nigra pars compacta dopaminergic neurons is one of the hallmarks of Parkinson's disease. These neurons have a highly complex axonal arborisation and a high energy demand, so any reduction in ATP synthesis could lead to an imbalance between supply and demand, thereby impeding normal neuronal bioenergetic requirements. Synaptic mitochondria exhibit increased vulnerability to dysfunction in Parkinson's disease. After biogenesis in and transport from the cell body, synaptic mitochondria become highly dependent upon oxidative phosphorylation. We applied a systems biochemistry approach to identify the metabolic pathways used by neuronal mitochondria for energy generation. The mitochondrial component of an existing manual reconstruction of human metabolism was extended with manual curation of the biochemical literature and specialised using omics data from Parkinson's disease patients and controls, to generate reconstructions of synaptic and somal mitochondrial metabolism. These reconstructions were converted into stoichiometrically- and flux-consistent constraint-based computational models. These models predict that Parkinson's disease is accompanied by an increase in the rate of glycolysis and a decrease in the rate of oxidative phosphorylation within synaptic mitochondria. This is consistent with independent experimental reports of a compensatory switching of bioenergetic pathways in the putamen of post-mortem Parkinson's disease patients. Ongoing work, in the context of the SysMedPD project, is aimed at the computational prediction of mitochondrial drug targets to slow the progression of neurodegeneration in the subset of Parkinson's disease patients with overt mitochondrial dysfunction.
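The predicted switch between oxidative phosphorylation and glycolysis can be illustrated with a toy flux allocation; the ATP yields (roughly 30 vs. 2 per glucose) are textbook round numbers, and the optimisation is reduced to a greedy allocation rather than a full constraint-based model of the kind described above.

```python
# Toy illustration of bioenergetic pathway switching under an oxygen
# constraint. Yields are textbook approximations, not model outputs.
def max_atp(glucose_max, o2_max):
    """Maximise ATP: oxidative phosphorylation yields ~30 ATP per glucose
    but consumes ~6 O2; glycolysis alone yields 2 ATP and needs no O2."""
    via_oxphos = min(glucose_max, o2_max / 6)   # glucose fully oxidised
    via_glycolysis = glucose_max - via_oxphos   # remainder fermented
    return 30 * via_oxphos + 2 * via_glycolysis

print(max_atp(10, 60))  # ample O2: all glucose oxidised -> 300 ATP
print(max_atp(10, 12))  # O2-limited: flux shifts toward glycolysis -> 76 ATP
```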

Keywords: bioenergetics, mitochondria, Parkinson's disease, systems biochemistry

Procedia PDF Downloads 203
13 Application of Human Biomonitoring and Physiologically-Based Pharmacokinetic Modelling to Quantify Exposure to Selected Toxic Elements in Soil

Authors: Eric Dede, Marcus Tindall, John W. Cherrie, Steve Hankin, Christopher Collins

Abstract:

Current exposure models used in contaminated land risk assessment are highly conservative. Use of these models may lead to over-estimation of actual exposures, possibly resulting in negative financial implications due to unnecessary remediation. Thus, we are carrying out a study seeking to improve our understanding of human exposure to selected toxic elements in soil: arsenic (As), cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb), resulting from allotment land use. The study employs biomonitoring and physiologically-based pharmacokinetic (PBPK) modelling to quantify human exposure to these elements. We recruited 37 allotment users (adults > 18 years old) in Scotland, UK, to participate in the study. Concentrations of the elements (and their bioaccessibility) were measured in allotment samples (soil and allotment produce). Records of the amount of produce consumed by the participants, together with their biological samples (urine and blood), were collected for up to 12 consecutive months. Ethical approval was granted by the University of Reading Research Ethics Committee. PBPK models (coded in MATLAB) were used to estimate the distribution and accumulation of the elements in key body compartments, thus indicating the internal body burden. Simulating low element intake (based on estimated 'doses' from produce consumption records), the predictive models suggested that detection of these elements in urine and blood was possible within a given period of time following exposure. This information was used in planning the biomonitoring, and is currently being used in the interpretation of test results from biological samples. Evaluation of the models is being carried out using biomonitoring data, by comparing model-predicted concentrations and measured biomarker concentrations. The PBPK models will be used to generate bioavailability values, which could be incorporated into contaminated land exposure models.
Thus, the findings from this study will promote a more sustainable approach to contaminated land management.
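A drastically simplified, single-compartment stand-in for the PBPK models described above can be sketched as follows: constant daily intake with first-order elimination, integrated with a simple Euler step. The parameter values are illustrative, not fitted to the study data.

```python
# Minimal one-compartment kinetic sketch (a drastic simplification of a
# PBPK model): constant intake, first-order elimination, Euler integration.
def body_burden(daily_intake_ug, k_elim_per_day, days, dt=0.1):
    """Simulate accumulation of an element in a single body compartment."""
    amount = 0.0
    t = 0.0
    while t < days:
        amount += (daily_intake_ug - k_elim_per_day * amount) * dt
        t += dt
    return amount

# With constant intake, the burden approaches intake / k_elim at steady state.
print(round(body_burden(5.0, 0.1, 365), 1))  # approaches 50 ug
```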

Keywords: biomonitoring, exposure, PBPK modelling, toxic elements

Procedia PDF Downloads 233
12 Benchmarks to Assess the Practicality and Performance of Quantum Processors

Authors: Marcus Doherty, Jennifer Harding, Florian Preis

Abstract:

Many benchmarks have been proposed to quantify quantum computing performance; however, none have focused on the practicality of their use, which includes the ability to solve real-world problems and the physical requirements of the quantum processor. Most of these existing benchmarks focus on randomized benchmarking, which can be useful for characterizing certain types of errors but does not prove the ability to solve real problems and is unable to characterize structured errors, due to the inherent randomness and disorganization of the circuit. Existing application-oriented benchmarks also focus on the performance of the processor rather than the physical or environmental impact of the machine. Some applications of quantum computers may require on-site hosting or low energy consumption, for example, the inclusion of quantum processors in autonomous vehicles, robotic devices, or for high-security purposes; conversely, some use cases require only the most powerful machine, without consideration of the physical or environmental footprint. We propose two benchmarks which aim to quantify the effectiveness and practicality of quantum computers for various applications. The first metric, Quantum Density, is a real-number value which quantifies computational efficiency density by considering the ability of a quantum processor to perform a given algorithm, alongside the size of the machine, the power consumption, and the total time needed to run the algorithm. This implementation allows consideration of not only the quantum processor but also the hybrid algorithm and the time resources needed to cycle between classical and quantum, for example, for logical statements, parameter updates and recompilation, to assess the full performance of the algorithm. The second benchmark, Quantum Utility, creates a set of equivalence classes based on size, weight, and power consumption.
Quantum processors can then be compared against other processors within the same class by demonstrating their relative performance in speed, accuracy, and precision on executing a given task. We introduce the proposed equivalence classes and demonstrate examples by scoring a Variational Quantum Eigensolver algorithm. These benchmarks aim to capture and quantify the properties which are important to the adoption of quantum computing by industry.
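The abstract defines Quantum Density only qualitatively; the formula below is a hypothetical illustration of such a score, dividing task performance by the machine's physical cost. The functional form and all numbers are our assumptions.

```python
# Hypothetical Quantum-Density-style score: task performance divided by a
# physical cost (size x power x runtime). The form is an assumption; the
# abstract does not specify a formula.
def quantum_density(success_prob, volume_m3, power_kw, runtime_s):
    return success_prob / (volume_m3 * power_kw * runtime_s)

# Two hypothetical processors running the same VQE instance:
compact = quantum_density(0.90, volume_m3=1.0, power_kw=5.0, runtime_s=100.0)
large = quantum_density(0.95, volume_m3=20.0, power_kw=25.0, runtime_s=80.0)
print(compact > large)  # the compact machine scores higher despite lower accuracy
```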

Keywords: quantum algorithms, quantum benchmarking, quantum computing, quantum density, quantum utility

Procedia PDF Downloads 41
11 Open Innovation for Crowdsourced Product Development: The Case Study of Quirky.com

Authors: Ana Bilandzic, Marcus Foth, Greg Hearn

Abstract:

In a narrow sense, innovation is the invention and commercialisation of a new product or service in the marketplace. The literature suggests that places supporting knowledge exchange and social interaction, e.g. coffee shops, nurture innovative ideas. With the widespread success of the Internet, interpersonal communication and interaction changed. Online platforms now complement physical places for idea exchange and innovation, giving rise to hybrid 'net localities.' Further, since its introduction in 2003 by Chesbrough, the concept of open innovation has received increased attention, both as a topic in academic research and as an innovation strategy applied by companies. Open innovation allows companies to seek and release intellectual property and new ideas from outside their own organisation. As a consequence, the innovation process is no longer managed only within the company, but is pursued in a co-creation process with customers, suppliers, and other stakeholders. Quirky.com (Quirky), a company founded by Ben Kaufman in 2009, recognised the opportunity the Internet offers for knowledge exchange and open innovation. Quirky developed an online platform that makes innovation available to everyone. This paper reports on a study that analysed Quirky's business process in an extended event-driven process chain (eEPC). The aim was to determine how the platform enabled crowdsourced innovation for physical products on the Internet. The analysis reveals that key elements of the business model are based on open innovation. Quirky is an example of how open innovation can support crowdsourced and crowdfunded product ideation, development and selling. The company opened up various stages in the innovation process to its members to contribute to product development, e.g. product ideation, design, and market research. Throughout the process, members earn influence through participating in product development and, based on the influence they earn, receive shares of the product's turnover.
The outcomes of the study's analysis highlighted certain benefits of open innovation for product development. The paper concludes with recommendations for future research to look into opportunities for open innovation approaches to be adopted by tertiary institutions as a novel way to commercialise research intellectual property.
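The influence-based revenue split the abstract describes can be sketched as a simple pro-rata allocation; the mapping from influence to turnover shares, and all names and numbers, are our assumptions rather than Quirky's actual rules.

```python
# Hypothetical pro-rata split of a royalty pool by influence points.
# The allocation rule is an assumption, not Quirky's documented scheme.
def royalty_shares(influence, royalty_pool):
    """Divide royalty_pool among members in proportion to influence points."""
    total = sum(influence.values())
    return {member: royalty_pool * pts / total for member, pts in influence.items()}

influence = {"ideator": 60, "designer": 25, "market_researcher": 15}
print(royalty_shares(influence, royalty_pool=1000.0))
```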

Keywords: business process, crowdsourced innovation, open innovation, Quirky

Procedia PDF Downloads 105
10 Ionic Liquids as Substrates for Metal-Organic Framework Synthesis

Authors: Julian Mehler, Marcus Fischer, Martin Hartmann, Peter S. Schulz

Abstract:

During the last two decades, the synthesis of metal-organic frameworks (MOFs) has gained ever increasing attention. Based on their pore size and shape as well as host-guest interactions, they are of interest for numerous fields related to porous materials, like catalysis and gas separation. Usually, MOF synthesis takes place in an organic solvent between room temperature and approximately 220 °C, with mixtures of polyfunctional organic linker molecules and metal precursors as substrates. Reaction temperatures above the boiling point of the solvent, i.e. solvothermal reactions, are run in autoclaves or sealed glass vessels under autogenous pressures. A relatively new approach for the synthesis of MOFs is the so-called ionothermal synthesis route. It applies an ionic liquid as solvent, which can serve as a structure-directing template and/or a charge-compensating agent in the final coordination polymer structure. Furthermore, this method often allows for less harsh reaction conditions than the solvothermal route. Here a variation of the ionothermal approach is reported, in which the ionic liquid also serves as the organic linker source. By using 1-ethyl-3-methylimidazolium terephthalates ([EMIM][Hbdc] and [EMIM]₂[bdc]), the one-step synthesis of MIL-53(Al)/Boehmite composites with interesting features is possible. The resulting material is already formed at moderate temperatures (90-130 °C) and is stabilized in the usually unfavored ht-phase. Additionally, in contrast to already published procedures for MIL-53(Al) synthesis, no further activation at high temperatures is mandatory. A full characterization of this novel composite material is provided, including XRD, SS-NMR, elemental analysis and SEM, as well as sorption measurements, and its interesting features are compared to MIL-53(Al) samples produced by the classical solvothermal route. Furthermore, the syntheses of the applied ionic liquids and salts are discussed.
The influence of the degree of ionicity of the linker source [EMIM]x[H(2-x)bdc] on the crystal structure and the achievable synthesis temperature is investigated and gives insight into the role of the IL during synthesis. Aside from the synthesis of MIL-53 from EMIM terephthalates, the use of the phosphonium cation in this approach is discussed as well. Additionally, the employment of ILs in the preparation of other MOFs is presented briefly. This includes the ZIF-4 framework from the respective imidazolate ILs and chiral camphorate-based frameworks from their imidazolium precursors.

Keywords: ionic liquids, ionothermal synthesis, material synthesis, MIL-53, MOFs

Procedia PDF Downloads 115
9 Comparing Stability Index MAPping (SINMAP) Landslide Susceptibility Models in the Río La Carbonera, Southeast Flank of Pico de Orizaba Volcano, Mexico

Authors: Gabriel Legorreta Paulin, Marcus I. Bursik, Lilia Arana Salinas, Fernando Aceves Quesada

Abstract:

In volcanic environments, landslides and debris flows occur continually along the stream systems of large stratovolcanoes. This is the case on Pico de Orizaba volcano, the highest mountain in Mexico. The volcano has a great potential to impact and damage human settlements and economic activities through landslides. People living along the lower valleys of Pico de Orizaba volcano are under continuous threat from the coalescence of upstream landslide sediments, which increases the destructive power of debris flows. These debris flows not only produce floods, but also cause the loss of lives and property. Despite the importance of assessing such processes, there are few landslide inventory maps and landslide susceptibility assessments. As a result, no comparative assessment of landslide susceptibility models has been conducted in Mexico to evaluate their advantages and disadvantages. In this study, a comprehensive assessment of landslide susceptibility models using GIS technology is carried out on the SE flank of Pico de Orizaba volcano. A detailed multi-temporal landslide inventory map of the watershed is used as the framework for the quantitative comparison of two landslide susceptibility maps. The maps are created based on 1) the Stability Index MAPping (SINMAP) model using default geotechnical parameters and 2) the same model using geotechnical properties of the volcanic soils obtained in the field. SINMAP combines the factor of safety derived from the infinite slope stability model with the theory of a hydrologic model to produce the susceptibility map. It has been claimed that SINMAP analysis is reasonably successful in defining areas that intuitively appear to be susceptible to landsliding in regions with sparse information. The validation of the resulting susceptibility maps is performed by comparing them with the inventory map under the LOGISNET system, which provides tools for comparison using a histogram and a contingency table.
Results of the experiment allow for establishing how the individual models predict landslide locations, along with their advantages and limitations. The results also show that although the model tends to improve with the use of calibrated field data, the landslide susceptibility map does not perfectly represent existing landslides.
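The infinite-slope factor of safety at the core of SINMAP can be sketched in its standard dimensionless form, FS = [C + cos θ (1 − w r) tan φ] / sin θ, where C is the combined cohesion, w the relative wetness, r the water-to-soil density ratio, and φ the friction angle; the parameter values below are illustrative, not the study's calibrated values.

```python
# Dimensionless infinite-slope factor of safety, as used in SINMAP.
# Parameter values below are illustrative, not calibrated field data.
import math

def factor_of_safety(slope_deg, C, phi_deg, w, r=0.5):
    """FS = [C + cos(theta) * (1 - w*r) * tan(phi)] / sin(theta).
    C: combined cohesion (dimensionless), w: relative wetness in [0, 1],
    r: water-to-soil density ratio."""
    theta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    return (C + math.cos(theta) * (1.0 - w * r) * math.tan(phi)) / math.sin(theta)

# A wetter hillslope (w -> 1) lowers the factor of safety toward instability.
print(factor_of_safety(35, C=0.2, phi_deg=30, w=0.2) >
      factor_of_safety(35, C=0.2, phi_deg=30, w=1.0))  # True
```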

Keywords: GIS, landslide, modeling, LOGISNET, SINMAP

Procedia PDF Downloads 238