Search results for: computer and information
11189 Use of Information Technology in the Government of a State
Authors: Pavel E. Golosov, Vladimir I. Gorelov, Oksana L. Karelova
Abstract:
Visible changes in world organization, the environment, and the health of national consciousness create a background for discussion on a possible redefinition of global, state, and regional management goals. The authors apply sustainable development criteria to a hierarchical management scheme intended to lead the world community to non-contradictory growth. Concrete definitions are discussed with respect to the decision-making processes that most represent the state. With the help of system analysis, it is highlighted who is likely to carry the distinctive sign of world leadership in the near future.
Keywords: decision-making, information technology, public administration
Procedia PDF Downloads 513
11188 Using Visualization Techniques to Support Common Clinical Tasks in Clinical Documentation
Authors: Jonah Kenei, Elisha Opiyo
Abstract:
Electronic health records (EHRs), as repositories of patient information, are nowadays the most commonly used technology to record, store, and review patient clinical records and to perform other clinical tasks. However, the accurate identification and retrieval of relevant information from clinical records is difficult because of the unstructured, narrative nature of clinical documents. Medical practice therefore faces a challenge from the rapid growth of health information in EHRs, mostly in narrative text form, and it is becoming important to manage the growing amount of data for a single patient effectively. There is currently a need to visualize EHRs in a way that aids physicians in clinical tasks and medical decision-making. Applying text visualization techniques to unstructured clinical narrative texts is a new area of research that aims to provide better information extraction and retrieval to support clinical decision support in scenarios where the volume of generated data continues to grow. Clinical datasets in EHRs offer great potential for training accurate statistical models to classify facets of information, which can then be used to improve patient care and outcomes. However, in many clinical note datasets, the unstructured nature of the clinical texts is a common problem. This paper examines the issue of taking raw clinical texts and mapping them into meaningful structures that can support healthcare professionals who use narrative texts. Our work is the result of a collaborative design process aided by empirical data collected through formal usability testing.
Keywords: classification, electronic health records, narrative texts, visualization
Procedia PDF Downloads 118
11187 Web Map Service for Fragmentary Rockfall Inventory
Authors: M. Amparo Nunez-Andres, Nieves Lantada
Abstract:
Rockfalls are among the most harmful geological risks. They cause both economic losses, through damage to buildings and infrastructure, and personal ones. Therefore, in order to estimate the risk to exposed elements, it is necessary to know the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several inventories of rockfalls were carried out along the northeast of the Spanish mainland and the island of Mallorca. These inventories contain general information about the events, but their key feature is detailed information about fragmentation. Specifically, the IBSD (In-situ Block Size Distribution) is obtained by photogrammetry from drone or TLS (Terrestrial Laser Scanner) surveys, and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit, measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and following interoperability standards is necessary. Open-source software has been used throughout: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information, using the INSPIRE data specifications for natural risks extended with specific, detailed data about the fragmentation distribution. The next step was to develop a WMS with GeoServer. A preliminary phase was the creation of several views in PostGIS to present the information at different visualization scales and with different degrees of detail. In the first view, the sites are identified with a point, and basic information about the rockfall event is provided. At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, and the source of the event and the fragments are represented by symbols; queries at this level offer greater detail about the movement. Finally, the third level shows all elements, deposit, source, and blocks, at their real size where possible and in their real location. The last task was the publication of all the information on a web mapping site (www.rockdb.upc.edu) with data classified by level, using JavaScript libraries such as OpenLayers.
Keywords: geological risk, web mapping, WMS, rockfalls
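As an illustration of how such a service is typically consumed, the following sketch requests a rendered overview map from a WMS endpoint using the standard OGC GetMap operation. The endpoint URL, layer name, coordinate reference system, and bounding box are assumptions made for illustration; they are not the actual layer definitions published on the project site.

```python
import requests

# Hypothetical GeoServer endpoint and layer name; the real service is published
# at www.rockdb.upc.edu, but its exact layer names are not given in the abstract.
WMS_URL = "https://example.org/geoserver/rockfalls/wms"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "rockfalls:events_overview",    # view with one point per rockfall site
    "STYLES": "",
    "CRS": "EPSG:25831",                      # assumed projected CRS for NE Spain
    "BBOX": "400000,4580000,420000,4600000",  # illustrative extent in metres
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("rockfall_overview.png", "wb") as f:
    f.write(response.content)   # rendered map image for the overview zoom level
```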
Procedia PDF Downloads 160
11186 Using Electrical Impedance Tomography to Control a Robot
Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi
Abstract:
Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an EIT device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes hardware and software sections to perform medical imaging and control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to reach the exact location of the target object. The data set used to form the impedance image is obtained through repeated current injections and voltage measurements across all electrode pairs. After the necessary calculations to obtain the impedance are performed, the information is transmitted to the computer. The data are then processed in MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated with the Image Processing Toolbox (IPT) of MATLAB. Finally, these coordinates are used to calculate the angles of each joint of the robotic arm. The robotic arm moves to the desired tissue on the user's command.
Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography
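A minimal sketch of the last two steps, written in Python rather than MATLAB: locating the target's centre in a reconstructed image and converting that position into joint angles. The threshold, link lengths, and pixel-to-metre scaling are illustrative assumptions, and the 3-DOF arm is reduced to a planar two-link inverse-kinematics problem for brevity.

```python
import numpy as np

def target_centroid(conductivity_img, threshold=0.5):
    """Centre of the target object in a reconstructed EIT image (2-D array).
    A simple threshold stands in for the MATLAB/IPT segmentation in the paper."""
    mask = conductivity_img > threshold
    rows, cols = np.nonzero(mask)
    return cols.mean(), rows.mean()            # (x, y) in pixel coordinates

def planar_two_link_ik(x, y, l1=0.20, l2=0.15):
    """Joint angles (radians) of a simplified planar 2-link arm reaching (x, y).
    Link lengths are illustrative assumptions, not values from the paper."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)                # guard against numerical overshoot
    q2 = np.arccos(c2)                         # elbow-down solution
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

img = np.random.rand(64, 64)                   # placeholder for a reconstructed image
x_px, y_px = target_centroid(img)
q1, q2 = planar_two_link_ik(x_px / 320, y_px / 320)   # crude pixel-to-metre scaling
print(f"joint angles: {np.degrees(q1):.1f} deg, {np.degrees(q2):.1f} deg")
```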
Procedia PDF Downloads 272
11185 Health Portals for Specific Populations: A Design for Pregnant Women
Authors: Janine Sommer, Mariana Daus, Mariana Simon, Maria Smith, Daniel Luna
Abstract:
Advances in technology and communication have contributed to the development of new tools that allow patients to take an active role in their own health. In light of information needs and changing paradigms about health, patients increasingly self-manage their care. This line of care focuses on the patient, and specific portals emerge for people with particular requirements, such as pregnant women. With a portal design for this sector of the population in mind, a survey of users was conducted in September 2016 with the objective of knowing and understanding their information needs when using an application for pregnancy. Prototypes of the portal's features were also designed and validated with users, following a human-centered design methodology. These investigations made it possible to identify the needs of this population and to develop a tool that tries to satisfy them, providing timely information for each stage of pregnancy and allowing patients to record physical checks, follow up on the pregnancy, and seek advice from our obstetricians.
Keywords: electronic health record, personal health record, mobile applications, pregnant women
Procedia PDF Downloads 351
11184 Simulation of Optimum Sculling Angle for Adaptive Rowing
Authors: Pornthep Rachnavy
Abstract:
The purpose of this paper is twofold. First, we believe that there is a significant relationship between sculling angle and sculling style in adaptive rowing. Second, we introduce a methodology, namely simulation, to identify the effectiveness of adaptive rowing. For our study, we simulate the arms-only single scull of adaptive rowing. The way to row the 1000 meters fastest was investigated by studying the sculling angle using simulation modeling. A simulation model of the rowing system was developed in the MATLAB software package based on equations of motion that include many variables affecting boat movement, such as oar length, blade velocity, and sculling style. The boat speed, power, and energy consumption of the system were computed, and the model can also predict the force acting on the boat. The optimum sculling angle was found by computer simulation. Inputs to the model are the sculling style of each rower and the sculling angle; the output of the model is the boat velocity over 1000 meters. The present study suggests that an optimum sculling angle exists and depends on the sculling style. The optimum angles for blade entry and release, with respect to the perpendicular through the pin, are -57.00 and 22.00 degrees for the first style, -57.00 and 22.00 degrees for the second style, -51.57 and 28.65 degrees for the third style, and -45.84 and 34.38 degrees for the fourth style. A theoretical simulation of rowing has been developed and presented, and an angle sweep of this kind is sketched after this abstract. The results suggest that it may be advantageous for rowers to select the sculling angles appropriate to their sculling styles, since the optimum angles depend on the style of each rower. The findings of this paper can be summarized in three points: 1. There is an optimum sculling angle in the arms-only single scull of adaptive rowing. 2. The optimum sculling angles depend on the sculling styles. 3. Computer simulation of rowing can identify opportunities for improving rowing performance by utilizing the kinematic description of rowing. The freedom to explore alternatives in speed, thrust, and timing with the computer simulation will provide the coach with a tool for systematic assessment of rowing technique. In addition, the ability to use the computer to examine the very complex movements of rowing will help both the rower and the coach to conceptualize components of movement that may previously have been unclear or even undefined.
Keywords: simulation, sculling, adaptive, rowing
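The optimization step can be pictured as a sweep over candidate entry and release angles, with each pair evaluated by the dynamic model. The sketch below shows only that outer loop in Python; the boat-speed function is a deliberately crude placeholder standing in for the MATLAB equations of motion, and its numbers are illustrative, not the paper's model.

```python
import numpy as np

def mean_boat_speed(entry_deg, release_deg, style):
    """Placeholder for the equations-of-motion model described in the abstract;
    a real implementation would integrate oar and boat dynamics here."""
    # Purely illustrative surrogate: speed falls off quadratically away from a
    # style-dependent 'ideal' arc. This is an assumption, not the paper's model.
    ideal_entry, ideal_release = {"style1": (-57.00, 22.00),
                                  "style3": (-51.57, 28.65)}[style]
    penalty = (entry_deg - ideal_entry) ** 2 + (release_deg - ideal_release) ** 2
    return 4.5 - 1e-3 * penalty                       # m/s, illustrative only

best = None
for entry in np.arange(-70.0, -30.0, 0.5):            # candidate entry angles (deg)
    for release in np.arange(10.0, 45.0, 0.5):        # candidate release angles (deg)
        v = mean_boat_speed(entry, release, "style3")
        if best is None or v > best[0]:
            best = (v, entry, release)

v, entry, release = best
print(f"optimum for style3: entry {entry:.2f} deg, release {release:.2f} deg, "
      f"1000 m time ~ {1000.0 / v:.1f} s")
```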
Procedia PDF Downloads 465
11183 Monitoring Co-Creation: A Survey of Lithuanian Urban Communities
Authors: Aelita Skarzauskiene, Monika Maciuliene
Abstract:
In this paper, we conduct a systematic survey of urban communities in Lithuania to evaluate their potential to co-create collective intelligence, or “civic intelligence”, applying a Digital Co-creation Index methodology that includes different socio-technological indicators. Civic intelligence is a form of collective intelligence that refers to a group’s capacity to perceive societal problems and to address them effectively. The research focuses on evaluating diverse organizational designs that increase efficient collective performance. The project advances the state of the art by evaluating the basic preconditions in urban communities through which collective intelligence is co-created in a systemic manner. The research subjects are “bottom-up”, digitally enabled urban platforms initiated by Lithuanian public organizations, civic movements, or business entities. The web-based monitoring results, obtained by applying a social indices calculation methodology and Pearson correlation analysis, provide information about the potential and limits of the urban communities and about the changes that need to be implemented to overcome those limitations.
Keywords: computer supported collaboration, socio-technological system, collective intelligence, networked society
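The correlation step itself is straightforward; a minimal sketch using SciPy is shown below. The two indicator series and their values are invented for illustration and are not the project's actual indices.

```python
import numpy as np
from scipy import stats

# Illustrative index scores for a handful of monitored urban platforms; the
# indicator names are assumptions, not the project's actual variable set.
participation = np.array([0.42, 0.61, 0.35, 0.78, 0.55, 0.67, 0.29, 0.71])
co_creation   = np.array([0.38, 0.66, 0.30, 0.81, 0.49, 0.72, 0.25, 0.69])

r, p_value = stats.pearsonr(participation, co_creation)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")   # strength of the association
```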
Procedia PDF Downloads 203
11182 Quantification of the Variables of the Information Model for the Use of School Terminology from 1884 to 2014 in Dalmatia
Authors: Vinko Vidučić, Tanja Brešan Ančić, Marijana Tomelić Ćurlin
Abstract:
Prior to quantifying the variables of the information model for using school terminology in Croatia's region of Dalmatia from 1884 to 2014, the most relevant model variables had to be determined: historical circumstances, standard of living, education system, linguistic situation, and media. The research findings show that there was no significant transfer of the 1884 school terms into 1949 usage; likewise, the 1949 school terms were not widely used in 2014. On the other hand, the research revealed that the meaning of school terms changed over the decades. The quantification of the variables will serve as the groundwork for creating an information model for using school terminology in Dalmatia from 1884 to 2014 and for defining direct growth rates in further research.
Keywords: education system, historical circumstances, linguistic situation, media, school terminology, standard of living
Procedia PDF Downloads 216
11181 Possibilities of Output Technology the Project ADAPTIV for Use in Infrared Camouflage
Authors: Jiří Barta, Teodor Baláž, Tomáš Ludík, Jiří. F. Urbánek
Abstract:
This article deals with the outputs of the Czech defence research project with the acronym ADAPTIV, which addressed adaptive camouflage. Camouflage is concealment by means of disguise. The perceptive interface between the recipient and the camouflaged object is visualized by means of textile modular screens. The screens' special light semi-permeability enables front/back projection with nearly identical light parameters. Information permeability, aimed at creating the illusion, must be controlled by the camouflage provider by means of sophisticated, well-mastered illusions with carefully prepared scenarios. Because the ADAPTIV project was funded primarily under the principle of the maximum possible use of COTS (Commercial-Off-The-Shelf) components, it requires a special definition of feasibility conditions, especially the recipient's position in space. This paper deals with the use of the ADAPTIV output named DATAsam, modified for infrared camouflage. It focuses on active camouflage in the infrared emissivity spectrum at <8;14> μm under laboratory conditions. The main chapter presents basic experiments and tests of the physical properties needed for camouflage in the infrared environment. The evaluation experiments revealed possible use cases in various types of camouflage.
Keywords: camouflage, ADAPTIV, infrared camouflage, computer-aided, COTS
Procedia PDF Downloads 417
11180 Automatic API Regression Analyzer and Executor
Authors: Praveena Sridhar, Nihar Devathi, Parikshit Chakraborty
Abstract:
As a software product changes versions across releases, its APIs and features change and upgrades become necessary. Hence, it becomes imperative to assess the impact of upgrading the dependent components. This tool finds the API changes between two versions and their impact on other APIs, and then executes the automated regression suites relevant to the updates and their impacted areas. The tool has a four-layer architecture; each layer has its own pre-assigned capability and sends the required information to the next layer. The four layers are: 1) Comparator: compares the two versions of the API. 2) Analyzer: analyses the API documentation and reports the modified classes and their dependencies, along with details of implemented interfaces. 3) Impact Filter: finds the impact of the modified classes on the other API methods. 4) Auto Executor: based on the output of the Impact Filter, runs the API regression suite. The tool reads the Javadoc and extracts the required information about classes, interfaces, and enumerations. The extracted information is saved in a data structure that records the class details and dependencies, together with the interfaces and enumerations listed in the Javadoc.
Keywords: automation impact regression, java doc, executor, analyzer, layers
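One way to picture the four layers is the following Python sketch, in which each layer is a small function operating on an in-memory model of the extracted Javadoc information. The class names, fields, and the simplified "changed method set" comparison are illustrative assumptions; the actual tool works on full Javadoc output and launches real regression suites.

```python
from dataclasses import dataclass, field

@dataclass
class ApiClass:
    name: str
    methods: set[str]
    dependencies: set[str] = field(default_factory=set)   # classes it uses

def comparator(old: dict[str, ApiClass], new: dict[str, ApiClass]) -> set[str]:
    """Layer 1: names of classes whose public methods changed between versions."""
    return {n for n, c in new.items() if n not in old or old[n].methods != c.methods}

def analyzer(changed: set[str], api: dict[str, ApiClass]) -> dict[str, set[str]]:
    """Layer 2: each changed class together with its declared dependencies."""
    return {name: api[name].dependencies for name in changed}

def impact_filter(changed: set[str], api: dict[str, ApiClass]) -> set[str]:
    """Layer 3: classes that depend on a changed class and are therefore impacted."""
    return {n for n, c in api.items() if c.dependencies & changed}

def auto_executor(impacted: set[str]) -> None:
    """Layer 4: stand-in for launching the regression suites of impacted areas."""
    for name in sorted(impacted):
        print(f"running regression suite for {name}")

old_api = {"Billing": ApiClass("Billing", {"charge", "refund"}),
           "Report":  ApiClass("Report", {"render"}, {"Billing"})}
new_api = {"Billing": ApiClass("Billing", {"charge", "refund", "void"}),
           "Report":  ApiClass("Report", {"render"}, {"Billing"})}

changed = comparator(old_api, new_api)          # {'Billing'}
auto_executor(impact_filter(changed, new_api))  # runs the suite for 'Report'
```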
Procedia PDF Downloads 488
11179 Digital Musical Organology: The Audio Games: The Question of “A-Musicological” Interfaces
Authors: Hervé Zénouda
Abstract:
This article seeks to shed light on an emerging creative field, "audio games", at the crossroads between video games and computer music. Many applications that offer entertaining audio-visual experiences with the objective of musical creation are available today for different platforms (game consoles, computers, cell phones). The originality of this field is the use of video-game gameplay for music composition. Thus, composing music using interfaces, and also cognitive logics, that we qualify as "a-musicological" seems to us particularly interesting from the perspective of digital musical organology. This field raises questions about the representation of sound and musical structures and develops new instrumental gestures and strategies of musical composition. In this article, we try to define the characteristics of the field by highlighting some historical milestones (abstract cinema, game theory in music, action and graphic scores) as well as the novelties brought by digital technologies.
Keywords: audio-games, video games, computer generated music, gameplay, interactivity, synesthesia, sound interfaces, relationships image/sound, audiovisual music
Procedia PDF Downloads 112
11178 “Moves” for Guiding Presentations in French
Authors: Nuchanat Handumrongkul, Suwaree Yordchim, Anantachai Aeka
Abstract:
Despite four years of study in the tourism field, Bachelor's graduates cannot perform their jobs as experienced tour guides. This research aimed to develop the teaching and study of French for tourism, with two main purposes: to analyze the 'moves' used in oral presentations at tourist attractions, and to study the content of guiding presentations, or 'guide speak'. The study employed audio recordings of these presentations, gathered through interviews in authentic situations, with four tour guides as respondents and information providers. The data were analyzed via move and content analysis. The results showed that eight moves were used, namely: welcoming, introducing oneself, drawing someone's attention, giving information, explaining, highlighting, persuading, and saying goodbye. In terms of content, the information presented covered the outstanding characteristics of the places and was well integrated with other related content. The findings were used as guidelines for curriculum development; in particular, the core content and the form of presentation provide the basis for students to meet the standard requirements of the labour market and professional schemes.
Keywords: moves, guiding presentation, French, tourism
Procedia PDF Downloads 232
11177 Examining Relationship between Programming Performance, Programming Self Efficacy and Math Success
Authors: Mustafa Ekici, Sacide Güzin Mazman
Abstract:
Programming is one of the abilities in the computer science field that is generally perceived as difficult by students, and various individual differences have been implicated in success at it. Although several factors that affect programming ability have been identified over the years, there is still no full understanding of why some students learn to program easily and quickly while others find it complex and difficult. Programming self-efficacy and mathematics success are two of those essential individual differences considered to have an important effect on programming success. This study aimed to identify the relationship between programming performance, programming self-efficacy, and mathematics success. The study group consisted of 96 undergraduates from the Department of Econometrics of Uşak University; 38 (39.58%) of the participants are female, while 58 (60.41%) are male. The study was conducted in the Programming I course during the 2014-2015 fall term. The data collection tools comprised programming course final grades, a programming self-efficacy scale, and a mathematics achievement test. Data were analyzed through correlation analysis. The results will be reported in the full text of the study.
Keywords: programming performance, self efficacy, mathematic success, computer science
Procedia PDF Downloads 502
11176 Improved Safety Science: Utilizing a Design Hierarchy
Authors: Ulrica Pettersson
Abstract:
Collection of information on incidents is regularly done through pre-printed incident report forms. These tend to be incomplete and frequently lack essential information. One consequence is that reports with inadequate information, which do not fulfil analysts' requirements, are transferred into the analysis process. To improve an incident reporting form, theory from design science, witness psychology, and interview and questionnaire research has been used. Three experiments have previously been conducted to evaluate the form and have shown significantly improved results. The form has proved to capture knowledge regardless of the incidents' character or context. The aim of this paper is to describe how design science, and in more detail a design hierarchy, can be used to construct a collection form for improvements in safety science.
Keywords: data collection, design science, incident reports, safety science
Procedia PDF Downloads 223
11175 DNA PLA: A Nano-Biotechnological Programmable Device
Authors: Hafiz Md. HasanBabu, Khandaker Mohammad Mohi Uddin, Md. IstiakJaman Ami, Rahat Hossain Faisal
Abstract:
Computing in biomolecular programming is performed through different types of reactions. Proteins and nucleic acids are used to store the information generated by biomolecular programming. DNA (Deoxyribonucleic Acid) can be used to build a molecular computing system and operating system because of its predictable molecular behavior. DNA devices have clear advantages over conventional devices when applied to problems that can be divided into separate, non-sequential tasks, because DNA strands can hold so much data in memory and conduct multiple operations at once, thus solving decomposable problems much faster. A Programmable Logic Array, abbreviated PLA, is a programmable device with programmable AND operations followed by programmable OR operations. In this paper, a DNA PLA is designed through different molecular operations on DNA molecules using the proposed algorithms. The molecular PLA takes advantage of DNA's physical properties to store information and perform calculations, which include extremely dense information storage, enormous parallelism, and extraordinary energy efficiency.
Keywords: biological systems, DNA computing, parallel computing, programmable logic array, PLA, DNA
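For readers unfamiliar with the PLA structure being mimicked, the sketch below evaluates a conventional two-level PLA in Python: an AND plane of product terms over the inputs and their complements, and an OR plane that sums selected products. It models only the Boolean behaviour that the proposed DNA operations would implement, not the molecular reactions themselves, and the example terms are illustrative.

```python
def pla_eval(inputs, and_plane, or_plane):
    """Evaluate a two-level PLA: the AND plane forms product terms over the
    inputs and their complements; the OR plane sums selected product terms."""
    literals = {}
    for name, value in inputs.items():
        literals[name] = value            # true literal, e.g. "a"
        literals["~" + name] = not value  # complemented literal, e.g. "~a"

    products = [all(literals[lit] for lit in term) for term in and_plane]
    return {out: any(products[i] for i in idx) for out, idx in or_plane.items()}

# Two outputs over inputs a and b: sum = a'b + ab' (XOR), carry = ab (half adder)
and_plane = [("~a", "b"), ("a", "~b"), ("a", "b")]
or_plane = {"sum": [0, 1], "carry": [2]}

print(pla_eval({"a": True, "b": False}, and_plane, or_plane))
# {'sum': True, 'carry': False}
```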
Procedia PDF Downloads 130
11174 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium that are visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consists of three Python scripts that can all be easily accessed through a Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared the automated pipeline outputs with manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that the automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics that accurately reflects changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using grayscale image conversion and binary thresholding to allow computer vision to better distinguish between cells and non-cells. Its results were comparable to manually analyzed results but with significantly reduced acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline's cell body and contour detection to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performs automated computer vision-based analysis of calcium imaging recordings of neuronal cell bodies in neuronal cell cultures. Our next goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
Keywords: calcium imaging, computer vision, neural activity, neural networks
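The core OpenCV steps described above, grayscale thresholding, contour detection, and mean-fluorescence extraction, can be sketched in a few lines of Python. The threshold value, minimum contour area, and file name below are illustrative assumptions rather than the parameters used in the study.

```python
import cv2
import numpy as np

def mean_fluorescence_per_cell(frame_gray, threshold=60, min_area=20):
    """Detect putative cell-body contours in one grayscale frame and return the
    mean fluorescence inside each contour (OpenCV 4.x return signatures)."""
    _, binary = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    traces = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:       # discard noise specks
            continue
        mask = np.zeros_like(frame_gray)
        cv2.drawContours(mask, [contour], -1, 255, thickness=cv2.FILLED)
        traces.append(cv2.mean(frame_gray, mask=mask)[0])
    return traces

# Applying the function to every frame of a recording yields one fluorescence
# time series per detected cell body.
frame = cv2.imread("calcium_frame.png", cv2.IMREAD_GRAYSCALE)
if frame is not None:
    print(mean_fluorescence_per_cell(frame))
```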
Procedia PDF Downloads 82
11173 An Application of Geographic Information System to Select Areas for Sanitary Landfill in Bang Nok-Khwaek Municipality
Authors: Musthaya Patchanee
Abstract:
The study of sanitary landfill siting in Bang Nok-khwaek municipality consists of two procedures. The first is to survey and create a spatial database of physical, environmental, economic, and social factors, following the geographic information system (GIS) method. The second is to analyze the areas suitable for a sanitary landfill in Bang Nok-khwaek municipality by using overlay techniques to calculate the weighted linear total in the ArcGIS program. The study found 2.49 sq. km of suitable area for the sanitary landfill in Bang Nok-khwaek municipality, which is 66.76% of the whole area. The area of highest suitability is 0.02 sq. km (0.54%), high suitability 0.3 sq. km (8.04%), moderate suitability 1.62 sq. km (43.43%), and low suitability 0.55 sq. km (14.75%). These results will be used as a guideline for selecting the sanitary landfill area in accordance with sanitation standards for the Subdistrict Administrative Organizations and Subdistrict Municipalities in Samut Songkhram Province.
Keywords: Geographic Information System (GIS), sanitary landfill, Bang Nok-Khwaek municipality, Subdistrict Administrative Organization
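The weighted linear overlay at the heart of the second procedure can be expressed compactly with NumPy arrays standing in for the reclassified factor rasters. The factor names, weights, class breaks, and cell size below are illustrative assumptions; the study's actual weights and criteria are not reproduced here.

```python
import numpy as np

# Reclassified factor rasters on a common grid, each scored 1 (low suitability)
# to 4 (high). Factor names and weights are illustrative assumptions only.
factors = {
    "soil":        np.random.randint(1, 5, (200, 200)),
    "land_use":    np.random.randint(1, 5, (200, 200)),
    "road_access": np.random.randint(1, 5, (200, 200)),
    "settlement":  np.random.randint(1, 5, (200, 200)),
}
weights = {"soil": 0.35, "land_use": 0.30, "road_access": 0.20, "settlement": 0.15}

# Weighted linear combination, as performed by the overlay in ArcGIS.
suitability = sum(weights[name] * raster for name, raster in factors.items())

# Classify into four suitability levels by equal-interval slicing of the score.
levels = np.digitize(suitability, bins=[1.75, 2.5, 3.25])   # 0..3 = low..highest
area_per_cell_km2 = 0.0001                                   # assumed 10 m cells
for level, label in enumerate(["low", "moderate", "high", "highest"]):
    cells = np.count_nonzero(levels == level)
    print(f"{label:8s}: {cells * area_per_cell_km2:.2f} sq. km")
```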
Procedia PDF Downloads 393
11172 Blind Channel Estimation for Frequency Hopping System Using Subspace Based Method
Authors: M. M. Qasaymeh, M. A. Khodeir
Abstract:
Subspace channel estimation methods have been studied widely. They depend on a subspace decomposition of the covariance matrix to separate the signal subspace from the noise subspace. The decomposition is normally done by either Eigenvalue Decomposition (EVD) or Singular Value Decomposition (SVD) of the Auto-Correlation Matrix (ACM). However, the subspace decomposition process is computationally expensive. In this paper, the multipath channel estimation problem for a Slow Frequency Hopping (SFH) system is considered using a noise-subspace-based method. An efficient method to estimate the multipath time delays is proposed by applying the MUltiple SIgnal Classification (MUSIC) algorithm to the null space extracted by the Rank-Revealing LU (RRLU) factorization. The RRLU provides accurate information about the rank and the numerical null space, which makes it a valuable tool in numerical linear algebra. The proposed method decreases the computational complexity approximately by half compared with RRQR-based methods while keeping the same performance. Computer simulations are also included to demonstrate the effectiveness of the proposed scheme.
Keywords: frequency hopping, channel model, time delay estimation, RRLU, RRQR, MUSIC, LS-ESPRIT
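A minimal numerical sketch of noise-subspace delay estimation is given below. It uses a plain eigendecomposition of the sample auto-correlation matrix rather than the RRLU factorization proposed in the paper, and the hop frequencies, delays, gains, and noise level are all illustrative values, not the paper's simulation setup.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
freqs = np.arange(16) * 1e6                   # 16 hop frequencies, 1 MHz apart (assumed)
true_delays = np.array([0.25e-6, 0.70e-6])    # two multipath delays in seconds
gains = np.array([1.0, 0.7])

# Synthetic frequency-domain snapshots: x_k = sum_p g_p * exp(-j*2*pi*f_k*tau_p) + noise
snapshots = []
for _ in range(200):
    phases = gains * np.exp(2j * np.pi * rng.uniform(size=2))   # random per-path phases
    x = sum(g * np.exp(-2j * np.pi * freqs * tau)
            for g, tau in zip(phases, true_delays))
    snapshots.append(x + 0.05 * (rng.standard_normal(16) + 1j * rng.standard_normal(16)))
X = np.array(snapshots).T                     # shape (16 frequencies, 200 snapshots)

R = X @ X.conj().T / X.shape[1]               # sample auto-correlation matrix
_, eigvecs = np.linalg.eigh(R)                # EVD for simplicity; the paper uses RRLU
En = eigvecs[:, :-2]                          # noise subspace (2 paths assumed known)

taus = np.linspace(0.0, 1e-6, 1000, endpoint=False)
spectrum = np.array([1.0 / np.linalg.norm(En.conj().T
                     @ np.exp(-2j * np.pi * freqs * tau)) ** 2 for tau in taus])

peaks, _ = find_peaks(spectrum)
best = np.sort(taus[peaks[np.argsort(spectrum[peaks])[-2:]]])
print("estimated delays (us):", best * 1e6)   # should be close to 0.25 and 0.70
```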
Procedia PDF Downloads 410
11171 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection
Authors: S. Delgado, C. Cerrada, R. S. Gómez
Abstract:
This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges of voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times: these repeated voxels incur costly memory operations that carry no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, the method's computational efficiency is complemented by its simplicity and portability: written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate the method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods; the Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, the algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
Keywords: voxelization, GPU acceleration, computer graphics, compute shaders
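To make the scan-line idea concrete, the following CPU sketch voxelizes the surface of a single triangle by sampling points along lines interpolated between two of its edges, recording each touched voxel once in a set. It is a simplified, single-threaded illustration of the traversal only: the lines are equidistant in the edge parameter rather than strictly in space, oversampling stands in for the paper's Gap Detection, and nothing here corresponds to the GLSL compute shader itself.

```python
import numpy as np

def voxelize_triangle(v0, v1, v2, voxel_size=0.1):
    """Surface voxelization of one triangle with parallel scan-lines between
    edge (v0->v1) and edge (v0->v2); a CPU sketch of the traversal idea only."""
    v0, v1, v2 = map(np.asarray, (v0, v1, v2))
    voxels = set()                                    # each voxel recorded once
    n_lines = int(max(np.linalg.norm(v1 - v0), np.linalg.norm(v2 - v0))
                  / voxel_size) * 2 + 1               # oversample to reduce gaps
    for s in np.linspace(0.0, 1.0, n_lines):
        a = v0 + s * (v1 - v0)                        # scan-line start on edge 1
        b = v0 + s * (v2 - v0)                        # scan-line end on edge 2
        n_steps = int(np.linalg.norm(b - a) / voxel_size) * 2 + 1
        for t in np.linspace(0.0, 1.0, n_steps):
            p = a + t * (b - a)
            voxels.add(tuple(np.floor(p / voxel_size).astype(int)))
    return voxels

tri = ([0, 0, 0], [1, 0, 0.2], [0, 1, 0.5])
print(len(voxelize_triangle(*tri)), "voxels touched")
```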
Procedia PDF Downloads 73
11170 Designing a Corpus Database to Enhance the Learning of Old English Language
Authors: Raquel Mateo Mendaza, Carmen Novo Urraca
Abstract:
The current paper presents the elaboration of a corpus database that aligns two different corpora in order to simplify the search for information both for researchers and for students of Old English. This database comprises the information contained in two main reference corpora, namely the Dictionary of Old English Corpus (DOEC), compiled at the University of Toronto, and the York-Toronto-Helsinki Parsed Corpus of Old English (YCOE). The former provides information on all surviving texts written in the Old English language; the latter offers syntactic and morphological annotation of several texts included in the DOEC. Although both corpora are closely related, as the YCOE includes the DOE source text identifier, the main problem detected is that there is no alignment of texts that allows whole fragments to be searched and then analysed in terms of morphology and syntax. The database proposed in this paper gathers all this information and presents it in a simpler, more accessible, visual, and educational way. The alignment of fragments has been done automatically. However, some problems emerged during the creation process, particularly related to the lack of correspondence in the division of fragments. For this reason, it was necessary to revise all the entries manually in order to obtain a reliable, high-quality product and to carefully indicate the gaps encountered in these corpora. All in all, this database contains more than 60,000 entries corresponding to the DOE fragments annotated by the YCOE. The main strength of the resulting product lies in its research and teaching implications for the study of Old English. The use of this database will help researchers and students in the study of different aspects of the language, such as inflectional morphology, the syntactic behaviour of given words, or translation studies, among others. By searching for words or fragments, the annotated information on morphology and syntax is automatically displayed, automating and speeding up the search for data.
Keywords: alignment, corpus database, morphosyntactic analysis, Old English
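Because the YCOE records the DOE source text identifier, the automatic alignment can be pictured as a join on that shared key, with unmatched tokens flagged as gaps for manual revision. The sketch below illustrates the idea in Python; the field names, identifiers, and example entries are invented for illustration and do not reflect the actual database schema.

```python
# Minimal sketch of aligning the two corpora through a shared DOE source text
# identifier; all names and values here are hypothetical.
doec_fragments = {
    ("coaelive", 17): "Her on thisum geare ...",   # (DOE short title, fragment no.)
    ("coaelive", 18): "And tha se halga wer ...",
}
ycoe_tokens = [
    {"doe_id": ("coaelive", 17), "form": "Her",   "pos": "ADV", "case": None},
    {"doe_id": ("coaelive", 17), "form": "geare", "pos": "N",   "case": "dat"},
    {"doe_id": ("coaelive", 19), "form": "swa",   "pos": "ADV", "case": None},  # no DOEC match
]

aligned, gaps = {}, []
for token in ycoe_tokens:
    fragment = doec_fragments.get(token["doe_id"])
    if fragment is None:
        gaps.append(token)        # mismatched fragment division: flag for manual revision
    else:
        entry = aligned.setdefault(token["doe_id"], {"text": fragment, "annotation": []})
        entry["annotation"].append(token)

print(len(aligned), "aligned fragments,", len(gaps), "gaps to revise manually")
```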
Procedia PDF Downloads 134
11169 Computer Simulation Studies of Spinel LiMn₂O₄ Nanotubes
Authors: D. M. Tshwane, R. R. Maphanga, P. E. Ngoepe
Abstract:
Nanostructured materials are attractive candidates for efficient electrochemical energy storage devices because of their unique physicochemical properties. Nanotubes have drawn continuous attention because their unique electrical, optical, and magnetic properties contrast with those of the bulk system, and they have potential applications in the fields of optics, electronics, and energy storage devices. Spinel LiMn2O4 is the most promising cathode material for Li-ion batteries, and introducing nanotube structures as electrode materials represents one of the most attractive strategies that could dramatically enhance battery performance. In this work, computer simulation methods are used to generate and investigate the properties of spinel LiMn2O4 nanotubes. Molecular dynamics simulation is used to probe the local structure of LiMn2O4 nanotubes and the effect of temperature on these systems. It is found that the diameter, Miller indices, and size have a direct influence on nanotube morphology. Furthermore, it is noted that stability depends on the surface and the wrapping of the nanotube. The nanotube structures are described using the radial distribution function and XRD patterns, and there is a correlation between the calculated XRD patterns and experimentally reported results.
Keywords: LiMn2O4, li-ion batteries, nanotubes, nanostructures
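As a reminder of how such structural descriptors are computed from simulation output, the sketch below evaluates a radial distribution function g(r) from a set of atomic coordinates with NumPy. It ignores periodic boundary conditions for brevity, and the box size, bin width, and random coordinates are placeholders rather than data from the LiMn2O4 simulations.

```python
import numpy as np

def radial_distribution(positions, box_length, dr=0.1, r_max=None):
    """g(r) of atomic positions in a cubic box, without periodic boundaries."""
    n = len(positions)
    if r_max is None:
        r_max = box_length / 2
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)[np.triu_indices(n, k=1)]  # unique pairs
    edges = np.arange(0.0, r_max + dr, dr)
    hist, edges = np.histogram(dists, bins=edges)
    shell_volumes = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box_length ** 3
    # factor 2: each unordered pair contributes to the environment of both atoms
    g = 2.0 * hist / (n * density * shell_volumes)
    r = 0.5 * (edges[1:] + edges[:-1])
    return r, g

positions = np.random.uniform(0.0, 20.0, size=(500, 3))   # placeholder coordinates
r, g = radial_distribution(positions, box_length=20.0)
print(g[:5])    # roughly 1 at small r for an ideal-gas-like input
```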
Procedia PDF Downloads 189
11168 Investigating Relationship between Use of Mobile Technologies and Employees’ Creativity
Authors: Leila Niroomand, Reza Rafigh
Abstract:
Nowadays, the world is undergoing a dramatic change from an industry-centered society to an information-centered one. In other words, we are experiencing a transition from the real, physical world into a virtual one. Stepping into the information age and living effectively within an information-centered society demand becoming acquainted with the characteristics peculiar to such a society. Recently, new technologies such as telecommunication and mobile technologies have changed rapidly, the accumulation of achievements and information has become very important, and occupational structures have changed as a result. The intellectual structure of this day and age depends on deep attention to creative and knowledge-based human resource collaboration rather than merely functional human resources. The present study scrutinizes the contribution of different dimensions of mobile technologies (perceived use, perceived enjoyment, continuance intention, confirmation, and satisfaction) to the creativity of personnel. The statistical population comprised the 2,431 employees of an infrastructure communications company, out of which 331 individuals were chosen as a sample based on the Krejcie and Morgan table. This research is descriptive, and a questionnaire distributed among users of the Telegram application was used for data gathering; 228 questionnaires were analyzed. Using SPSS software, Pearson correlation coefficients were computed, and it was found that all dimensions of mobile technologies except satisfaction correlate with the creativity of employees.
Keywords: mobile technologies, continuance intention, perceived enjoyment, confirmation, satisfaction, creativity, perceived use
Procedia PDF Downloads 203
11167 Object Oriented Software Engineering Approach to Industrial Information System Design and Implementation
Authors: Issa Hussein Manita
Abstract:
This paper presents an example of industrial information system design and implementation (IIDC), covering the most common software engineering design steps as they are applied to the different design stages. We go through the life cycle of software system development, starting with a study of system requirements and ending with testing and delivery of the system, passing through system design and coding, program integration, and system integration. The most modern software design tools available are used, including, but not limited to, the Unified Modeling Language (UML), system modeling, SQL server-side applications, and use case analysis, design, and testing as applied to information processing systems. The system is designed to perform the tasks specified by the client with real data. At the end of the implementation, a default or user-defined acceptance policy is used to provide an overall score as an indication of system performance. To test the reliability of the designed system, it is tested in different environments and under different workloads, such as a multi-user environment.
Keywords: software engineering, design, system requirement, integration, unified modeling language
Procedia PDF Downloads 570
11166 The Retrospective Investigation of the Impacts of Alien Taxa on Human Health: A Case Study of Two Poison Information Centers
Authors: Moleseng Claude Moshobane
Abstract:
Alien species cause considerable negative impacts on biodiversity, the economy, and public health. The impacts of alien species on public health have received a degree of attention worldwide, largely in developed countries, but remain scarcely studied in developing countries. Here, we provide a review of human exposure and poisoning cases from native and alien plant species reported to poison information centres. A retrospective review of the Tygerberg Poison Information Centre (TPIC) and the Poisons Information Centre (PIC) at Red Cross War Memorial Children's Hospital (RCWMCH) was conducted over an approximately two-year period (1 June 2015 to 6 March 2017). Combined, the TPIC and PIC handled 626 cases during this period. Toxicity cases were most abundant in Gauteng (47.1%), followed by the Western Cape (29.4%). The primary mechanism of injury was ingestion (96.7%), and cases were predominantly accidental. Most reported cases involved infants (20.6%), with few cases involving adults (5.8%). Adults presented minor to moderate toxicity, while infants presented none to minor toxicity. We conclude that reported toxicity cases are biased towards a few alien species and that several cases relate to unknown species of mushrooms. Public awareness is essential to reducing poisoning incidents.
Keywords: alien species, poisoning, invasive species, public health
Procedia PDF Downloads 185
11165 Design of a Photovoltaic Power Generation System Based on Artificial Intelligence and Internet of Things
Authors: Wei Hu, Wenguang Chen, Chong Dong
Abstract:
In order to improve the efficiency and safety of photovoltaic power generation devices, this photovoltaic power generation system combines Artificial Intelligence (AI) and the Internet of Things (IoT) to control sun-chasing photovoltaic generation devices that track the sun, improving power generation efficiency, and then to manage the conversion of the generated energy. The system uses artificial intelligence as the control terminal; the executive end of each power generation device runs the Linux system on an Exynos4412 CPU. The power generating device collects images of the sun through a Sony CCD. After the power generating devices feed the data back to their CPUs for processing, the CPUs send the data to the artificial intelligence control terminal through the Internet. The control terminal integrates the executive terminal information, time information, and environmental information to decide whether to generate electricity normally and then whether to feed the converted electrical energy into the grid or store it in the battery pack. When the power generation environment is abnormal, the control terminal authorizes the protection strategy: the executive terminal of the power generation device stops generating and enters a self-protection posture, and at the same time the control terminal synchronizes the data with the cloud. As a result, the system is more intelligent, more adaptive, and has a longer life.
Keywords: photo-voltaic power generation, the pursuit of light, artificial intelligence, internet of things, photovoltaic array, power management
Procedia PDF Downloads 123
11164 Agent-Based Modeling Investigating Self-Organization in Open, Non-equilibrium Thermodynamic Systems
Authors: Georgi Y. Georgiev, Matthew Brouillet
Abstract:
This research applies the power of agent-based modeling to a pivotal question at the intersection of biology, computer science, physics, and complex systems theory: how self-organization proceeds in open, complex, non-equilibrium thermodynamic systems. Central to this investigation is the principle of Maximum Entropy Production (MEP). This principle suggests that such systems evolve toward states that optimize entropy production, leading to the formation of structured environments. It is hypothesized that, guided by the least action principle, open thermodynamic systems identify and follow the shortest paths to transmit energy and matter, resulting in maximal entropy production, internal structure formation, and a decrease in internal entropy. Concurrently, it is predicted that there will be an increase in system information, as more information is required to describe the developing structure. To test this, an agent-based model is developed simulating an ant colony's formation of a path between a food source and its nest. Utilizing the NetLogo software for modeling and Python for data analysis and visualization, self-organization is quantified by calculating the decrease in system entropy based on the potential states and the distribution of the ants within the simulated environment. External entropy production is also evaluated for information increase and efficiency improvements in the system's action. Simulations demonstrated that the system begins at maximal entropy, which decreases as the ants form paths over time. A range of system behaviors contingent upon the number of ants is observed. Notably, no path formation occurred with fewer than five ants, whereas clear paths were established by 200 ants, and saturation of path formation and entropy state was reached at populations exceeding 1000 ants. This analytical approach identified the inflection point marking the transition from disorder to order and computed the slope at this point. Combined with extrapolation to the final path entropy, these parameters yield important insights into the eventual entropy state of the system and the timeframe for its establishment, enabling the estimation of the self-organization rate. This study provides a novel perspective on the exploration of self-organization in thermodynamic systems, establishing a correlation between the internal entropy decrease rate and the external entropy production rate. Moreover, it presents a flexible framework for assessing the impact of external factors such as changes in world size, path obstacles, and friction. Overall, this research offers a robust, replicable model for studying self-organization processes in any open thermodynamic system. As such, it provides a foundation for further in-depth exploration of the complex behaviors of these systems and contributes to the development of more efficient self-organizing systems across various scientific fields.
Keywords: complexity, self-organization, agent based modelling, efficiency
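The entropy measure used to quantify self-organization can be illustrated with a short Python sketch that bins agent positions into grid patches and computes the Shannon entropy of the resulting distribution. The grid resolution, coordinate ranges, and the two synthetic ant configurations below are illustrative assumptions, not the NetLogo model's actual parameters.

```python
import numpy as np

def distribution_entropy(ant_positions, grid_size=20):
    """Shannon entropy (bits) of the ants' spatial distribution over a square
    grid of patches; a sketch of the internal-entropy measure described above."""
    counts, _, _ = np.histogram2d(ant_positions[:, 0], ant_positions[:, 1],
                                  bins=grid_size, range=[[0, 1], [0, 1]])
    p = counts.flatten() / counts.sum()
    p = p[p > 0]                              # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

# Disordered start: ants scattered uniformly. Ordered end: ants packed on a path.
scattered = np.random.uniform(0, 1, size=(1000, 2))
on_path = np.column_stack([np.random.uniform(0, 1, 1000),
                           np.random.normal(0.5, 0.02, 1000).clip(0, 1)])

print("entropy, scattered:", round(distribution_entropy(scattered), 2))  # near maximal
print("entropy, on a path:", round(distribution_entropy(on_path), 2))    # much lower
```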
Procedia PDF Downloads 68
11163 Management of Interdependence in Manufacturing Networks
Authors: Atour Taghipour
Abstract:
In the real world, each manufacturing company is an independent business unit. These business units are linked to each other through upstream and downstream linkages. The management of these linkages is called coordination, which can be considered a difficult engineering task. The degree of difficulty of coordination depends on the type and nature of the information exchanged between partners as well as on the structure of the relationship, from mutual relationships to network structures. The manufacturing systems literature comprises a wide variety of methods and approaches to coordination. In fact, two main streams of research can be distinguished: centralized coordination versus decentralized coordination. In centralized systems, a high degree of information exchange is required, which sometimes leads to difficulties when independent members do not want to share information. In order to address these difficulties, decentralized approaches to coordinating operations planning decisions, based on minimal information sharing, have been proposed in many academic disciplines. This paper first proposes a framework of analysis for the approaches proposed in the literature; based on this framework, which captures the similarities between approaches, we categorize the existing approaches. This classification can be used as a research map for future research. The result of our paper highlights several opportunities for further work. First, it is proposed to develop more dynamic and stochastic mechanisms for coordinating the planning of manufacturing units. Second, in order to exploit the complementarities of approaches proposed by diverse scientific disciplines, we propose to integrate the coordination techniques. Finally, based on our approach, we propose to develop coordination standards that guarantee both the complementarity of these approaches and the freedom of companies to adopt any planning tools.
Keywords: network coordination, manufacturing, operations planning, supply chain
Procedia PDF Downloads 282
11162 Artificial Intelligence in Management Simulators
Authors: Nuno Biga
Abstract:
Artificial Intelligence (AI) has the potential to transform management in several impactful ways. It allows machines to interpret information, find patterns in big data, learn from context analysis, optimize operations, make predictions sensitive to each specific situation, and support data-driven decision making. The introduction of an 'artificial brain' in an organization also enables learning from the complex information and data provided by those who train it, namely its users. The "Assisted-BIGAMES" version of the Accident & Emergency (A&E) simulator introduces the concept of a context-sensitive "Virtual Assistant" (VA) that provides users with useful suggestions for the next operations, such as: a) relocating workstations in order to shorten travelled distances and minimize the stress of those involved; b) identifying in real time existing bottleneck(s) in the operations system so that it is possible to act upon them quickly; c) identifying resources that should be polyvalent so that the system can be more efficient; d) identifying the specific processes in which it may be advantageous to establish partnerships with other teams; and e) assessing possible solutions based on the suggested KPIs, allowing action monitoring to guide the (re)definition of future strategies. This paper is built on the BIGAMES© simulator and presents the conceptual AI model developed and demonstrated through a pilot project (BIG-AI). Each Virtual Assisted BIGAME is a management simulator developed by the author that guides operational and strategic decision making, providing users with useful information in the form of management recommendations that make it possible to predict the actual outcome of different alternative strategic management actions. The pilot project incorporates results from 12 editions of the BIGAME A&E that took place between 2017 and 2022 at AESE Business School, based on a compilation of data that allows causal relationships to be established between decisions taken and results obtained. The systemic analysis and interpretation of data is powered in the Assisted-BIGAMES by a computer application called the "BIGAMES Virtual Assistant" (VA) that players can use during the game. Throughout the game, each participant constantly asks what decisions should be made to win the competition. To this end, the role of each team's VA consists of guiding the players to be more effective in their decision making by presenting recommendations based on AI methods. It is important to note that the VA's suggestions for action can be accepted or rejected by the managers of each team, as they gain a better understanding of the issues over time, reflect on good practice, and rely on their own experience, capability, and knowledge to support their own decisions. Preliminary results show that the introduction of the VA enables faster learning of the decision-making process. The facilitator, designated the "Serious Game Controller" (SGC), is responsible for supporting the players with further analysis. The actions recommended by the SGC may differ from or be similar to those previously provided by the VA, ensuring a higher degree of robustness in decision-making. Additionally, all the information should be jointly analyzed and assessed by each player, who is expected to add "emotional intelligence", an essential component absent from the machine learning process.
Keywords: artificial intelligence, gamification, key performance indicators, machine learning, management simulators, serious games, virtual assistant
Procedia PDF Downloads 105
11161 Digital Literacy Skills for Geologist in Public Sector
Authors: Angsumalin Puntho
Abstract:
Disruptive technology has had a great influence on our everyday lives and on the existence of organizations. Geologists in the public sector need to keep up with digital technology and be able to work and collaborate in a more effective manner. The results from SWOT and McKinsey 7S analyses suggest that there are inadequate IT personnel, no individual digital literacy development plans, and a misunderstanding of management policies. The Office of the Civil Service Commission defines the digital literacy skills that civil servants and government officers should possess in order to work effectively; these consist of nine dimensions, including computer skills, internet skills, cyber security awareness, word processing, spreadsheets, presentation programs, online collaboration, graphics editors, and cyber security practices, and six steps of digital literacy development, including self-assessment, an individual development plan, self-learning, a certified test, learning reflection, and practice. Geologists can use digital literacy as a learning tool to develop themselves for better career opportunities.
Keywords: disruptive technology, digital technology, digital literacy, computer skills
Procedia PDF Downloads 116
11160 The Evolution of the Israel Defence Forces’ Information Operations: A Case Study of the Israel Defence Forces' Activities in the Information Domain 2006–2014
Authors: Teemu Saressalo
Abstract:
This article examines the evolution of the Israel Defence Forces' information operation activities during an eight-year timespan, from the 2006 war with Hezbollah to more recent operations such as Pillar of Defence and Protective Edge. To this end, the case study shows a change in the Israel Defence Forces' activities in the information domain. In the 2006 war with Hezbollah in Lebanon, Israel inflicted enormous damage on the Lebanese infrastructure, leaving more than 1,200 people dead and 4,400 injured. Casualties among Hezbollah, Israel's main adversary, were estimated to range from 250 to 700 fighters. Damage to the Lebanese infrastructure was estimated at over USD 2.5bn, with almost 2,000 houses and buildings damaged or destroyed. Even this amount of destruction did not force Hezbollah to yield, and while both sides claimed victory in the war, Israel paid the heavier price in political backlash and loss of reputation, mainly due to failures in the media and the way in which the war was portrayed and perceived in Israel and abroad. Much of this can be credited to Hezbollah's efficient use of the media and Israel's failure to do the same. Israel managed the next conflict it was engaged in completely differently: it had learnt its lessons and built up new ways to counter its adversary's propaganda and media operations. In Operation Cast Lead at the turn of 2009, Hamas, Israel's adversary and Gaza's dominant faction, was not able to utilize the media in the way that Hezbollah had. By creating a virtual and physical barrier around the Gaza Strip, Israel almost totally denied its adversary access to the worldwide media, and by restricting the movement of journalists in the area, Israel could let its own voice be heard above all. Operation Cast Lead began with a deception operation, which caught Hamas totally off guard. The 21-day campaign left the Gaza Strip devastated but did not cause as much protest in Israel during the operation as the 2006 war did, mainly due to almost total Israeli dominance in the information dimension. The most important outcome from the Israeli perspective was that Operation Cast Lead was assessed to be a success, and the operation enjoyed domestic support along with support from many western nations that had condemned Israeli actions in the 2006 war. Later conflicts have shown the same tendency towards virtually total dominance in the information domain, which has had an impact on target audiences across the world. Thus, it is clear that well-planned and well-conducted information operations are able to shape public opinion and influence decision-makers, although Israel might have been outpaced by its rivals.
Keywords: Hamas, Hezbollah, information operations, Israel Defence Forces
Procedia PDF Downloads 237