Search results for: distributed memory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3117

2367 A Case Study of An Artist Diagnosed with Schizophrenia-Using the Graphic Rorschach (Digital version) “GRD”

Authors: Maiko Kiyohara, Toshiki Ito

Abstract:

In this study, we describe the psychotherapy process of a patient with dissociative disorder together with the Graphic Rorschach (Digital version) (GRD). The dissociative disorder in this case is characterized by multiple alternating personalities (also called alternate identities). Dissociation is a state in which consciousness, memory, thinking, emotion, perception, behavior, body image, and so on are experienced as divided. Dissociative symptoms such as memory loss appear, and repeated blanks in daily events cause serious problems in everyday life. Although the pathological mechanism of dissociation has not yet been fully elucidated, it is thought to be caused by childhood abuse or severe trauma. In Japan, no reliable data have been reported on the number of patients with, or the prevalence of, dissociative disorders; no drug addresses dissociative symptoms; and no clear treatment has been established. The GRD is a method that the author revised in 2017 from the Graphic Rorschach, a special technique in which subjects draw their verbal responses while the Rorschach test is administered. By introducing a tablet computer during the drawing response, the GRD reduces the burden on both the subject and the examiner, simplifies the organization of data, and improves the accuracy of interpretation; our research pursues these aims. The patient in this case is a woman in her 50s who has had multiple personalities since childhood. At present, about ten personalities have been identified, and only the main personality is well understood. The patient is raising her junior-high-school sons as a single parent, but personality switches often occur at home, which degrades the home environment, creates economic strain, and severely hinders daily life. During psychotherapy, personalities other than the main personality have appeared, and psychotherapy has also been conducted with her son. In this case, the psychotherapy process and the GRD were used to understand the personality characteristics, and the possible therapeutic significance for personality integration is reported.

Keywords: GRD, dissociative disorder, a case study of psychotherapy process, dissociation

Procedia PDF Downloads 117
2366 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory

Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock

Abstract:

Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks like sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will boost research in the above-mentioned areas significantly. It can also come in handy for platforms like Wikipedia, where the use of neutral language is of importance. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as an upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network with an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, on which we also compare our model to existing approaches. Experimental analysis indicates an improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
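The architecture described above can be sketched in a few lines: below is a minimal, hedged illustration in which a frozen bert-base-uncased model supplies token embeddings to a bidirectional LSTM with additive attention pooling. The hidden size, pooling scheme and training details are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch: frozen BERT token embeddings feeding a BiLSTM with additive attention
# for sentence-level subjectivity classification. Hyperparameters are illustrative only.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class BertBiLSTMAttention(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=256, num_classes=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        for p in self.bert.parameters():      # use BERT as a frozen embedding generator
            p.requires_grad = False
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)   # additive attention score per token
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        out, _ = self.lstm(emb)                                # (B, T, 2H)
        scores = self.att(out).squeeze(-1)                     # (B, T)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)  # (B, T, 1)
        context = (weights * out).sum(dim=1)                   # attention-pooled sentence vector
        return self.fc(context)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["The movie was an unforgettable masterpiece."],
                  return_tensors="pt", padding=True, truncation=True)
logits = BertBiLSTMAttention()(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 2])
```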

Keywords: subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing

Procedia PDF Downloads 130
2365 Optimal Wind Based DG Placement Considering Monthly Changes Modeling in Wind Speed

Authors: Belal Mohamadi Kalesar, Raouf Hasanpour

Abstract:

Proper placement of Distributed Generation (DG) units such as wind turbine generators in a distribution system is still a very challenging issue for obtaining their maximum potential benefits, because inappropriate placement may increase system losses. This paper proposes a Particle Swarm Optimization (PSO) technique for optimal placement of wind-based DG (WDG) in the primary distribution system to reduce energy losses and improve the voltage profile, with four different wind levels modeled over a year. Each wind turbine is modeled as a DFIG operated at unity power factor, and only one wind turbine tower is considered for installation at each bus of the network. Finally, the proposed method is implemented on the widely used 69-bus power distribution system in the MATLAB software environment under four scenarios (without, one, two and three WDG units). To test the capability of the implemented program, all buses of the standard system are allowed as candidates for WDG installation (a large search space), although the program can also consider a predetermined number of candidate locations for WDG placement to model the financial limitations of a project. The obtained results illustrate that higher wind speeds in some months increase the generated output power, but this can increase or decrease power losses at some wind levels. The results also show that about 3 MW of WDG capacity needs to be installed across different buses, and that when this capacity is distributed over the whole network (a larger number of WDG units), it yields a better solution in terms of power loss and voltage profile.
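A bare-bones PSO loop of the kind described can be sketched as follows; the loss_proxy objective is a hypothetical stand-in for the 69-bus load-flow loss and voltage-profile evaluation used in the paper, and the inertia/acceleration coefficients are generic textbook values.

```python
# Minimal PSO sketch for WDG siting/sizing. loss_proxy is a hypothetical
# stand-in for the 69-bus load-flow loss calculation used in the paper.
import numpy as np

rng = np.random.default_rng(0)
N_BUS, N_PARTICLES, ITERS = 69, 30, 200

def loss_proxy(x):
    # x = [bus index (continuous, rounded), WDG size in MW]; placeholder objective only
    bus = int(np.clip(round(x[0]), 1, N_BUS))
    size = np.clip(x[1], 0.0, 3.0)
    return (bus - 61) ** 2 * 0.01 + (size - 1.8) ** 2   # pretend bus 61 at 1.8 MW is optimal

lo, hi = np.array([1.0, 0.0]), np.array([float(N_BUS), 3.0])
pos = rng.uniform(lo, hi, size=(N_PARTICLES, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss_proxy(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                                # generic inertia/acceleration values
for _ in range(ITERS):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([loss_proxy(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best bus:", int(round(gbest[0])), "size (MW):", round(float(gbest[1]), 2))
```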

Keywords: wind turbine, DG placement, wind levels effect, PSO algorithm

Procedia PDF Downloads 448
2364 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection

Authors: S. Shankar Bharathi

Abstract:

Inspection of surface defects on metallic components has always been challenging due to their specular property. Defects such as scratches, rust and pitting are very common on metallic surfaces during the manufacturing process. If unchecked, these defects can hamper the performance and reduce the lifetime of such components. Many conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation and textural segmentation. They then employ other suitable algorithms based on morphology, region growing, shape analysis or neural networks for classification. In this paper, the work is focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects, but they proved inaccurate in extracting the defects alone. However, this paper does not focus on a comparison of different segmentation techniques; rather, it describes a novel approach to segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of the intensity levels, that is, whether a certain gray level is concentrated or evenly distributed, and on the extraction of such concentrated pixels. Defective images showed a high concentration of certain gray levels, whereas in non-defective images the gray levels appeared evenly distributed, with no concentration. This formed the basis for detecting the defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
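As a rough illustration of the morphological distance mentioned above, the sketch below computes a dilation-based Hausdorff distance between two binary masks: the smallest number of unit dilations after which each set covers the other. The structuring element and the synthetic masks are illustrative assumptions, not the paper's actual images or parameters.

```python
# Minimal sketch of a morphology-based (dilation) Hausdorff distance between two
# binary masks, e.g. a segmented scratch mask and a reference mask.
import numpy as np
from scipy.ndimage import binary_dilation, generate_binary_structure

def hausdorff_dilation_distance(a, b, max_iter=500):
    """Smallest number of unit dilations after which each set covers the other."""
    struct = generate_binary_structure(2, 1)  # 4-connected structuring element
    da, db = a.copy(), b.copy()
    for n in range(max_iter + 1):
        if (a <= db).all() and (b <= da).all():   # mutual coverage reached
            return n
        da = binary_dilation(da, structure=struct)
        db = binary_dilation(db, structure=struct)
    return None  # not covered within max_iter dilations

a = np.zeros((64, 64), dtype=bool); a[10:20, 30:33] = True   # synthetic scratch mask
b = np.zeros((64, 64), dtype=bool); b[14:24, 34:37] = True   # shifted reference mask
print(hausdorff_dilation_distance(a, b))
```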

Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision

Procedia PDF Downloads 427
2363 Genome-Wide Analysis of Long Terminal Repeat (LTR) Retrotransposons in Rabbit (Oryctolagus cuniculus)

Authors: Zeeshan Khan, Faisal Nouroz, Shumaila Noureen

Abstract:

The European or common rabbit (Oryctolagus cuniculus) belongs to the class Mammalia, order Lagomorpha, family Leporidae. Rabbits are distributed worldwide and are native to Europe (France, Spain and Portugal) and Africa (Morocco and Algeria). LTR retrotransposons are major Class I mobile genetic elements of eukaryotic genomes and play a crucial role in genome expansion, evolution and diversification. They have mostly been annotated in various genomes by conventional homology-search approaches, which restricted the annotation of novel elements. The present work involved de novo identification of LTR retrotransposons by LTR_FINDER in the haploid genome of the rabbit (2247.74 Mb, distributed over 22 chromosomes), in which 7,933 putative full-length or partial copies were identified, containing 69.38 Mb of elements and accounting for 3.08% of the genome. The highest copy number (731) was found on chromosome 7, followed by chromosome 12 (705), while the lowest copy number (27) was detected on chromosome 19; no elements were identified on chromosome 21 owing to the partially sequenced chromosome, unidentified nucleotides (N) and repeated simple sequence repeats (SSRs). The identified elements ranged in size from 1.2 to 25.8 kb, with average sizes between 2 and 10 kb. The highest percentage of elements (4.77%) was found on chromosome 15, and the lowest (0.55%) on chromosome 19. The most frequent tRNA type was arginine, present in the majority of the elements. Based on these results, it was estimated that the rabbit exhibits 15,866 copies, comprising 137.73 Mb of elements and accounting for 6.16% of the diploid genome (44 chromosomes). Further molecular analyses will be helpful for the chromosomal localization and distribution of these elements.

Keywords: rabbit, LTR retrotransposons, genome, chromosome

Procedia PDF Downloads 148
2362 Electroencephalography Correlates of Memorability While Viewing Advertising Content

Authors: Victor N. Anisimov, Igor E. Serov, Ksenia M. Kolkova, Natalia V. Galkina

Abstract:

The problem of the memorability of advertising content is closely connected with the key issues of neuromarketing. The memorability of advertising content contributes to the marketing effectiveness of the promoted product. Significant directions for studying the phenomenon of memorability are the memorability of the brand (detected through the memorability of the logo) and the memorability of the product offer (detected through the memorization of dynamic audiovisual advertising content, i.e., commercials). The aim of this work is to reveal the predictors of memorization of static and dynamic audiovisual stimuli (logos and commercials). An important direction of the research was revealing differences in the psychophysiological correlates of memorability between static and dynamic audiovisual stimuli. We assumed that static and dynamic images are perceived in different ways and may differ in the memorization process. Objective methods of recording psychophysiological parameters while watching static and dynamic audiovisual materials are well suited to this aim. Electroencephalography (EEG) was performed with the aim of identifying correlates of the memorability of various stimuli in the electrical activity of the cerebral cortex. All stimuli (static and dynamic considered separately) were divided into two groups – remembered and not remembered – based on the results of a questionnaire. The questionnaires were filled out by participants not immediately after viewing the stimuli, but after a time interval (to detect stimuli stored through long-term memorization). Using statistical methods, we developed a classifier (statistical model) that predicts which group (remembered or not remembered) a stimulus falls into, based on the psychophysiological response during perception. The output of the statistical model was compared with the results of the questionnaire. Conclusions: Predictors of the memorability of static and dynamic stimuli have been identified, which allows prediction of which stimuli will have a higher probability of being remembered. Further development of this study will be the creation of a stimulus memory model capable of recognizing a stimulus as previously seen or new. Thus, in the process of remembering the stimulus, it is planned to take into account the stimulus recognition factor, which is one of the most important tasks for neuromarketing.
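As a purely illustrative sketch of the kind of classifier described (the actual features and statistical model are not specified in the abstract), the snippet below cross-validates a logistic regression on placeholder EEG band-power features labeled remembered/not remembered.

```python
# Minimal sketch: a cross-validated classifier predicting "remembered" vs "not
# remembered" from EEG features. The band-power feature matrix here is random
# placeholder data; the study's actual features and model are not specified.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 32))          # e.g. 80 stimuli x 32 band-power features
y = rng.integers(0, 2, size=80)        # 1 = remembered, 0 = not remembered (from questionnaire)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```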

Keywords: memory, commercials, neuromarketing, EEG, branding

Procedia PDF Downloads 251
2361 Reimagine and Redesign: Augmented Reality Digital Technologies and 21st Century Education

Authors: Jasmin Cowin

Abstract:

Augmented reality digital technologies, big data, and the need for a teacher workforce able to meet the demands of a knowledge-based society are poised to lead to major changes in the field of education. This paper explores applications and educational use cases of augmented reality digital technologies for educational organizations during the Fourth Industrial Revolution. The Fourth Industrial Revolution requires vision, flexibility, and innovative educational conduits by governments and educational institutions to remain competitive in a global economy. Educational organizations will need to focus on teaching in and for a digital age to continue offering academic knowledge relevant to 21st-century markets and changing labor force needs. Implementation of contemporary disciplines will need to be embodied through learners’ active knowledge-making experiences while embracing ubiquitous accessibility. The power of distributed ledger technology promises major streamlining for educational record-keeping, degree conferrals, and authenticity guarantees. Augmented reality digital technologies hold the potential to restructure educational philosophies and their underpinning pedagogies thereby transforming modes of delivery. Structural changes in education and governmental planning are already increasing through intelligent systems and big data. Reimagining and redesigning education on a broad scale is required to plan and implement governmental and institutional changes to harness innovative technologies while moving away from the big schooling machine.

Keywords: fourth industrial revolution, artificial intelligence, big data, education, augmented reality digital technologies, distributed ledger technology

Procedia PDF Downloads 277
2360 The Potential Role of Some Nutrients and Drugs in Providing Protection from Neurotoxicity Induced by Aluminium in Rats

Authors: Azza A. Ali, Abeer I. Abd El-Fattah, Shaimaa S. Hussein, Hanan A. Abd El-Samea, Karema Abu-Elfotuh

Abstract:

Background: Aluminium (Al) represents an environmental risk factor. Exposure to high levels of Al causes neurotoxic effects and different diseases. Vinpocetine is widely used to improve cognitive functions; it possesses memory-protective and memory-enhancing properties and has the ability to increase cerebral blood flow and glucose uptake. Cocoa bean represents a rich source of iron as well as a potent antioxidant. It can protect from the impact of free radicals, reduce stress as well as depression, and promote better memory and concentration. Wheatgrass is primarily used as a concentrated source of nutrients. It contains vitamins, minerals, carbohydrates and amino acids and possesses antioxidant and anti-inflammatory activities. Coenzyme Q10 (CoQ10) is an intracellular antioxidant and mitochondrial membrane stabilizer. It is effective in improving cognitive disorders and has been used as an anti-aging agent. Zinc is a structural element of many proteins and a signaling messenger that is released by neural activity at many central excitatory synapses. Objective: To study the role of some nutrients and drugs, namely Vinpocetine, Cocoa, Wheatgrass, CoQ10 and Zinc, against neurotoxicity induced by Al in rats, and to compare their potency in providing protection. Methods: Seven groups of rats were used; the Al-toxicity model groups received AlCl3 (70 mg/kg, IP) daily for three weeks, while the control group received saline. All Al-toxicity model groups except one (non-treated) were co-administered orally, together with AlCl3, one of the following treatments: Vinpocetine (20 mg/kg), Cocoa powder (24 mg/kg), Wheat grass (100 mg/kg), CoQ10 (200 mg/kg) or Zinc (32 mg/kg). Biochemical changes in the rat brain, such as acetylcholinesterase (AChE), Aβ, brain-derived neurotrophic factor (BDNF), inflammatory mediators (TNF-α, IL-1β) and oxidative parameters (MDA, SOD, TAC), were estimated for all groups, besides histopathological examinations in different brain regions. Results: Neurotoxicity and neurodegeneration in the rat brain after three weeks of Al exposure were indicated by the significant increase in Aβ, AChE, MDA, TNF-α, IL-1β and DNA fragmentation together with the significant decrease in SOD, TAC and BDNF, and were confirmed by the histopathological changes in the brain. On the other hand, co-administration of Vinpocetine, Cocoa, Wheatgrass, CoQ10 or Zinc together with AlCl3 provided protection against the hazards of neurotoxicity and neurodegeneration induced by Al; their protection was indicated by the decrease in Aβ, AChE, MDA, TNF-α, IL-1β and DNA fragmentation together with the increase in SOD, TAC and BDNF, and was confirmed by the histopathological examinations of different brain regions. Vinpocetine and Cocoa showed the most pronounced protection, while Zinc provided the least protective effect among the nutrients and drugs used. Conclusion: Different degrees of protection from neurotoxicity and neuronal degeneration induced by Al can be achieved through the co-administration of some nutrients and drugs during exposure. Vinpocetine and Cocoa provided more protection than Wheat grass, CoQ10 or Zinc, which showed the least protective effects.

Keywords: aluminum, neurotoxicity, vinpocetine, cocoa, wheat grass, coenzyme Q10, Zinc, rats

Procedia PDF Downloads 249
2359 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions

Authors: Vikrant Gupta, Amrit Goswami

Abstract:

The fixed income market forms the basis of the modern financial market. All other assets in financial markets derive their value from the bond market. Owing to their over-the-counter nature, corporate bonds have relatively little publicly available data and are thus researched far less than equities. Bond price prediction is a complex financial time-series forecasting problem and is considered very crucial in the domain of finance. Bond prices are highly volatile and noisy, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory networks for the prediction of corporate bond prices is discussed. Long Short-Term Memory (LSTM) networks have been widely used in the literature for various sequence learning tasks in domains such as machine translation, speech recognition, etc. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies, thanks to their memory function, which traditional neural networks fail to capture. In this study, a simple LSTM, a stacked LSTM and a masked LSTM-based model are discussed with respect to varying input sequences (three days, seven days and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which improved the accuracy of the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time-series models (ARIMA), shallow neural networks and the three LSTM models discussed above. In summary, our results show that the use of LSTM models provides more accurate results and should be explored further within the asset management industry.
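A minimal sketch of the core idea, predicting the next price from a sliding window with a single LSTM layer, is shown below. The synthetic series, the seven-day window and the hyperparameters are illustrative; the paper's EMD preprocessing, technical indicators and stacked/masked variants are omitted.

```python
# Minimal sketch: a single-layer LSTM forecasting the next price from a sliding
# window of past prices. The synthetic series and hyperparameters are illustrative only.
import numpy as np
import torch
import torch.nn as nn

series = np.cumsum(np.random.default_rng(0).normal(0, 0.5, 500)) + 100.0  # synthetic "price"
window = 7
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)   # (N, 7, 1)
y = torch.tensor(y, dtype=torch.float32).unsqueeze(-1)   # (N, 1)

class PriceLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])          # last hidden state -> next-step prediction

model = PriceLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(20):                        # short illustrative training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print("final training MSE:", float(loss))
```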

Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition

Procedia PDF Downloads 136
2358 Autobiographical Memory Functions and Perceived Control in Depressive Symptoms among Young Adults

Authors: Meenu S. Babu, K. Jayasankara Reddy

Abstract:

Depression is a serious mental health concern that leads to significant distress and dysfunction in an individual. Due to the high physical, psychological, social, and economic burden it causes, it is important to study the various bio-psycho-social factors that influence the onset, course, duration, and intensity of depressive symptoms. The study aims to explore the relationship between autobiographical memory (AM) functions, perceived control over stressful events, and depressive symptoms. AM functions and perceived control were both found to be protective factors against depression and are both modifiable to predict better behavioral and affective outcomes. An extensive review of the literature, with a systematic search on Google Scholar, JSTOR, Science Direct and the Springer Journals database, was conducted for the purpose of this review paper. The same search strategy was used for all the aforementioned databases. The time frame used for the search was 2010-2021. An additional search was conducted with no time bar to map the development of the theoretical concepts. Relevant studies with quantitative, qualitative, experimental, and quasi-experimental research designs were included in the review. Studies including samples with a DSM-5 or ICD-10 diagnosis of depressive disorders were excluded, in order to focus on behavioral patterns in a non-clinical population. The synthesis of the findings obtained from the review indicates a significant relationship between the cognitive variables of AM functions and perceived control and depressive symptoms. AM functions were found to have significant effects on one's sense of self, interpersonal relationships, decision making, and self-continuity, and were related to better emotion regulation and lower depressive symptoms. Not all components of AM function were equally significant in their relationships with various depressive symptoms. While the self and directive functions were more related to emotion regulation, anhedonia, motivation, and hence mood and affect, the social function was related to perceived social support and social engagement. Perceived control was found to be another protective cognitive factor that provides individuals a sense of agency and control over their life outcomes, which was found to be low in individuals with depression. It was also associated with locus of control, competency beliefs, contingency beliefs, and subjective well-being, and acted as a protective factor against depressive symptoms. AM and perceived control over stressful events serve adaptive functions; hence, it is imperative to study these variables more extensively. They can be central in planning and implementing therapeutic interventions that foster these cognitive protective factors to mitigate or alleviate depressive symptoms. Exploring AM as a determining factor in depressive symptoms along with perceived control over stress creates a bridge between the biological and cognitive factors underlying depression and increases the scope for developing a more eclectic and effective treatment plan for individuals. As culture plays a crucial role in AM functions as well as in certain aspects of control, such as locus of control, it is necessary to study these variables with the cultural context in mind, in order to tailor culture- and community-specific interventions for depression.

Keywords: autobiographical memories, autobiographical memory functions, perceived control, depressive symptoms, depression, young adults

Procedia PDF Downloads 102
2357 Blockchain: Institutional and Technological Disruptions in the Public Sector

Authors: Maria Florencia Ferrer, Saulo Fabiano Amancio-Vieira

Abstract:

The use of blockchain in the public sector is a present reality, no longer just the future of disruptive institutional and technological models. There are still some cultural barriers and resistance to the proper use of its potential. This research aims to present the strengths and weaknesses of using a public-permissioned and distributed network in the context of the public sector. To this end, bibliographical/documentary research was conducted to identify the main aspects of the studied platform, focusing on the main demands of the public sector. The platform analyzed was LACChain, a global alliance composed of different actors in the blockchain environment, led by the Innovation Laboratory of the Inter-American Development Bank Group (IDB Lab), for the development of the blockchain ecosystem in Latin America and the Caribbean. LACChain provides blockchain infrastructure, which is a distributed ledger technology (DLT). The platform focuses on two main pillars: community and infrastructure. It is organized as a consortium for the management and administration of an infrastructure classified as public, following the ISO typologies (ISO/TC 307). It is, therefore, a network open to any participant who agrees with the established rules, which are limited to being identified and complying with the regulations. Among the benefits are: a public network (open to all), decentralization, low transaction costs, greater publicity of transactions, and a reduction of corruption in contracts and public acts, in addition to improved transparency for the general population. It is also noteworthy that the platform is not based on a cryptocurrency and is not anonymous; that is, it can be regulated. It is concluded that the use of ledger platforms such as LACChain can give public agents greater security in the process of migrating their information applications.

Keywords: blockchain, LACChain, public sector, technological disruptions

Procedia PDF Downloads 172
2356 SIP Flooding Attacks Detection and Prevention Using Shannon, Renyi and Tsallis Entropy

Authors: Neda Seyyedi, Reza Berangi

Abstract:

Voice over IP (VoIP) networks, also known as Internet telephony, are growing rapidly and have occupied a large part of the communications market. With the growth of each technology, the related security issues become particularly important. As this technology is adopted in different environments, with numerous features put at our disposal, there is an increasing need to address the security threats. Being IP-based and playing a signaling role in VoIP networks, the Session Initiation Protocol (SIP) lets attackers exploit weaknesses of the protocol to disable VoIP service. One of the most important threats is the denial-of-service attack, a branch of which, flooding attacks, is discussed in this article. These attacks waste server resources and prevent the server from delivering service to authorized users. Distributed denial-of-service attacks and attacks with a low rate can mislead many attack detection mechanisms. In this paper, we introduce a mechanism which not only detects distributed denial-of-service attacks and low-rate attacks, but can also identify the attackers accurately. We detect and prevent flooding attacks in the SIP protocol using Shannon (FDP-S), Renyi (FDP-R) and Tsallis (FDP-T) entropy. We conducted an experiment to compare the detection percentage and the false alarm rate when using the Shannon, Renyi or Tsallis entropy as the measure of disorder. Implementation results show that, owing to the parametric nature of the Renyi and Tsallis entropies, changing the parameters yields different detection percentages and false alarm rates, making it possible to adjust the sensitivity of the detection mechanism.
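The entropy measures named above can be computed per traffic window roughly as follows; the alpha/q values, window construction and baseline comparison are illustrative assumptions rather than the FDP-S/R/T mechanism itself.

```python
# Minimal sketch: Shannon, Renyi and Tsallis entropy over the empirical distribution
# of SIP request sources in a time window. A sharp entropy shift relative to a learned
# baseline can flag flooding. The alpha/q values and threshold logic are illustrative.
import numpy as np
from collections import Counter

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi(p, alpha=2.0):
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q=2.0):
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def window_entropies(src_ips):
    counts = np.array(list(Counter(src_ips).values()), dtype=float)
    p = counts / counts.sum()
    return shannon(p), renyi(p), tsallis(p)

normal = ["10.0.0.%d" % (i % 50) for i in range(500)]     # many balanced sources
flood = normal + ["198.51.100.7"] * 2000                  # one source dominates the window
print("normal:", window_entropies(normal))
print("flood :", window_entropies(flood))
```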

Keywords: VOIP networks, flooding attacks, entropy, computer networks

Procedia PDF Downloads 405
2355 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without annoying waiting times. Thereby, they enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation compared to the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate the ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. 
The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the Alife community for further development.
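For readers unfamiliar with the update rule being accelerated, the following is a minimal CPU/NumPy sketch of one Lenia step (FFT-based convolution with a smooth ring kernel followed by a Gaussian growth mapping). It is a reference illustration only, not the CUDA C++ implementation benchmarked here, and the kernel and growth parameters are generic values.

```python
# Minimal NumPy sketch of one Lenia update step (continuous cellular automaton):
# convolve the state with a smooth ring kernel via FFT, apply a Gaussian growth
# function, integrate in time and clip to [0, 1]. Parameters are illustrative.
import numpy as np

def ring_kernel(radius=13):
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r = np.sqrt(x ** 2 + y ** 2) / radius
    k = np.exp(-((r - 0.5) ** 2) / (2 * 0.15 ** 2)) * (r <= 1)   # smooth ring shell
    return k / k.sum()

def growth(u, mu=0.15, sigma=0.015):
    return 2.0 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1.0

def step(A, K_fft, dt=0.1):
    U = np.real(np.fft.ifft2(np.fft.fft2(A) * K_fft))            # toroidal convolution via FFT
    return np.clip(A + dt * growth(U), 0.0, 1.0)

N = 128
A = np.random.default_rng(0).random((N, N))
K = ring_kernel()
K_pad = np.zeros((N, N))
K_pad[:K.shape[0], :K.shape[1]] = K
shift = -(K.shape[0] // 2)                                        # center the kernel at the origin
K_fft = np.fft.fft2(np.roll(K_pad, (shift, shift), axis=(0, 1)))
for _ in range(50):
    A = step(A, K_fft)
print("mean activity after 50 steps:", float(A.mean()))
```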

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis.

Procedia PDF Downloads 41
2354 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad Daba, Jean-Pierre Dubois

Abstract:

Multipath fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper have utilized Poisson-modulated and weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multi-diversity, stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent specular Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
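A simple Monte Carlo sketch of the channel construction described above is given below: the scatterer count is Poisson with a lognormal-modulated mean, each scatterer carries a random complex gain (mark), and a specular line-of-sight term is added. The distributions and parameters are illustrative placeholders, not the closed-form model derived in the paper.

```python
# Minimal simulation sketch of a (triply stochastic) filtered marked Poisson model of
# local-area fading. Distributional parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def fading_sample(mean_paths=6.0, sigma_ln=0.5, los_amp=1.0):
    lam = mean_paths * rng.lognormal(mean=0.0, sigma=sigma_ln)     # doubly stochastic intensity
    n = rng.poisson(lam)                                           # random number of scatterers
    # marks: complex Rayleigh-type scatterer gains with uniform phases
    gains = rng.rayleigh(scale=0.3, size=n) * np.exp(1j * rng.uniform(0, 2 * np.pi, n))
    los = los_amp * np.exp(1j * rng.uniform(0, 2 * np.pi))         # specular LOS component
    return np.abs(los + gains.sum())                               # received envelope

envelopes = np.array([fading_sample() for _ in range(10000)])
print("mean envelope: %.3f, variance: %.3f" % (envelopes.mean(), envelopes.var()))
```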

Keywords: cellular communication, femto and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process

Procedia PDF Downloads 448
2353 Border Security: Implementing the “Memory Effect” Theory in Irregular Migration

Authors: Iliuta Cumpanasu, Veronica Oana Cumpanasu

Abstract:

This paper focuses on the conjunction between the newly emerged theory of the “Memory Effect” in Irregular Migration and Related Criminality and the notion of securitization, and its impact on border management. It advances the field by identifying, for the first time, the patterns corresponding to the linkage of the two concepts and by developing a theoretical explanation of the effects of non-military threats on border security. Over recent years, irregular migration has increased significantly worldwide. The U.N.'s refugee agency reports that the number of displaced people is at its highest ever, surpassing even post-World War II numbers, when the world was struggling to come to terms with the most devastating event in history. This is also the current reality within the core studied coordinate, the Balkan Route of Irregular Migration, which starts from Asia and Africa, continues through Turkey, Greece, North Macedonia or Bulgaria, and Serbia, and ends in Romania, where thousands of migrants find themselves in an irregular situation concerning their entry to the European Union, with important consequences in terms of related criminality. The data from the past six years were collected through semi-structured interviews with experts in the field of migration and desk research within organisations involved in border security, pursuing genuine insights from the aforementioned field, which were constantly checked against the existing literature and subsequently subjected to mixed methods of analysis, including the use of a Vector Auto-Regression estimates model. Thereafter, the analysis of the data followed the processes and outcomes of Grounded Theory, and a new Substantive Theory emerged, explaining how the phenomena of irregular migration and cross-border criminality are the decisive impetus for implementing the concept of securitization in border management by using the proposed pattern. The findings of the study therefore capture an area that has not yet benefitted from a comprehensive approach in the scientific community, such as the seasonality, stationarity, dynamics, predictions, and the pull and push factors in Irregular Migration, also highlighting how the recent pandemic interfered with border security. The research uses an inductive, revelatory theoretical approach which aims at offering a new theory to explain a phenomenon, providing a practical contribution for the scientific community, research institutes and academia, as well as for organizational practitioners in the field, among them the UN, IOM, UNHCR, Frontex, Interpol, Europol, and national agencies specialized in border security. The scientific outcomes of this study were validated on June 30, 2021, when the author defended his dissertation for the European Joint Master's in Strategic Border Management, a prestigious two-year program supported by the European Commission, the Frontex Agency and a consortium of six European universities, and they are currently among the research objectives of his pending PhD research at the West University of Timisoara.

Keywords: migration, border, security, memory effect

Procedia PDF Downloads 92
2352 A Distributed Smart Battery Management System – sBMS, for Stationary Energy Storage Applications

Authors: António J. Gano, Carmen Rangel

Abstract:

Currently, electric energy storage systems for stationary applications are attracting increasing interest, namely with the integration of local renewable energy power sources into energy communities. Li-ion batteries are considered the leading electric storage devices to achieve this integration, and Battery Management Systems (BMS) are decisive for their control and optimum performance. In this work, the development of a smart BMS (sBMS) prototype with a modular distributed topology is described. The system, still under development, has a distributed architecture with modular characteristics to operate with different battery pack topologies and charge capacities, integrating adaptive algorithms for real-time monitoring and management of the functional state of multicellular Li-ion batteries, and is intended for application in the context of a local energy community fed by renewable energy sources. This sBMS system includes several developed hardware units: (1) Cell monitoring units (CMUs) for interfacing with each individual cell or module monitored within the battery pack; (2) a Battery monitoring and switching unit (BMU) for global battery pack monitoring, thermal control and functional operating state switching; (3) a Main management and local control unit (MCU) for local sBMS management and control, also serving as a communications gateway to external systems and devices. This architecture is fully expandable to battery packs with a large number of cells, or modules, interconnected in series, as the various units have local data acquisition and processing capabilities, communicate over a standard CAN bus and will be able to operate almost autonomously. The CMU units are intended to be used with Li-ion cells but can be used with other cell chemistries with output voltages within the 2.5 to 5 V range. The characteristics and specifications of the different units are described, including the implemented hardware solutions. The developed hardware supports both passive and active methods for charge equalization, considered fundamental functionalities for optimizing the performance and useful lifetime of a Li-ion battery pack. The functional characteristics of the different units of this sBMS system, including the acquisition of different process variables using a flexible set of sensors, can support the development of custom algorithms for estimating the parameters defining the functional states of the battery pack (State-of-Charge, State-of-Health, etc.) as well as different charge-equalizing strategies and algorithms. This sBMS system is intended to interface with other systems and devices using standard communication protocols, like those used by the Internet of Things. In the future, this sBMS architecture can evolve to a fully decentralized topology, with all units using Wi-Fi protocols and integrating a mesh network, making the MCU unit unnecessary. The status of the work in progress is reported, leading to conclusions on the system already executed, considering the implemented hardware solution not only as a fully functional, advanced and configurable battery management system, but also as a platform for developing custom algorithms and optimization strategies to achieve better performance of stationary electric energy storage devices.
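As one example of the kind of functional-state algorithm such a platform could host, the sketch below implements a basic coulomb-counting State-of-Charge estimator fed by current samples; the capacity, sampling period and data path are hypothetical and not part of the described hardware.

```python
# Minimal sketch of a coulomb-counting State-of-Charge estimator of the kind that
# could run on a management unit from current/voltage samples reported over CAN.
# The pack capacity, sampling period and message layout are hypothetical.
from dataclasses import dataclass

@dataclass
class SocEstimator:
    capacity_ah: float = 50.0     # hypothetical pack capacity
    soc: float = 1.0              # start fully charged

    def update(self, current_a: float, dt_s: float) -> float:
        """current_a > 0 = discharge; integrate charge and clamp SoC to [0, 1]."""
        self.soc -= (current_a * dt_s / 3600.0) / self.capacity_ah
        self.soc = min(max(self.soc, 0.0), 1.0)
        return self.soc

est = SocEstimator()
for _ in range(3600):             # one hour of 10 A discharge sampled every second
    soc = est.update(current_a=10.0, dt_s=1.0)
print(f"SoC after 1 h at 10 A: {soc:.2%}")   # ~80% for a 50 Ah pack
```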

Keywords: Li-ion battery, smart BMS, stationary electric storage, distributed BMS

Procedia PDF Downloads 100
2351 Global Historical Distribution Range of Brown Bear (Ursus Arctos)

Authors: Tariq Mahmood, Faiza Lehrasab, Faraz Akrim, Muhammad Sajid Nadeem, Muhammad Mushtaq, Unza Waqar, Ayesha Sheraz, Shaista Andleeb

Abstract:

Brown bear (Ursus arctos), a member of the family Ursidae, is distributed in a wide range of habitats in North America, Europe and Asia. The global distribution range of brown bears is suspected to be decreasing at present due to various factors. The carnivore species is categorized as ‘Least Concern’ globally by the IUCN Red List of Threatened Species. However, there are some small, fragmented populations that are on the verge of extinction, as in Pakistan, where the species is listed as ‘Critically Endangered’, with a declining population trend. Importantly, the global historical distribution range of the brown bear is undocumented. Therefore, in the current study, we reconstructed and estimated the historical distribution range of the brown bear using QGIS software and also analyzed the network of protected areas in the past and current ranges of the species. Results showed that the brown bear was more widely distributed in historic times, encompassing 52.6 million km² compared with its current distribution of 38.8 million km², resulting in a total range contraction of up to approximately 28%. In the past, a total of N = 62,234 Protected Areas, covering approximately 3.89 million km², were present in the distribution range of the species, while now a total of N = 33,313 Protected Areas, covering approximately 2.75 million km², are present in the current distribution range of the brown bear. The brown bear distribution range within protected areas has thus contracted by 1.15 million km², and the total percentage reduction of PAs is 29%.

Keywords: brown bear, historic distribution, range contraction, protected areas

Procedia PDF Downloads 60
2350 The Effect of the Base Computer Method on Repetitive Behaviors and Communication Skills

Authors: Hoorieh Darvishi, Rezaei

Abstract:

Introduction: This study investigates the efficacy of computer-based interventions for children with Autism Spectrum Disorder, specifically targeting communication deficits and repetitive behaviors. The research evaluates novel software applications designed to enhance narrative capabilities and sensory integration through structured, progressive intervention protocols. Method: The study evaluated two intervention software programs designed for children with autism, focusing on narrative speech and sensory integration. Twelve children aged 5-11 participated in the two-month intervention, attending three 45-minute weekly sessions, with pre- and post-tests measuring speech, communication, and behavioral outcomes. The narrative speech software incorporated 14 stories using the Cohen model. It progressively reduced software assistance as children improved their storytelling abilities, ultimately enabling independent narration. The process involved story comprehension questions and guided story completion exercises. The sensory integration software featured approximately 100 exercises progressing from basic classification to complex cognitive tasks. The program included attention exercises, auditory memory training (advancing from single to four-syllable words), problem-solving, decision-making, reasoning, working memory, and emotion recognition activities. Each module was accompanied by frequency- and pitch-adjusted music that the child enjoys, to enhance learning through multiple sensory channels (visual, auditory, and tactile). Conclusion: The results indicated that the use of these software programs significantly improved communication and narrative speech scores in children, while also reducing scores related to repetitive behaviors. Findings: These findings highlight the positive impact of computer-based interventions on enhancing communication skills and reducing repetitive behaviors in children with autism.

Keywords: autism, communication_skills, repetitive_behaviors, sensory_integration

Procedia PDF Downloads 9
2348 Constructing Digital Memory for Chinese Ancient Village: A Case on Village of Gaoqian

Authors: Linqing Ma, Huiling Feng, Jihong Liang, Yi Qian

Abstract:

In China, some villages have survived the long history of changes and remain until today with the unique styles and distinctive culture they developed in the past. Those ancient villages, usually hundreds or thousands of years old, are a mirror of traditional Chinese culture, especially the farming-studying culture represented by Confucianism. Gaoqian, an ancient village with a population of 3,000 in Zhejiang province, is such a case. With a history dating back to the Yuan Dynasty, Gaoqian Village has 13 well-preserved traditional Chinese courtyard houses, built in the Ming and Qing Dynasties. It is a fine specimen for studying traditional rural China. A repository for the memory of the Village will then be completed by arranging and describing multimedia resources such as texts, photos and videos. The production of creative products with digital technologies is also possible, based on a thorough understanding of the cultural features of Gaoqian Village, using research tools from literature and history studies and a comparative method. Finally, the project will construct an exhibition platform for the Village and its culture by telling its stories through complete structures and threads.

Keywords: ancient villages, digital exhibition, multimedia, traditional culture

Procedia PDF Downloads 587
2347 Acacia mearnsii De Wild-A New Scourge on Cork Oak Forests of El Kala National Park (North-Eastern Algeria)

Authors: Samir Chekchaki, Arifa Beddiar

Abstract:

Nowadays, more and more species are introduced outside their natural range. While most of them struggle to establish themselves, some may adopt a much more dynamic behavior. Indeed, in recent decades we have witnessed the development of high forests of Acacia mearnsii in El Kala National Park. Introduced for economic purposes (nitrogen supply for industrial Eucalyptus plantations), this legume has become one of the most invasive species and one of the most costly in terms of forest management. It has crossed all barriers: it has acclimatized, naturalized and then expanded through diverse landscapes, entered into competition with native species such as cork oak, and altered ecosystem functioning. It is therefore interesting to analyze this new threat by relying on plants as bio-indicators for assessing biodiversity at different scales. We identified the species present in several plots distributed across a range of vegetation types subjected to different degrees of disturbance, using the Braun-Blanquet method. Fifty-six species were recorded, distributed in 48 genera and 29 families. The analysis of the relative frequency of species correlated with relative abundance clearly shows that Acacia mearnsii occupies a marginal position. The ecological analysis of this biological invasion shows that disturbances of either natural or anthropogenic origin (fire, prolonged drought, cutting) are the factors that exacerbate the invasion by opening invasion windows. The breaking of the long-lasting (and variable) physical dormancy of Acacia mearnsii seeds is ensured by thermal shock, in keeping with its heliophilous character.

Keywords: Acacia mearnsii De Wild, El Kala National park, fire, invasive, vegetation

Procedia PDF Downloads 357
2346 Load Balancing Technique for Energy - Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This calls for new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead and service response time and on improving performance, but none of them has considered energy consumption and carbon emission. Therefore, in this paper we introduce a technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
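To make the idea concrete, the sketch below shows a generic energy-aware assignment heuristic: each task goes to the node whose projected power draw increases the least under a convex utilization-to-power model. The power model, task sizes and node parameters are illustrative assumptions, not the technique proposed in the paper.

```python
# Minimal sketch of an energy-aware load balancing heuristic: each incoming task is
# assigned to the node with the smallest marginal power increase. With a convex
# utilization-to-power curve this spreads load across nodes. Parameters are illustrative.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    capacity: float               # total compute units
    idle_power_w: float = 100.0
    peak_power_w: float = 250.0
    load: float = 0.0
    tasks: list = field(default_factory=list)

    def power(self, extra: float = 0.0) -> float:
        util = min((self.load + extra) / self.capacity, 1.0)
        return self.idle_power_w + (self.peak_power_w - self.idle_power_w) * util ** 2

def assign(task_size: float, nodes: list) -> Node:
    # pick the node with the smallest marginal power increase that still has room
    feasible = [n for n in nodes if n.load + task_size <= n.capacity]
    best = min(feasible, key=lambda n: n.power(task_size) - n.power())
    best.load += task_size
    best.tasks.append(task_size)
    return best

nodes = [Node("n1", 100), Node("n2", 100), Node("n3", 50)]
for size in [20, 35, 10, 40, 25, 15]:
    chosen = assign(size, nodes)
    print(f"task {size:>2} -> {chosen.name} (load {chosen.load:.0f}, power {chosen.power():.0f} W)")
```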

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 449
2345 Presence and Severity of Language Deficits in Comprehension, Production and Pragmatics in a Group of ALS Patients: Analysis with Demographic and Neuropsychological Data

Authors: M. Testa, L. Peotta, S. Giusiano, B. Lazzolino, U. Manera, A. Canosa, M. Grassano, F. Palumbo, A. Bombaci, S. Cabras, F. Di Pede, L. Solero, E. Matteoni, C. Moglia, A. Calvo, A. Chio

Abstract:

Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease of adulthood which primarily affects the central nervous system and is characterized by progressive bilateral degeneration of motor neurons. The degeneration processes in ALS extend far beyond the neurons of the motor system and affect cognition, behaviour and language. The aim was to outline the prevalence of language deficits in an ALS cohort and explore their profile along with demographic and neuropsychological data. A full neuropsychological battery and language assessment was administered to 56 ALS patients. The neuropsychological assessment included tests of executive functioning, verbal fluency, social cognition and memory. Language was assessed using tests of verbal comprehension, production and pragmatics. Patients were cognitively classified following the Revised Consensus Criteria and divided into three groups showing different levels of language deficits: group 1 - no language deficit; group 2 - one language deficit; group 3 - two or more language deficits. Chi-square tests for independence and non-parametric measures to compare groups were applied. Nearly half of the ALS-CN patients (48%) scored under the clinical cut-off on one language test, and only 13% of patients classified as ALS-CI showed no language deficits, while the remaining 87% of ALS-CI reported two or more language deficits. ALS-BI and ALS-CBI cases all reported two or more language deficits. Deficits in production and in comprehension appeared more frequent in ALS-CI patients (p=0.011, p=0.003 respectively), with a higher percentage of comprehension deficits (83%). Nearly all ALS-CI reported at least one deficit in pragmatic abilities (96%), and all ALS-BI and ALS-CBI patients showed pragmatic deficits. Males showed a higher percentage of pragmatic deficits (97%, p=0.007). No significant differences in language deficits were found between bulbar and spinal onset. Months from onset and level of impairment at testing (ALS-FRS total score) were not significantly different between levels and types of language impairment. Age and education were significantly higher for cases showing no deficits in comprehension and pragmatics and in the group showing no language deficits. Comparing performances on neuropsychological tests among the three levels of language deficits, no significant differences were found between groups 1 and 2; compared to group 1, group 3 appeared to decline specifically on executive testing, verbal/visuospatial learning, and social cognition. Compared to group 2, group 3 showed worse performances specifically in tests of working memory and attention. Language deficits were found to be widespread in our sample, encompassing verbal comprehension, production and pragmatics. Our study reveals that even cognitively intact patients (ALS-CN) showed at least one language deficit in 48% of cases. The pragmatic domain is the most compromised (84% of the total sample), affected in nearly all ALS-CI (96%), likely due to the influence of executive impairment. Lower age and higher education seem to preserve comprehension and pragmatics and to protect against the presence of language deficits. Finally, executive functions, verbal/visuospatial learning and social cognition differentiate the group with no language deficits from the group with a clinical language impairment (group 3), while attention and working memory differentiate the group with one language deficit from the clinically impaired group.

Keywords: amyotrophic lateral sclerosis, language assessment, neuropsychological assessment, language deficit

Procedia PDF Downloads 162
2344 A New Modification of Nonlinear Conjugate Gradient Coefficients with Global Convergence Properties

Authors: Ahmad Alhawarat, Mustafa Mamat, Mohd Rivaie, Ismail Mohd

Abstract:

The conjugate gradient method has been widely used to solve large-scale unconstrained optimization problems because of its favourable iteration count, memory requirements, CPU time, and convergence properties. In this paper, we derive a new class of nonlinear conjugate gradient coefficients whose global convergence is proved under an exact line search. The numerical results for our new βK compare favourably with well-known formulas.
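To make the setting concrete, here is a minimal Python sketch of the nonlinear conjugate gradient loop into which such a coefficient plugs. Since the abstract does not state the new βK formula, the classical Fletcher-Reeves coefficient is used as a placeholder, and the exact line search is approximated by a one-dimensional scalar minimization; this is an illustration of the method class, not the authors' implementation.

```python
# Minimal sketch of a nonlinear conjugate gradient loop. The paper's new
# coefficient beta_k is not reproduced here (the abstract does not state it);
# the classical Fletcher-Reeves formula is used as a placeholder, and the
# "exact" line search is approximated by a 1-D scalar minimization.
import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # (Approximate) exact line search along direction d
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient as a stand-in for the paper's beta_k
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(conjugate_gradient(f, grad, [-1.2, 1.0]))
```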

Keywords: conjugate gradient method, conjugate gradient coefficient, global convergence

Procedia PDF Downloads 463
2343 The h3r Antagonist E159 Alleviates Neuroinflammation and Autistic-Like Phenotypes in BTBR T+ tf/J Mouse Model of Autism

Authors: Shilu Deepa Thomas, P. Jayaprakash, Dorota Łazewska, Katarzyna Kieć-Kononowicz, B. Sadek

Abstract:

A large body of evidence suggests the involvement of cognitive impairment, increased levels of inflammation, and oxidative stress in the pathogenesis of autism spectrum disorder (ASD). ASD commonly coexists with psychiatric conditions like anxiety and cognitive challenges, and individuals with ASD exhibit significant levels of inflammation and immune system dysregulation. Previous studies have identified elevated levels of pro-inflammatory markers such as IL-1β, IL-6, IL-2 and TNF-α, particularly in young children with ASD. The current therapeutic options for ASD show limited effectiveness, signifying the importance of exploring efficient drugs to address the core symptoms. The role of histamine H3 receptors (H3Rs) in memory and the prospective role of H3R antagonists in the pharmacological control of neurodegenerative disorders, e.g., ASD, are well accepted. Hence, the effects of chronic systemic administration of the H3R antagonist E159 on autistic-like repetitive behaviors, social deficits, memory and anxiety parameters, as well as neuroinflammation, in Black and Tan BRachyury (BTBR) mice were evaluated using the Y maze, Barnes maze, self-grooming, open field and three-chamber social tests. E159 (2.5, 5 and 10 mg/kg, i.p.) dose-dependently ameliorated repetitive and compulsive behaviors by reducing the increased time spent in self-grooming and improved the reduced spontaneous alternation in BTBR mice. Moreover, treatment with E159 attenuated disturbed anxiety levels and social deficits in the tested male BTBR mice. Furthermore, E159 attenuated oxidative stress by significantly increasing GSH, CAT, and SOD and decreasing the elevated levels of MDA in the cerebellum as well as the hippocampus. In addition, E159 decreased the elevated levels of proinflammatory cytokines (tumor necrosis factor (TNF-α), interleukin-1β (IL-1β), and IL-6). The observed results show that H3R antagonists like E159 may represent a promising novel pharmacological strategy for the future treatment of ASD.

Keywords: histamine H3 receptors, antagonist E159, autism, behaviors, mice

Procedia PDF Downloads 64
2342 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

The evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and of the biosphere's intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolutionary theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. Greater memory volume requires a greater number of, and more intellectually advanced, organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or global memory storage volume comes to its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems of current evolutionary theory: speciation, as a result of the GM's purposeful design; the vector of evolutionary development, as a need for growing global intelligence; punctuated equilibrium, happening when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, happening when more intelligent species replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 164
2341 Antibacterial Wound Dressing Based on Metal Nanoparticles Containing Cellulose Nanofibers

Authors: Mohamed Gouda

Abstract:

Antibacterial wound dressings based on cellulose nanofibers containing different metal nanoparticles (CMC-MNPs) were synthesized using an electrospinning technique. First, composites of carboxymethyl cellulose containing different metal nanoparticles (CMC/MNPs), such as copper nanoparticles (CuNPs), iron nanoparticles (FeNPs), zinc nanoparticles (ZnNPs), cadmium nanoparticles (CdNPs) and cobalt nanoparticles (CoNPs), were synthesized, and these composites were then transferred to the electrospinning process. The synthesized CMC-MNPs were characterized using scanning electron microscopy (SEM) coupled with high-energy dispersive X-ray (EDX) spectroscopy, and UV-visible spectroscopy was used to confirm nanoparticle formation. The SEM images clearly showed regular flat shapes with semi-porous surfaces. All MNPs were well distributed inside the backbone of the cellulose without aggregation. The average particle diameters were 29-39 nm for ZnNPs, 29-33 nm for CdNPs, 25-33 nm for CoNPs, 23-27 nm for CuNPs and 22-26 nm for FeNPs. Surface morphology, water uptake, the release of MNPs from the nanofibers in water, and antimicrobial efficacy were studied. SEM images revealed that the electrospun CMC-MNPs nanofibers are smooth and uniformly distributed without bead formation, with average fiber diameters in the range of 300 to 450 nm. Fiber diameters were not affected by the presence of MNPs. TEM images showed that MNPs are present in/on the electrospun CMC-MNPs nanofibers. The diameter of the electrospun nanofibers containing MNPs was in the range of 300–450 nm. The MNPs were observed to be spherical in shape. The CMC-MNPs nanofibers showed good hydrophilic properties and had excellent antibacterial activity against the Gram-negative bacterium Escherichia coli and the Gram-positive bacterium Staphylococcus aureus.

Keywords: electrospinning technique, metal nanoparticles, cellulosic nanofibers, wound dressing

Procedia PDF Downloads 329
2340 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over a hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized machine learning processes to inform the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing and collectively stored within a data lake. Being involved in the digital transformation has necessitated a standardized software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.
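As a purely illustrative companion to the architecture described above, the following Python sketch shows one common edge-side pattern: raw sensor readings are aggregated locally before a compact batch is written into a site/date-partitioned layout of the kind typically used in data lakes. This is not Newcrest's or Insight Enterprises' pipeline; all site names, paths, and field names are hypothetical.

```python
# Minimal, hypothetical sketch of an edge-side aggregation step: raw IoT
# readings are reduced to per-sensor window statistics before being written
# into a site/date-partitioned layout of the kind commonly used in data lakes.
# Site names, paths and field names are illustrative only.
import json
from collections import defaultdict
from datetime import datetime, timezone
from pathlib import Path
from statistics import mean

def aggregate_window(readings):
    """Reduce raw readings to one summary record per sensor."""
    by_sensor = defaultdict(list)
    for r in readings:
        by_sensor[r["sensor_id"]].append(r["value"])
    return [{"sensor_id": sid, "mean": mean(vals), "count": len(vals)}
            for sid, vals in by_sensor.items()]

def write_batch(site, records, root="datalake"):
    """Write an aggregated batch into a site/date-partitioned directory."""
    now = datetime.now(timezone.utc)
    out_dir = Path(root) / f"site={site}" / f"date={now:%Y-%m-%d}"
    out_dir.mkdir(parents=True, exist_ok=True)
    out_file = out_dir / f"batch_{now:%H%M%S}.jsonl"
    with out_file.open("w") as fh:
        for rec in records:
            fh.write(json.dumps(rec) + "\n")
    return out_file

# Example usage with fabricated readings from a fictitious site
raw = [{"sensor_id": "crusher_temp_01", "value": v} for v in (71.2, 73.5, 72.8)]
print(write_batch("site_a", aggregate_window(raw)))
```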

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 112
2339 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents, and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, like major disasters, changes in policies, exogenous shocks, etc., on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the macroeconomic parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable Distributed Memory Parallel (DMP) implementation of ABEMs has been developed using the message passing interface (MPI). A balanced distribution of computational load among MPI processes (i.e. CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g. credit networks) whereas others are dense with random links (e.g. consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions like the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the new features of the latest MPI functions. Most of the communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e. about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro-zone (i.e. 322 million agents).
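The communication-overlap pattern described above can be made concrete with a minimal mpi4py sketch (Python is used here for brevity; this is not the authors' code): agents are partitioned across MPI processes, each process updates its own agents, and a non-blocking reduction of aggregate demand is overlapped with further local computation before the result is awaited.

```python
# Minimal mpi4py sketch (not the authors' implementation) of the pattern
# described above: agents are partitioned across MPI processes, each process
# updates its own agents, and aggregate market information is exchanged with
# non-blocking communication so that it overlaps with local computation.
# Run with e.g.:  mpiexec -n 4 python abem_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_AGENTS = 1_000_000                        # total agents (illustrative)
local_n = N_AGENTS // size                  # mutually exclusive partition
rng = np.random.default_rng(rank)
income = rng.lognormal(mean=10.0, sigma=0.5, size=local_n)

# Local step: each process computes its agents' consumption demand.
local_demand = np.array([np.sum(0.8 * income)])
total_demand = np.zeros(1)

# Start a non-blocking reduction of aggregate demand ...
req = comm.Iallreduce(local_demand, total_demand, op=MPI.SUM)

# ... and overlap it with further local work (e.g. a savings update).
savings = 0.2 * income
savings *= 1.01                             # local interest accrual

req.Wait()                                  # aggregate demand now available
if rank == 0:
    print(f"Aggregate consumption demand: {total_demand[0]:.3e}")
```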

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 127
2338 Using the Dokeos Platform for Industrial E-Learning Solution

Authors: Kherafa Abdennasser

Abstract:

The application of Information and Communication Technologies (ICT) to the training area led to the creation of this new reality called E-learning, which can be described as the marriage of multimedia (sound, image and text) and the internet (online delivery, interactivity). Distance learning has become an important component of training, and it relies in particular on the setup of a distance learning platform. In this work, we use an open source platform named Dokeos to manage a distance training course on GPS, called e-GPS. The learner is followed throughout his or her training. In this system, trainers and learners communicate individually or in groups, while the administrator sets up and maintains the system.

Keywords: ICT, E-learning, learning plate-forme, Dokeos, GPS

Procedia PDF Downloads 477