Search results for: fast Fourier algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4619

3539 Harnessing the Power of Artificial Intelligence: Advancements and Ethical Considerations in Psychological and Behavioral Sciences

Authors: Nayer Mofidtabatabaei

Abstract:

Advancements in artificial intelligence (AI) have transformed various fields, including psychology and behavioral sciences. This paper explores the diverse ways in which AI is applied to enhance research, diagnosis, therapy, and understanding of human behavior and mental health. We discuss the potential benefits and challenges associated with AI in these fields, emphasizing the ethical considerations and the need for collaboration between AI researchers and psychological and behavioral science experts. Artificial intelligence has gained prominence in recent years, revolutionizing multiple industries, including healthcare, finance, and entertainment. One area where AI holds significant promise is the field of psychology and behavioral sciences, where applications range from improving the accuracy of diagnosis and treatment to understanding complex human behavior patterns. This paper aims to provide an overview of the various AI applications in psychological and behavioral sciences, highlighting their potential impact, challenges, and ethical considerations.

Mental health diagnosis: AI-driven tools, such as natural language processing and sentiment analysis, can analyze large datasets of text and speech to detect signs of mental health issues. For example, chatbots and virtual therapists can provide initial assessments and support to individuals suffering from anxiety or depression. Autism spectrum disorder (ASD) diagnosis: AI algorithms can assist in early ASD diagnosis by analyzing video and audio recordings of children's behavior. These tools help identify subtle behavioral markers, enabling earlier intervention and treatment. Personalized therapy: AI-based therapy platforms use personalized algorithms to adapt therapeutic interventions based on an individual's progress and needs. These platforms can provide continuous support and resources for patients, making therapy more accessible and effective. Virtual reality therapy: virtual reality (VR) combined with AI can create immersive therapeutic environments for treating phobias, PTSD, and social anxiety. AI algorithms can adapt VR scenarios in real time to suit the patient's progress and comfort level. Data analysis: AI aids researchers in processing vast amounts of data, including survey responses, brain imaging, and genetic information.

Privacy concerns: collecting and analyzing personal data for AI applications in psychology and behavioral sciences raises significant privacy concerns, and researchers must ensure the ethical use and protection of sensitive information. Bias and fairness: AI algorithms can inherit biases present in training data, potentially leading to biased assessments or recommendations, so efforts to mitigate bias and ensure fairness in AI applications are crucial. Transparency and accountability: AI-driven decisions in psychology and behavioral sciences should be transparent and subject to accountability; patients and practitioners should understand how AI algorithms operate and make decisions.

AI applications in psychological and behavioral sciences have the potential to transform the field by enhancing diagnosis, therapy, and research. However, these advancements come with ethical challenges that require careful consideration. Collaboration between AI researchers and psychological and behavioral science experts is essential to harness AI's full potential while upholding ethical standards and privacy protections. The future of AI in psychology and behavioral sciences holds great promise, but it must be navigated with caution and responsibility.
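The sentiment-analysis screening mentioned above can be illustrated with a deliberately simplified sketch; the word lists and threshold below are invented for illustration and bear no relation to validated clinical instruments:

```python
# Minimal lexicon-based sentiment screening sketch.
# The word lists and the flagging threshold are illustrative only.

NEGATIVE_WORDS = {"hopeless", "anxious", "worthless", "tired", "sad"}
POSITIVE_WORDS = {"happy", "hopeful", "calm", "energetic", "content"}

def sentiment_score(text: str) -> int:
    """Return (#positive - #negative) lexicon hits in the text."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    return pos - neg

def flag_for_followup(text: str, threshold: int = -2) -> bool:
    """Flag text whose sentiment score falls at or below the threshold."""
    return sentiment_score(text) <= threshold
```

Real systems would of course use trained language models and validated screening scales rather than a fixed lexicon; the sketch only shows the flag-on-negative-signal idea.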

Keywords: artificial intelligence, psychological sciences, behavioral sciences, diagnosis and therapy, ethical considerations

Procedia PDF Downloads 73
3538 Synthesis and Characterization of Magnesium and Strontium Doped Sulphate-Hydroxyapatite

Authors: Ammar Z. Alshemary, Yi-Fan Goh, Rafaqat Hussain

Abstract:

Magnesium (Mg2+), strontium (Sr2+) and sulphate (SO42-) ions were successfully substituted into the hydroxyapatite (Ca10-x-yMgxSry(PO4)6-z(SO4)z(OH)2-z) structure through an ion exchange process at the cationic and anionic sites. The Mg2+ and Sr2+ ion concentrations were varied between 0.00 and 0.10, keeping the concentration of SO42- ions at z = 0.05. [Mg(NO3)2], [Sr(NO3)2] and Na2SO4 were used as the Mg2+, Sr2+, and SO42- sources, respectively. The synthesized white precipitates were subjected to heat treatment at 500°C and finally characterized by X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FTIR). The results showed that the substitution of Mg2+, Sr2+ and SO42- ions into the HA lattice resulted in broadening and reduced intensity of the XRD peaks, confirming that crystallinity was reduced by the substitution of ions. Similarly, the FTIR results showed the effect of substitution on the phosphate bands, as well as the exchange of hydroxyl groups by SO42- ions to balance the charges on the HA surface.

Keywords: hydroxyapatite, substitution, characterization, XRD, FTIR

Procedia PDF Downloads 444
3537 Muscle Neurotrophins Family Response to Resistance Exercise

Authors: Rasoul Eslami, Reza Gharakhanlou

Abstract:

NT-4/5 and its receptor TrkB have been proposed to be involved in the coordinated adaptations of the neuromuscular system to an elevated level of activity. Despite the persistent expression of this neurotrophin and its receptor in adult skeletal muscle, little attention has been paid to the functional significance of this complex in the mature neuromuscular system. Therefore, the purpose of this research was to study the effect of one session of resistance exercise on the mRNA expression of NT-4/5 and TrkB in slow and fast muscles of Wistar rats. Male Wistar rats (10 mo of age, obtained from the Pasteur Institute) were housed under similar living conditions in cages (in groups of four) at room temperature under a controlled light/dark (12-h) cycle with ad libitum access to food and water. Sixteen rats were randomly divided into two groups (resistance exercise (T) and control (C); n = 8 for each group). The resistance training protocol consisted of climbing a 1-meter-long ladder with a weight attached to a tail sleeve. Twenty-four hours following the main training session, rats of the T and C groups were anaesthetized, and the right soleus and flexor hallucis longus (FHL) muscles were removed under sterile conditions via an incision on the dorsolateral aspect of the hind limb. NT-4/5 and TrkB expression was measured by quantitative real-time RT-PCR. SPSS software and independent-samples t-tests were used for data analysis. The level of significance was set at P < 0.05. The data indicate that resistance training significantly (P < 0.05) decreased the mRNA expression of NT-4/5 in the soleus muscle, whereas no significant alteration was detected in the FHL muscle (P > 0.05). Our results also indicate no significant alterations in TrkB mRNA expression in the soleus and FHL muscles (P > 0.05). The decrease in NT-4/5 mRNA expression in the soleus muscle may be a result of post-translational regulation following resistance training, and the unaltered TrkB mRNA expression may indicate a probable role of the p75 receptor.
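The independent-samples t-test used for the group comparison above can be sketched in a self-contained form (Welch's version of the statistic; the data in the test are illustrative, not the actual expression measurements):

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's independent-samples t statistic for two samples."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    # Unbiased sample variances.
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

In practice the statistic is compared against the t distribution (as SPSS does) to obtain the P value; only the statistic itself is sketched here.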

Keywords: neurotrophin-4/5 (NT-4/5), TrkB receptor, resistance training, slow and fast muscles

Procedia PDF Downloads 446
3536 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector

Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh

Abstract:

A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years, and it has become more important to understand customers' needs in this strong market, especially for customers who are looking to change their service providers. Churn prediction is therefore a mandatory requirement for retaining those customers, and machine learning can be utilized to accomplish this. Churn prediction has become a very important topic in machine learning classification for the telecommunications industry, and understanding the factors behind customer churn and how customers behave is essential to building an effective churn prediction model. This paper aims to predict churn and identify the factors behind customers' churn based on their past service usage history. Toward this objective, the study makes use of feature selection, normalization, and feature engineering. It then compares the performance of four machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1 score and ROC-AUC. The results showed that Gradient Boosting with the feature selection technique outperformed the others in this study, achieving a 99% F1 score and 99% AUC, and all other experiments achieved good results as well, improving on existing models.
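The two evaluation metrics named above, the F1 score and ROC-AUC, can be computed directly from labels and scores; a minimal library-free sketch is:

```python
def f1_score(y_true, y_pred):
    """F1 = 2*precision*recall / (precision + recall) for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def roc_auc(y_true, scores):
    """AUC as the probability that a positive outranks a negative (ties = 0.5)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

In a study like this one the labels would be churn/no-churn and the scores would come from the trained classifier; the metric definitions themselves are standard.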

Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score

Procedia PDF Downloads 134
3535 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties

Authors: G. Martino, F. Silva, E. Marchal

Abstract:

Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process, so end-to-end integrated planning of mine operations can reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made without coordination, the beneficiation plant struggles to deliver the best blend possible. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms applied to historical production data generate reasonable predictions of the quality and volume of iron ore produced for each pile of run-of-mine (ROM) ore processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization combined with a planning optimization model spanning the mine and the ore beneficiation processes can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing cost or increasing profit by optimizing process indicators across the production chain and integrating the different planning stages with the sales decisions.
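To illustrate the kind of blend decision such a model supports, the following sketch brute-forces pile proportions against a grade specification; the pile grades, the specification window, and the assumption that pile C is the cheapest are all invented for illustration and are far simpler than a real beneficiation model:

```python
# Illustrative only: Fe grades per ROM pile are invented numbers.
PILE_FE = {"A": 66.0, "B": 62.0, "C": 58.0}  # % Fe

def best_blend(spec_min=62.0, spec_max=64.0, step=0.05):
    """Grid-search fractions of piles A and B (C takes the remainder) and
    return the in-spec blend that maximizes the share of the cheap pile C."""
    steps = [k * step for k in range(int(round(1 / step)) + 1)]
    best, best_c = None, -1.0
    for a in steps:
        for b in steps:
            c = 1.0 - a - b
            if c < -1e-9:
                continue          # fractions must sum to 1
            c = max(c, 0.0)
            fe = a * PILE_FE["A"] + b * PILE_FE["B"] + c * PILE_FE["C"]
            if spec_min - 1e-9 <= fe <= spec_max + 1e-9 and c > best_c:
                best, best_c = (round(a, 2), round(b, 2), round(c, 2)), c
    return best
```

A real integrated planner would replace the grid search with mathematical programming over many piles, periods, and quality variables, but the feasibility-plus-objective structure is the same.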

Keywords: clusterization and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization

Procedia PDF Downloads 124
3534 Eco-Friendly Natural Filler Based Epoxy Composites

Authors: Suheyla Kocaman, Gulnare Ahmetli

Abstract:

In this study, acrylated soybean oil (AESO) was used as a modifying agent for a DGEBF-type epoxy resin (ER). AESO was used as a co-matrix at 50 wt% with the ER. Composites with eco-friendly natural fillers, banana bark and seashell, were prepared, with MNA used as a hardener. The effects of the banana peel (BP) and seashell (SSh) fillers on mechanical properties such as tensile strength, elongation at break, and hardness of the M-ERs were investigated. The structures of the epoxy resins (M-ERs) cured with the MNA and sebacic acid (SAc) hardeners were characterized by Fourier transform infrared spectroscopy (FTIR). Tensile test results show that the Young's (elastic) modulus, tensile strength, and hardness of the M-ERs reinforced with SSh particles were higher than those of the M-ERs reinforced with banana bark.

Keywords: biobased composite, epoxy resin, mechanical properties, natural fillers

Procedia PDF Downloads 242
3533 State of Freelancing in IT and Future Trends

Authors: Mihai Gheorghe

Abstract:

Freelancing in IT has seen increased popularity in recent years, mainly because of fast Internet adoption in countries with emerging economies, correlated with the continuous search for reduced development costs, as well as the rise of online platforms that address planning, coordination, and various development tasks. This paper conducts an overview of the most relevant freelance marketplaces available and studies the market structure, the distribution of the workforce, and trends in IT freelancing.

Keywords: freelancing in IT, freelance marketplaces, freelance market structure, globalization, online staffing, trends in freelancing

Procedia PDF Downloads 207
3532 Synthesis of Epoxidized Castor Oil Using a Sulphonated Polystyrene Type Cation Exchange Resin and Its Blend Preparation with Epoxy Resin

Authors: G. S. Sudha, Smita Mohanty, S. K. Nayak

Abstract:

Epoxidized oils can replace petroleum-derived materials in numerous industrial applications because of their respectable oxirane oxygen content and the high reactivity of the oxirane ring. Epoxidized castor oil (ECO) was synthesized in the presence of a sulphonated polystyrene type cation exchange resin. The formation of the oxirane ring was confirmed by Fourier transform infrared spectroscopy (FTIR) analysis, and the epoxidation reaction was evaluated by nuclear magnetic resonance (NMR) studies. ECO is used as a toughening phase to increase the toughness of petroleum-based epoxy resin.

Keywords: epoxy resin, epoxidized castor oil, sulphonated polystyrene type cation exchange resin, petroleum derived materials

Procedia PDF Downloads 475
3531 Peril's Environment of Energetic Infrastructure Complex System, Modelling by the Crisis Situation Algorithms

Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.

Abstract:

Crisis situation investigation and modelling are introduced and carried out within the complex system of energetic critical infrastructure operating in perilous environments. Every crisis situation and peril originates in the occurrence of an emergency/crisis event, and both require critical/crisis interface assessment. An emergency event may be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with it, or it may be unexpected, without a pre-prepared scenario; both cases, however, need operational coping by means of crisis management. The operation, forms, characteristics, behaviour, and utilization of crisis management have various qualities, depending on the real perils facing the critical infrastructure organization and on its prevention and training processes. The aim is always better security and continuity of the organization, and achieving it requires finding and investigating the critical/crisis zones and functions in models of the critical infrastructure organization operating in the pertinent perilous environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. Here, it is necessary to derive and create an identification algorithm for the critical/crisis interfaces; the locations of these interfaces are the flags of a crisis situation in models of the critical infrastructure organization. The model of a crisis situation is then displayed for a real organization of the Czech energetic critical infrastructure in a real peril environment. These efficient measures are necessary for infrastructure protection and will be derived for peril mitigation, crisis situation coping, and the environmentally friendly survival, continuity, and advanced sustainable development of the organization.
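The abstract does not disclose the DYVELOP identification algorithm itself, but one generic proxy for locating critical interfaces in a process model is to flag the articulation points of its graph, i.e., the nodes whose failure disconnects the flow; a minimal sketch under that assumption:

```python
def articulation_points(graph):
    """Return the nodes whose removal disconnects an undirected graph.
    graph: dict mapping node -> iterable of neighbour nodes."""
    disc, low, result = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in graph[u]:
            if v == parent:
                continue
            if v in disc:                      # back edge
                low[u] = min(low[u], disc[v])
            else:                              # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    result.add(u)              # non-root cut vertex
        if parent is None and children > 1:
            result.add(u)                      # root with >1 DFS subtree

    for node in graph:
        if node not in disc:
            dfs(node, None)
    return result
```

In an infrastructure model, such cut vertices would mark single points of failure worth flagging as crisis interfaces; the actual DYVELOP criteria are presumably richer than pure connectivity.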

Keywords: algorithms, energetic infrastructure complex system, modelling, peril's environment

Procedia PDF Downloads 403
3530 Finding a Set of Long Common Substrings with Repeats from m Input Strings

Authors: Tiantian Li, Lusheng Wang, Zhaohui Zhan, Daming Zhu

Abstract:

In this paper, we propose two string problems and study algorithms and complexity for various versions of those problems. Let S = {s₁, s₂, . . . , sₘ} be a set of m strings. A common substring of S is a substring appearing in every string in S. Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer k, we want to find a set C of k common substrings of S such that the k common substrings in C appear in the same order and have no overlap among the m input strings in S, and the total length of the k common substrings in C is maximized. This problem is referred to as the longest total length of k common substrings from m input strings (LCSS(k, m) for short). The other problem we study here is called the longest total length of a set of common substrings with length more than l from m input strings (LSCSS(l, m) for short). Given a set of m strings S = {s₁, s₂, . . . , sₘ} and a positive integer l, for LSCSS(l, m) we want to find a set of common substrings of S, each of length more than l, such that the total length of all the common substrings is maximized. We show that both problems are NP-hard when k and m are variables. We propose dynamic programming algorithms with time complexity O(kn₁n₂) and O(n₁n₂) to solve LCSS(k, 2) and LSCSS(l, 2), respectively, where n₁ and n₂ are the lengths of the two input strings. We then design an algorithm for LSCSS(l, m) for the case where every length > l common substring appears once in each of the m − 1 input strings. The running time is O(n₁²m), where n₁ is the length of the input string with no restriction on length > l common substrings. Finally, we propose a fixed-parameter algorithm for LSCSS(l, m), where each length > l common substring appears m − 1 + c times among the m − 1 input strings (other than s₁). In other words, each length > l common substring may repeatedly appear at most c times among the m − 1 input strings {s₂, s₃, . . . , sₘ}. The running time of the proposed algorithm is O((n₁2ᶜ)²m), where n₁ is the length of the input string with no restriction on repeats. LSCSS(l, m) is proposed to handle whole-chromosome sequence alignment for different strains of the same species, where more than 98% of the letters in core regions are identical.
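The O(n₁n₂) dynamic programs mentioned above build on the classic two-string common-substring recurrence; as a minimal illustration of that building block (not the LCSS(k, 2) algorithm itself):

```python
def longest_common_substring(s1, s2):
    """Classic DP: dp[i][j] = length of the common suffix of s1[:i] and
    s2[:j]. Runs in O(len(s1) * len(s2)) time and space."""
    n1, n2 = len(s1), len(s2)
    dp = [[0] * (n2 + 1) for _ in range(n1 + 1)]
    best_len, best_end = 0, 0
    for i in range(1, n1 + 1):
        for j in range(1, n2 + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
                if dp[i][j] > best_len:
                    best_len, best_end = dp[i][j], i
    return s1[best_end - best_len:best_end]
```

The paper's algorithms extend this table with ordering and non-overlap constraints to select k substrings rather than one.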

Keywords: dynamic programming, algorithm, common substrings, string

Procedia PDF Downloads 22
3529 The Determination of the Zinc Sulfate, Sodium Hydroxide and Boric Acid Molar Ratio on the Production of Zinc Borates

Authors: N. Tugrul, A. S. Kipcak, E. Moroydor Derun, S. Piskin

Abstract:

Zinc borate is an important boron compound that can be used as a multi-functional flame-retardant additive due to its high dehydration temperature. In this study, the raw materials ZnSO4.7H2O, NaOH, and H3BO3 were characterized by X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FT-IR) and used in the synthesis of zinc borates. The synthesis parameters were set to a reaction temperature of 100°C and a reaction time of 120 minutes, with different molar ratios of the starting materials (ZnSO4.7H2O:NaOH:H3BO3). After the zinc borate synthesis, the products were identified by XRD and FT-IR. As a result, zinc oxide borate hydrate [Zn3B6O12.3.5H2O] was synthesized at the molar ratios of 1:1:3, 1:1:4, 1:2:5, and 1:2:6; among these, 1:2:6 gave the best results.

Keywords: Zinc borate, ZnSO4.7H2O, NaOH, H3BO3, XRD, FT-IR

Procedia PDF Downloads 361
3528 Numerical Iteration Method to Find New Formulas for Nonlinear Equations

Authors: Kholod Mohammad Abualnaja

Abstract:

A new algorithm is presented to derive new iterative methods for solving nonlinear equations F(x) = 0 by using the variational iteration method. The efficiency of the considered method is illustrated by an example. The results show that the proposed iteration technique, without linearization or small perturbation, is very effective and convenient.
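As a hedged sketch of the kind of correction-functional iteration such a scheme produces, here instantiated with the classical Newton correction rather than the paper's own derived formulas:

```python
def solve(F, dF, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - F(x_n)/dF(x_n) until |F(x)| < tol.
    F is the nonlinear function, dF its derivative, x0 the initial guess."""
    x = x0
    for _ in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            return x
        x = x - fx / dF(x)  # correction step
    return x
```

Variational-iteration-derived methods replace this correction term with one built from a Lagrange multiplier, but the overall iterate-until-residual-is-small loop is the same.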

Keywords: variational iteration method, nonlinear equations, Lagrange multiplier, algorithms

Procedia PDF Downloads 545
3527 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data

Authors: S. Jurado, E. Pazmino

Abstract:

Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, oil extraction, etc. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from high-resolution X-ray microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions in order to determine porosity. The algorithm then identifies the layer of void voxels next to the solid boundaries, and an iterative process removes, or 'burns', void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and computer memory use, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis was determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and used to determine the pore-throat size distribution. A graphical user interface was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats.
Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution, and porosity determination for 100³, 320³, and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community.
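The layer-by-layer 'burn' described above can be sketched in two dimensions (the 3D voxel case simply adds two more neighbours per cell; treating out-of-bounds cells as solid boundary is an assumption of this sketch):

```python
def burn_layers(grid):
    """grid: 2D list with 1 = solid, 0 = void. Returns a same-shaped list in
    which each void cell holds its burn layer (1 = adjacent to solid) and
    solid cells hold 0. Out-of-bounds cells count as solid boundary."""
    rows, cols = len(grid), len(grid[0])
    layer = [[0 if grid[r][c] == 1 else None for c in range(cols)]
             for r in range(rows)]
    current = 1
    while any(None in row for row in layer):
        to_burn = []
        for r in range(rows):
            for c in range(cols):
                if layer[r][c] is not None:
                    continue  # solid or already burnt
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    outside = not (0 <= nr < rows and 0 <= nc < cols)
                    if outside or (layer[nr][nc] is not None
                                   and layer[nr][nc] < current):
                        to_burn.append((r, c))
                        break
        for r, c in to_burn:  # burn the whole layer at once
            layer[r][c] = current
        current += 1
    return layer
```

Cells where the highest burn layers meet (local maxima of this map) approximate the medial axis; the study's 3D implementation additionally refines that skeleton against concave-grain effects.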

Keywords: medial axis, pore-throat distribution, porosity, porous media

Procedia PDF Downloads 116
3526 Synthesis and Application of an Organic Dye in Nanostructure Solar Cells Device

Authors: M. Hoseinnezhad, K. Gharanjig

Abstract:

Two organic dyes comprising carbazole as the electron donor and cyanoacetic acid moieties as the electron acceptor were synthesized. The organic dyes were prepared by standard reactions from carbazole as the starting material: carbazole was reacted with bromobenzene, oxidized, and then reacted with cyanoacetic acid. The obtained organic dyes were purified and characterized using differential scanning calorimetry (DSC), Fourier transform infrared spectroscopy (FT-IR), proton nuclear magnetic resonance (1H NMR), carbon nuclear magnetic resonance (13C NMR), and elemental analysis. The influence of the heteroatom on the carbazole donors and of cyano substitution on the acid acceptor is evidenced by spectral, electrochemical, and photovoltaic experiments. Finally, the lightfastness properties of the organic dyes were investigated.

Keywords: dye-sensitized solar cells, indoline dye, nanostructure, oxidation potential, solar energy

Procedia PDF Downloads 195
3525 A Graph Library Development Based on the Service-Oriented Architecture: Used for Representation of the Biological Systems in the Computer Algorithms

Authors: Mehrshad Khosraviani, Sepehr Najjarpour

Abstract:

Considering the usage of graph-based approaches in systems and synthetic biology, and the various types of graphs employed by them, a comprehensive graph library based on the three-tier architecture (3TA) was previously introduced for full representation of biological systems. Despite having proposed a 3TA-based graph library, three reasons motivated us to redesign the graph library based on the service-oriented architecture (SOA): (1) Maintaining the accuracy of the data related to an input graph (including its edges, vertices, topology, etc.) without involving the end user. Since, in the case of 3TA, the library files are available to the end users, they may be utilized incorrectly, and consequently, invalid graph data will be provided to the computer algorithms. With the SOA, however, the graph registration operation is specified as a service through encapsulation of the library files; in other words, all control operations needed for registration of valid data become the responsibility of the services. (2) Partitioning the library product into separate parts. Under 3TA, the library was provided as a whole, while here the product can be divided into smaller ones, such as an AND/OR graph drawing service, each of which can be provided individually. As a result, the end user is able to select any part of the library product, instead of all its features, to add to a project. (3) Reducing complexity. While using 3TA, several other libraries had to be added for connecting to the database; in the SOA-based graph library, responsibility for providing the needed library resources is entrusted to the services themselves. Therefore, the end user who wants to use the graph library is not involved with its complexity.
In the end, in order to make the library easier to control in the system, and to restrict the end user from accessing the files, the service-oriented architecture (SOA) was preferred over the three-tier architecture (3TA), and the previously proposed graph library was redeveloped based on it.
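A minimal sketch of the encapsulation idea behind reason (1), with invented class and method names since the abstract does not disclose the actual service interfaces: graph registration validates input inside the service, so end users never touch the library's internal structures:

```python
class GraphRegistrationService:
    """Accepts graph data only through a validating service operation,
    so callers cannot hand invalid structures to the algorithms."""

    def __init__(self):
        self._graphs = {}  # private storage, not exposed to end users

    def register(self, name, vertices, edges):
        """Validate and store a graph; reject edges over unknown vertices."""
        vertex_set = set(vertices)
        for u, v in edges:
            if u not in vertex_set or v not in vertex_set:
                raise ValueError(f"edge ({u}, {v}) references unknown vertex")
        self._graphs[name] = {"vertices": vertex_set, "edges": list(edges)}
        return True

    def adjacency(self, name):
        """Serve a derived view instead of the raw stored structure."""
        g = self._graphs[name]
        adj = {v: [] for v in g["vertices"]}
        for u, v in g["edges"]:
            adj[u].append(v)
        return adj
```

In a full SOA deployment each such class would sit behind a remote service endpoint; here the class boundary stands in for the service boundary.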

Keywords: Bio-Design Automation, Biological System, Graph Library, Service-Oriented Architecture, Systems and Synthetic Biology

Procedia PDF Downloads 311
3524 Antibacterial Activity of Nickel Oxide Composite Films with Chitosan/Polyvinyl Chloride/Polyethylene Glycol

Authors: Ali Garba Danjani, Abdulrasheed Halliru Usman

Abstract:

Due to the rapidly increasing biological applications and antibacterial properties of versatile chitosan composites, the effects of chitosan/polyvinyl chloride composite films were investigated. Chitosan/polyvinyl chloride films were prepared by a casting method, with polyethylene glycol (PEG) used as a plasticizer in the blending stage of film preparation. The films were characterized by scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), and thermogravimetric analysis (TGA). Incorporation of the chitosan composites enhanced the antibacterial activity of the chitosan films against Escherichia coli and Staphylococcus aureus. The composite film produced is proposed as a packaging or coating material because of its flexibility, antibacterial efficacy, and good mechanical strength.

Keywords: chitosan, polymeric nanocomposites, antibacterial activity, polymer blend

Procedia PDF Downloads 100
3523 Thermal Stability and Insulation of a Cement Mixture Using Graphene Oxide Nanosheets

Authors: Nasser A. M. Habib

Abstract:

The impressive physical properties of graphene derivatives, including their thermal properties, have made them an attractive addition to advanced construction nanomaterials. In this study, we investigated the impact of incorporating low amounts of graphene oxide (GO) into cement mixture nanocomposites on their heat storage and thermal stability. The composites were analyzed using Fourier transform infrared spectroscopy, thermogravimetric analysis, and field emission scanning electron microscopy. Results showed that GO significantly improved the specific heat by 32%, reduced the thermal conductivity by 16%, and reduced thermal decomposition to only 3% at a concentration of 1.2 wt%. These findings suggest that the cement mixture can withstand high temperatures and may suit specific applications requiring thermal stability and insulation properties.

Keywords: cement mixture composite, graphene oxide, thermal decomposition, thermal conductivity

Procedia PDF Downloads 70
3522 Machine Learning in Patent Law: How Genetic Breeding Algorithms Challenge Modern Patent Law Regimes

Authors: Stefan Papastefanou

Abstract:

Artificial intelligence (AI) is an interdisciplinary field of computer science with the aim of creating intelligent machine behavior. Early approaches to AI were configured to operate in very constrained environments where the behavior of the AI system was determined in advance by formal rules. Knowledge was represented as a set of rules that allowed the AI system to determine the results for specific problems: a structure of if-else rules that could be traversed to find a solution to a particular problem or question. However, such rule-based systems typically have not been able to generalize beyond the knowledge provided. All over the world, and especially in IT-heavy jurisdictions such as the United States, the European Union, Singapore, and China, machine learning has developed into an immense asset, and its applications are becoming more and more significant. It must be examined how such products of machine learning models can and should be protected by IP law, and for the purposes of this paper by patent law specifically, since it is the IP law regime closest to technical inventions and computing methods in technical applications. Genetic breeding models are currently less popular than recursive neural network methods and deep learning, but this approach can be described more easily by reference to the evolution of natural organisms, and with increasing computational power, the genetic breeding method, as a subset of the evolutionary algorithm models, is expected to regain popularity. The research method focuses on the patentability (under the world's most significant patent law regimes, such as those of China, Singapore, the European Union, and the United States) of AI inventions and machine learning. Questions of the technical nature of the problem to be solved, of the inventive step as such, and of the state of the art and the associated obviousness of the solution arise in current patenting processes.
Most importantly, and the key focus of this paper, is the problem of patenting inventions that are themselves developed through machine learning. Under the current legal situation in most patent law regimes, the inventor named in a patent application must be a natural person or a group of persons. In order to be considered an 'inventor', a person must actually have developed part of the inventive concept. The mere application of machine learning or an AI algorithm to a particular problem should not be construed as the algorithm contributing to part of the inventive concept. However, when machine learning or the AI algorithm has contributed to part of the inventive concept, there is currently a lack of clarity regarding the ownership of artificially created inventions. Since not only all European patent law regimes but also the Chinese and Singaporean approaches use identical terms, this paper ultimately offers a comparative analysis of the most relevant patent law regimes.
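As a generic illustration of the genetic breeding (evolutionary) approach discussed above, and not of any patented system, a minimal genetic algorithm that evolves a single real-valued gene toward a fitness maximum might look like:

```python
import random

def genetic_maximize(fitness, bounds, pop_size=30, generations=60, seed=0):
    """Minimal genetic algorithm over one real-valued gene:
    selection (keep the top half), blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                     # blend crossover
            child += rng.gauss(0, 0.05 * (hi - lo))  # mutation
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return max(pop, key=fitness)
```

The legal questions in the paper arise precisely because such a loop, not a human, performs the search that may yield part of an inventive concept.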

Keywords: algorithms, inventor, genetic breeding models, machine learning, patentability

Procedia PDF Downloads 109
3521 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning

Authors: Saahith M. S., Sivakami R.

Abstract:

In the realm of football analytics, and particularly in predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study presents a detailed examination of predicting football player performance by integrating data visualization methods and machine learning algorithms. The research entails compiling an extensive dataset of player attributes, followed by data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis assesses feature significance using methods such as SelectKBest and Recursive Feature Elimination (RFE) to pinpoint the attributes most relevant to player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), are explored to develop the predictive models. Each model's performance is evaluated using metrics such as Mean Squared Error (MSE) and R-squared to gauge its efficacy in predicting player performance. The investigation also encompasses a top-player analysis to identify the best-performing players based on predicted overall performance scores; a nationality analysis examining the player distribution by nationality and potential correlations between nationality and performance; a positional analysis of the player distribution across positions and the average performance of players in each position; and an age analysis evaluating the influence of age on performance and identifying discernible trends or patterns associated with player age groups. 
The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
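As a rough illustration of the pipeline described above (feature selection with RFE followed by Random Forest regression, evaluated with MSE and R-squared), the following sketch uses synthetic data in place of the study's player-attribute dataset; all feature counts and parameter values are illustrative assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in for a player-attribute dataset:
# 500 players, 12 numeric attributes (e.g. pace, passing, dribbling, ...).
X, y = make_regression(n_samples=500, n_features=12, n_informative=6,
                       noise=10.0, random_state=42)

# Recursive Feature Elimination keeps the 6 attributes the forest
# judges most relevant to the overall performance score.
selector = RFE(RandomForestRegressor(n_estimators=50, random_state=42),
               n_features_to_select=6)
X_sel = selector.fit_transform(X, y)

# Train/test split, model fit, and the two metrics named in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=42)
model = RandomForestRegressor(n_estimators=100, random_state=42).fit(X_tr, y_tr)
pred = model.predict(X_te)

print("MSE:", mean_squared_error(y_te, pred))
print("R^2:", r2_score(y_te, pred))
```

The same scaffold accommodates the other model families named above (Decision Tree, Linear Regression, SVR, ANN) by swapping the estimator.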

Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis

Procedia PDF Downloads 39
3520 A Molding Surface Auto-inspection System

Authors: Ssu-Han Chen, Der-Baau Perng

Abstract:

The molding process in IC manufacturing secures chips against damage from heat, moisture, or other external forces. While a chip is being molded, defects such as cracks, dilapidation, or voids may become embedded in the molding surface. The molding surfaces treated in this study differ from those on the market in that texture resembling defects covers the entire surface. Manual inspection usually passes over low-contrast cracks or voids; hence, an automatic optical inspection system for the molding surface is necessary. The proposed system consists of a CCD camera, a coaxial light, a back light, and a motion control unit. Based on the statistical texture properties of the molding surface, a series of digital image processing and classification procedures is carried out. After training of the parameters associated with the above algorithm, the experimental results show an accuracy rate of up to 93.75%, contributing to the inspection quality of IC molding surfaces.
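The Fourier-domain handling of statistical texture that the keywords point to can be sketched as follows. This is a generic illustration of suppressing a periodic surface texture with a 2-D FFT so that low-contrast defects stand out; the image, threshold percentile, and defect are simulated, and this is not the authors' actual inspection pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def texture_residual(img):
    """Zero out the dominant frequency components (the repeating texture)
    in the Fourier domain and return the residual image, where faint
    defects become visible."""
    F = np.fft.fft2(img)
    mag = np.abs(F)
    thresh = np.percentile(mag, 99)       # illustrative percentile
    mask = mag < thresh                   # drop the strongest components
    mask[0, 0] = True                     # keep DC: preserve brightness
    return np.real(np.fft.ifft2(F * mask))

# Simulated molding surface: periodic texture, noise, plus a small
# low-contrast void-like defect.
x = np.arange(128)
img = 0.5 * np.sin(2 * np.pi * x / 8)[None, :] \
      + 0.1 * rng.standard_normal((128, 128))
img[60:68, 60:68] += 0.4                  # faint 8x8 defect

residual = texture_residual(img)
defect_score = np.abs(residual)[60:68, 60:68].mean()
background = np.abs(residual)[:32, :32].mean()
print(defect_score > 2 * background)      # defect region stands out
```

A classifier (as in the paper's procedure) would then be trained on statistics of such residual patches rather than on raw pixels.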

Keywords: molding surface, machine vision, statistical texture, discrete Fourier transformation

Procedia PDF Downloads 432
3519 Reshaping of Indian Education System with the Help of Multi-Media: Promises and Pitfalls

Authors: Geetu Gahlawat

Abstract:

The education system receives information daily through a variety of multimedia channels, which challenges educators to hold learners' attention. Multimedia technology enhances the education system: educators can deliver their content effectively and without limits through multimedia elements, while learners gain easier access to learning and reach their goals faster. This paper gives an overview of how multimedia is reshaping the Indian education system, with its promises and pitfalls.

Keywords: multimedia, technology, techniques, development, pedagogy

Procedia PDF Downloads 282
3518 Numerical Experiments for the Purpose of Studying Space-Time Evolution of Various Forms of Pulse Signals in the Collisional Cold Plasma

Authors: N. Kh. Gomidze, I. N. Jabnidze, K. A. Makharadze

Abstract:

The influence of plasma inhomogeneities and statistical characteristics on signal propagation is highly relevant to wireless communication systems. While propagating through the medium, the signal deforms and evolves in time and space, so a deformed signal arrives at the receiver. The present article is dedicated to studying the space-time evolution of rectangular, sinusoidal, exponential, and bi-exponential pulses via numerical experiments in a collisional, cold plasma. The presented method is not based on a Fourier representation of the signal. Analytically, we have derived a general expression describing the space-time evolution of the radio pulse amplitude, which makes it possible to analyze concrete results for a given initial pulse.

Keywords: collisional, cold plasma, rectangular pulse signal, impulse envelope

Procedia PDF Downloads 384
3517 Graphene-Oxide-Supported Coal-Layered Double Hydroxides: Synthesis and Characterizations

Authors: Shaeel A. Al Thabaiti, Sulaiman N. Basahel, Salem M. Bawaked, Mohamed Mokhtar

Abstract:

Nanosheets of cobalt-layered double hydroxide (Co-Al-LDH)/GO were successfully synthesized with different Co:Mg:Al ratios (0:3:1, 1.5:1.5:1, and 3:0:1). The layered double hydroxide structure and morphology were determined using X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). Temperature-programmed reduction (TPR) of the Co-Al-LDH showed reduction peaks at lower temperatures, indicating the easy reducibility of this particular sample. The thermal behaviour was studied using thermogravimetry (TG), and the BET surface area was determined using N2 physisorption at -196°C. The C-C coupling reaction was carried out over all the investigated catalysts. The Mg-Al LDH catalyst without Co ions is inactive, but the isomorphic substitution of Mg by Co ions (Co:Mg:Al = 1.5:1.5:1) in the cationic sheet resulted in 88% conversion of iodobenzene under reflux. The LDH/GO hybrid shows up to two times higher activity than the unsupported LDH.

Keywords: adsorption, co-precipitation, graphene oxide, layer double hydroxide

Procedia PDF Downloads 301
3516 The Optimization of Copper Sulfate and Tincalconite Molar Ratios on the Hydrothermal Synthesis of Copper Borates

Authors: E. Moroydor Derun, N. Tugrul, F. T. Senberber, A. S. Kipcak, S. Piskin

Abstract:

In this research, copper borates are synthesized by the reaction of copper sulfate pentahydrate (CuSO4·5H2O) and tincalconite (Na2B4O7·5H2O). The experimental parameters are a reaction temperature of 80°C and a reaction time of 60 min. The effect of the mole ratio of CuSO4·5H2O to Na2B4O7·5H2O is studied. For the identification analyses, X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FT-IR) are used. At the end of the experiments, the synthesized copper borate matches the powder diffraction file "00-001-0472" [Cu(BO2)2], and characteristic B-O vibrations are observed. The proper crystals are obtained at a mole ratio of 3:1. This study shows that the simplified synthesis process is suitable for the production of copper borate minerals.

Keywords: hydrothermal synthesis, copper borates, copper sulfate, tincalconite

Procedia PDF Downloads 381
3515 Cd2+ Ions Removal from Aqueous Solutions Using Alginite

Authors: Vladimír Frišták, Martin Pipíška, Juraj Lesný

Abstract:

Alginite has been evaluated as an efficient pollution control material. In this paper, alginite from the maar at Pinciná (SR) was studied for the removal of Cd2+ ions from aqueous solution. The potential sorbent was characterized by X-ray fluorescence analysis (RFA) and Fourier transform infrared spectroscopy (FT-IR), and its specific surface area (SSA) was also determined. The sorption process was optimized with respect to the initial cadmium concentration and the pH value. The Freundlich and Langmuir models were used to interpret the sorption behaviour of Cd2+ ions, and the results showed that the experimental data were well fitted by the Langmuir equation. The maximal sorption capacity (QMAX) of alginite for Cd2+ ions calculated from the Langmuir isotherm was 34 mg/g. The sorption process was significantly affected by the initial pH value in the range 4.0-7.0. Alginite is a sorbent comparable with other materials for toxic metal removal.
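The Langmuir fit described above can be reproduced in outline with a nonlinear least-squares fit. The equilibrium data below are hypothetical values shaped to be consistent with the reported QMAX of about 34 mg/g, not the authors' measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Langmuir isotherm: q_e = Q_max * b * C_e / (1 + b * C_e)
def langmuir(C, Qmax, b):
    return Qmax * b * C / (1 + b * C)

# Hypothetical equilibrium concentrations (mg/L) and loadings (mg/g)
# with small synthetic scatter added.
C_e = np.array([5, 10, 25, 50, 100, 200], dtype=float)
q_e = langmuir(C_e, 34.0, 0.05) + np.array([0.3, -0.4, 0.5, -0.2, 0.4, -0.3])

popt, _ = curve_fit(langmuir, C_e, q_e, p0=[30.0, 0.01])
Qmax_fit, b_fit = popt
print(f"Q_max = {Qmax_fit:.1f} mg/g, b = {b_fit:.3f} L/mg")
```

The Freundlich model mentioned above can be compared against the same data by fitting q = K * C**(1/n) in the same way.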

Keywords: alginite, Cd2+, sorption, QMAX

Procedia PDF Downloads 359
3514 Functionalization of Single-Walled Nanotubes by Synthesized Pigments

Authors: Shahab Zomorodbakhsh, Hayron Nesa Motevasel

Abstract:

Water-soluble compounds were attached to single-walled carbon nanotubes (SWNTs) to form water-soluble nano pigments. The functionalized SWNTs were then characterized by Fourier transform infrared spectroscopy (FT-IR), Raman spectroscopy, UV analysis, transmission electron microscopy (TEM), and a defunctionalization test, with representative results concerning solubility. The product can be dissolved in water, and high-resolution transmission electron microscope images showed that the SWNTs were efficiently functionalized; the π-stacking interaction between the aromatic rings and the COOH groups of the SWNTs was considered responsible for the high solubility of the single-walled nanotubes.

Keywords: functionalized CNTs, single-walled carbon nanotubes, water-soluble compounds, nano pigments

Procedia PDF Downloads 321
3513 An Energy-Balanced Clustering Method on Wireless Sensor Networks

Authors: Yu-Ting Tsai, Chiun-Chieh Hsu, Yu-Chun Chu

Abstract:

In recent years, owing to the development of wireless network technology, many researchers have devoted themselves to the study of wireless sensor networks. Applications of wireless sensor networks mainly use sensor nodes to collect required information and send it back to the users. Since the sensed area is often difficult to reach, there are many restrictions on the design of the sensor nodes, the most important of which is their limited energy. Because of this limited energy, researchers have proposed a number of ways to reduce energy consumption and balance the load of sensor nodes in order to increase the network lifetime. In this paper, we propose the Energy-Balanced Clustering method with Auxiliary Members on Wireless Sensor Networks (EBCAM), based on cluster routing. The main purpose is to balance energy consumption over the sensed area and to even out the distribution of dead nodes, avoiding the excessive energy consumption caused by increasing transmission distance. In addition, we use the residual energy and the average energy consumption of the nodes within each cluster to choose the cluster heads, use multi-hop transmission to deliver the data, and dynamically adjust the transmission radius according to the load conditions. We also use auxiliary cluster members to change the delivery path according to the residual energy of the cluster head, in order to relieve its load. Finally, we compare the proposed method with related algorithms via simulated experiments and analyze the results, which reveal that the proposed method outperforms the other algorithms in the number of rounds completed and the average energy consumption.
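The cluster-head selection step described above, choosing heads from the nodes' residual energy and average energy consumption, can be sketched as follows. This is a simplified illustration rather than the EBCAM implementation; the node fields and the scoring rule are assumptions:

```python
def select_cluster_heads(nodes, n_heads):
    """Rank candidate nodes by residual energy divided by average
    per-round consumption (i.e., how many rounds of life each node
    still holds) and pick the top n_heads as cluster heads."""
    scored = sorted(nodes,
                    key=lambda n: n["residual"] / (n["avg_use"] + 1e-9),
                    reverse=True)
    return [n["id"] for n in scored[:n_heads]]

# Toy cluster: residual energy and average consumption per round (J).
nodes = [
    {"id": 0, "residual": 0.9, "avg_use": 0.02},
    {"id": 1, "residual": 0.5, "avg_use": 0.01},
    {"id": 2, "residual": 0.8, "avg_use": 0.04},
    {"id": 3, "residual": 0.3, "avg_use": 0.03},
]
print(select_cluster_heads(nodes, 2))  # → [1, 0]
```

Note that node 1 outranks node 0 despite lower residual energy, because its slower drain leaves it more remaining rounds; this is the load-balancing intuition behind combining the two quantities.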

Keywords: auxiliary nodes, cluster, load balance, routing algorithm, wireless sensor network

Procedia PDF Downloads 275
3512 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling

Authors: Masoud Safdari, Jacob Fish

Abstract:

Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific and closed-source, and demand a high level of experience and skill in both multiscale analysis and programming. Furthermore, the tools currently available for Atomistic-to-Continuum (AtC) multiscaling are developed under assumptions such as users having access to high-performance computing facilities. These and many other challenges have reduced the adoption of multiscale methods in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the widely acclaimed scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced in its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report the key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.

Keywords: atomistic, continuum, coupling, multiscale

Procedia PDF Downloads 177
3511 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational law rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools will perhaps serve to produce very meaningful results in terms of human rights. However, algorithms to be used should not be developed by only computer experts, but also need the contribution of people who are familiar with law, values, judicial decisions, and even the social and political culture of the society to which it will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the field of prediction of judicial decisions and decision support systems, have the capacity to create automatic decisions instead of judges. 
When the judge is removed from this equation, artificial-intelligence-made law, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful for integrating international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human-rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial-intelligence-made law would be more protective than a decision "support" system. Since the values of law are directed towards "human happiness or well-being", this requires that AI algorithms always be capable of serving this purpose, grounded in the rule of law, the principle of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 76
3510 Signs, Signals and Syndromes: Algorithmic Surveillance and Global Health Security in the 21st Century

Authors: Stephen L. Roberts

Abstract:

This article offers a critical analysis of the rise of syndromic surveillance systems for the advanced detection of pandemic threats within contemporary global health security frameworks. The article traces the iterative evolution and ascendancy of three such novel syndromic surveillance systems for the strengthening of health security initiatives over the past two decades: 1) the Program for Monitoring Emerging Diseases (ProMED-mail); 2) the Global Public Health Intelligence Network (GPHIN); and 3) HealthMap. This article demonstrates how each newly introduced syndromic surveillance system has become increasingly oriented towards integrating digital algorithms into core surveillance capacities, in order to continually harness, and forecast from, ever-expanding sets of digital, open-source data potentially indicative of forthcoming pandemic threats. This article argues that the increased centrality of the algorithm within these next-generation syndromic surveillance systems produces a new and distinct form of infectious disease surveillance for the governing of emergent pathogenic contingencies. Conceptually, the article also shows how the rise of this algorithmic mode of infectious disease surveillance produces divergences in the governmental rationalities of global health security, leading to the rise of an algorithmic governmentality within contemporary contexts of Big Data and these surveillance systems. Empirically, this article demonstrates how this new form of algorithmic infectious disease surveillance has been rapidly integrated into diplomatic, legal, and political frameworks to strengthen the practice of global health security, producing subtle yet distinct shifts in the outbreak notification and reporting transparency of states, which are increasingly scrutinized by the algorithmic gaze of syndromic surveillance.

Keywords: algorithms, global health, pandemic, surveillance

Procedia PDF Downloads 187