Morgan & Claypool:
This book is based on a set of 18 class-tested lectures delivered to fourth-year physics undergraduates at Griffith University in Brisbane, and it presents new discoveries by the Nobel Prize-winning LIGO collaboration. The author begins with a review of special relativity and tensors and then develops the basic elements of general relativity (a beautiful theory that unifies special relativity and gravitation via geometry) with applications to the gravitational deflection of light, global positioning systems, black holes, gravitational waves, and cosmology. The book provides readers with a solid understanding of the underlying physical concepts; an ability to appreciate and in many cases derive important applications of the theory; and a solid grounding for those wishing to pursue their studies further. General Relativity: An Introduction to Black Holes, Gravitational Waves, and Cosmology also connects general relativity with broader topics. There is no doubt that general relativity is an active and exciting field of physics, and this book successfully transmits that excitement to readers.
The first version of quantum theory, developed in the mid-1920s, is what is called nonrelativistic quantum theory; it is based on a form of relativity which, in a previous volume, was called Newton relativity. But soon after this first development, it was realized that, in order to account for high-energy phenomena such as particle creation, it was necessary to develop a quantum theory based on Einstein relativity. This in turn led to the development of relativistic quantum field theory, which is an intrinsically many-body theory.
But this is not the only possibility for a relativistic quantum theory. In this book we take the point of view of a particle theory, based on the irreducible representations of the Poincaré group, the group that expresses the symmetry of Einstein relativity. There are several ways of formulating such a theory; we develop what is called relativistic point form quantum mechanics, which, unlike quantum field theory, deals with a fixed number of particles in a relativistically invariant way.
A central issue in any relativistic quantum theory is how to introduce interactions without spoiling relativistic invariance. We show that interactions can be incorporated in a mass operator, in such a way that relativistic invariance is maintained. Surprisingly for a relativistic theory, such a construction allows for instantaneous interactions; in addition, dynamical particle exchange and particle production can be included in a multichannel formulation of the mass operator. For systems of more than two particles, however, straightforward application of such a construction leads to the undesirable property that clusters of widely separated particles continue to interact with one another, even if the interactions between the individual particles are of short range. A significant part of this volume deals with the solution of this problem.
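The construction described above can be sketched schematically (using generic notation from Bakamjian-Thomas-type point form constructions, not necessarily the book's own): the four-momentum operator factors into a mass operator times a free four-velocity operator, and interactions enter only through the mass operator:

```latex
P^{\mu} = M \, V^{\mu}_{\text{free}}, \qquad M = M_{\text{free}} + M_{\text{int}},
```

where relativistic invariance is maintained provided $M_{\text{int}}$ is a Lorentz scalar that commutes with $V^{\mu}_{\text{free}}$.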
Since relativistic quantum mechanics is not as well-known as relativistic quantum field theory, a chapter is devoted to applications of point form quantum mechanics to nuclear physics; in particular we show how constituent quark models can be used to derive electromagnetic and other properties of hadrons.
Renewable energy (RE) is a subject of great interest today. It is one of the two main means for implementing climate change mitigation programmes, and presently the only perceived means for replacing the declining global fossil fuel reserves. It also helps fight poverty and assists in the global quest for gender equity by taking clean energy where it is needed most for development.
It is perhaps not surprising, therefore, that there is so much coverage of RE in both the conventional media and on the internet by media and tech writers, economists, and bloggers, many of whom have only a partial understanding of the technology itself. The end result is mostly promotional rhetoric that says little about the true value of the technology and leads to a confused picture for the serious individual or decision-maker who wants to know what the technology is really capable of doing.
This book provides a clear and factual picture of the status of RE and its capabilities today. The need for such a book was first realized by the author when he was engaged in a renewable energy capacity-building project encompassing countries from Europe, the Caribbean, Africa, and the Pacific. The book is largely non-technical in nature; it does however contain enough mention of the science and technology to enable readers to go further with their own investigations should they wish to.
The book covers all areas of renewable energy (RE), starting from biomass energy and hydropower and proceeding to wind, solar and geothermal energy before ending with an overview of ocean energy. It begins with a simple introduction to the physical principles of the RE technologies, followed by an enumeration of the requirements for their successful implementation. The last two chapters consider how the technologies are actually being implemented today and their roles in climate change mitigation and poverty alleviation.
This book introduces zero-effort technologies (ZETs), an emerging class of technologies that require little or no effort from the people who use them. ZETs use advanced computing techniques, such as computer vision, sensor fusion, decision-making and planning, machine learning, and the Internet of Things to autonomously perform the collection, analysis, and application of data about the user and/or his/her context. This book begins with an overview of ZETs, then presents concepts related to their development, including pervasive intelligent technologies and environments, design principles, and considerations regarding use. The book discusses select examples of the latest in ZET development before concluding with thoughts regarding future directions of the field.
With the rapid development of the mobile Internet and smart personal devices in recent years, mobile search has gradually emerged as a key method by which users seek online information. Cross-device search has also recently been regarded as an important research topic. As more mobile applications (APPs) integrate search functions, a user's mobile search behavior on different APPs becomes more significant. This book provides a systematic review of current mobile search analysis and studies user mobile search behavior from several perspectives, including mobile search context, APP usage, and different devices. Two user experiments were conducted to collect user behavior data. Using data from mobile phone usage logs in natural settings, we analyze the mobile search strategies employed and offer a context-based mobile search task collection, which can then be used to evaluate mobile search engines. In addition, we combine mobile search with APP usage to provide more in-depth analysis, such as APP transitions in mobile search and follow-up actions triggered by mobile search. This combined study can contribute to the interaction design of APPs, such as search recommendation and APP recommendation. Addressing the phenomenon of users owning more smart devices today than ever before, we focus on cross-device search behavior. We model the information preparation and information resumption behaviors in cross-device search and evaluate cross-device search performance. Research on mobile search behaviors across different devices can help us understand online user information behavior comprehensively and help users resume their search tasks on different devices.
What can Human-Computer Interaction (HCI) learn from art? How can the HCI research agenda be advanced by looking at art research? How can we improve creativity support and the amplification of that important human capability? This book aims to answer these questions. Interactive art has become a common part of life as a result of the many ways in which the computer and the Internet have facilitated it. HCI is as important to interactive art as mixing the colours of paint is to painting. This book reviews recent work that looks at these issues through art research. In interactive digital art, the artist is concerned with how the artwork behaves, how the audience interacts with it, and, ultimately, how participants experience art as well as their degree of engagement. The values of art are deeply human and increasingly relevant to HCI as its focus moves from product design towards social benefits and the support of human creativity. The book examines these issues and brings together a collection of research results from art practice that illuminates this significant new and expanding area. In particular, this work points towards a much-needed critical language that can be used to describe, compare and frame research in HCI support for creativity.
DC-DC converters have many applications in the modern world. They provide the required power to the communication backbones, they are used in digital devices like laptops and cell phones, and they have widespread applications in electric cars, to just name a few.
DC-DC converters require negative feedback to provide a suitable output voltage or current for the load. Obtaining a stable output voltage or current in the presence of disturbances, such as input voltage changes and/or output load changes, is impossible without some form of control.
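The need for feedback can be illustrated with a toy example (a sketch of the general idea, not an example from the book): for an ideal buck converter in continuous conduction, the steady-state output is V_out = D·V_in, so a fixed duty cycle D cannot hold V_out constant when V_in changes, while even a simple integral controller can.

```python
# Toy illustration (not from the book): ideal steady-state buck converter,
# V_out = D * V_in. A fixed duty cycle cannot reject an input-voltage
# disturbance, but a simple integral controller adjusting D can.

def buck_vout(duty, vin):
    """Ideal steady-state buck converter output (continuous conduction)."""
    return duty * vin

V_REF = 5.0
duty = V_REF / 12.0                  # open-loop duty tuned for Vin = 12 V

# Open loop: the input sags from 12 V to 9 V and the output sags with it.
open_loop_vout = buck_vout(duty, 9.0)          # only 3.75 V instead of 5 V

# Closed loop: integral control trims the duty cycle to restore 5 V.
ki, d = 0.05, duty
for _ in range(500):
    vout = buck_vout(d, 9.0)
    d += ki * (V_REF - vout)                   # integrate the error
    d = min(max(d, 0.0), 1.0)                  # duty cycle stays in [0, 1]

closed_loop_vout = buck_vout(d, 9.0)           # converges back near 5 V
```

The gain `ki` and the 500-iteration loop are arbitrary choices for the sketch; a real converter model would include the switching dynamics the book extracts in Chapter 3.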
This book teaches the art of controller design for DC-DC converters. Chapter 1 introduces DC-DC converters briefly. It is assumed that the reader has basic knowledge of DC-DC converters (i.e., a basic course in power electronics).
The reader learns the disadvantages of open-loop control in Chapter 2. Simulation of DC-DC converters with the aid of Simulink® is discussed in this chapter as well. Extracting the dynamic models of DC-DC converters is studied in Chapter 3. We show how MATLAB® and a software package named KUCA can be used to carry out the cumbersome and error-prone process of modeling automatically. Obtaining the transfer functions using PSIM® is studied as well.
These days, software is an integral part of the engineering sciences, and control engineering is no exception. Keeping this in mind, we design the controllers using MATLAB® in Chapter 4.
Finally, references are provided at the end of each chapter to suggest further reading for interested readers. The intended audience for this book is practicing engineers and academics.
Each of us has views about education: how discipline should function, how individuals learn, how they should be motivated, what intelligence is, and the structure (content and subjects) of the curriculum. Perhaps the most important beliefs that (beginning) teachers bring with them are their notions about what constitutes good teaching. The scholarship of teaching requires that (beginning) teachers examine (evaluate) these views in the light of the knowledge currently available about curriculum and instruction, and decide their future actions on the basis of that analysis. Such evaluations are best undertaken when classrooms are treated as laboratories of inquiry (research) where teachers establish what works best for them.
Two instructor-centred and two learner-centred philosophies of knowledge, curriculum, and instruction are used to discern the fundamental (basic) questions that engineering educators should answer in respect of their own beliefs and practice. These philosophies point to a series of classroom activities that will enable educators to challenge their own beliefs and, at the same time, affirm, develop, or change their philosophies of knowledge, curriculum, and instruction.
The Fourier transform is one of the most fundamental tools for computing the frequency representation of signals. It plays a central role in signal processing, communications, audio and video compression, medical imaging, genomics, astronomy, as well as many other areas. Because of its widespread use, fast algorithms for computing the Fourier transform can benefit a large number of applications. The fastest algorithm for computing the Fourier transform is the Fast Fourier Transform (FFT), which runs in near-linear time, making it an indispensable tool for many applications. However, today, the runtime of the FFT algorithm is no longer fast enough, especially for big data problems where each dataset can be a few terabytes. Hence, faster algorithms that run in sublinear time, i.e., that do not even sample all the data points, have become necessary.
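The runtime gap described above can be made concrete with a minimal sketch (standard textbook material, not from the book): a naive O(n²) discrete Fourier transform versus a radix-2 Cooley-Tukey FFT, which attains the O(n log n) near-linear runtime. The input length must be a power of two for this sketch.

```python
# Naive O(n^2) DFT versus recursive radix-2 Cooley-Tukey FFT (O(n log n)).
# Standard textbook algorithms; the input length must be a power of two.
import cmath

def dft(x):
    """Direct evaluation of the DFT definition: O(n^2) operations."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: split into even/odd halves, recurse."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

signal = [1.0, 2.0, 0.0, -1.0, 1.5, 0.5, -0.5, 3.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(dft(signal), fft(signal)))
```

Both still read every input sample; the sparse Fourier transform algorithms this book develops go further by sampling only a subset of the data.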
This book addresses the above problem by developing the Sparse Fourier Transform algorithms and building practical systems that use these algorithms to solve key problems in six different applications: wireless networks; mobile systems; computer graphics; medical imaging; biochemistry; and digital circuits.
This is a revised version of the thesis that won the 2016 ACM Doctoral Dissertation Award.
As human activities moved to the digital domain, so did all the well-known malicious behaviors, including fraud, theft, and other trickery. There is no silver bullet, and each security threat calls for a specific answer. One specific threat is that applications accept malformed inputs; in many cases it is possible to craft inputs that let an intruder take full control over the target computer system.
The nature of systems programming languages lies at the heart of the problem. Rather than rewriting decades of well-tested functionality, this book examines ways to live with the (programming) sins of the past while shoring up security in the most efficient manner possible. We explore a range of different options, each making significant progress towards securing legacy programs from malicious inputs.
The solutions explored include enforcement-type defenses, which exclude certain program executions because they never arise during normal operation. Another strand explores the idea of presenting adversaries with a moving target that unpredictably changes its attack surface thanks to randomization. We also cover tandem execution ideas, where the compromise of one executing clone causes it to diverge from another, thus revealing adversarial activities. The main purpose of this book is to provide readers with some of the most influential works on run-time exploits and defenses. We hope that the material in this book will inspire readers and generate new ideas and paradigms.
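The tandem-execution idea can be sketched with a deliberately simplified toy (entirely hypothetical, not code from the book): two clones of a "program" differ only in memory layout, so a layout-dependent out-of-bounds write makes their outputs diverge, while benign inputs keep them in lockstep.

```python
# Toy sketch (hypothetical, not from the book) of tandem / multi-variant
# execution: two clones differ only in a layout padding, so an exploit that
# depends on memory layout makes their outputs diverge, while benign inputs
# keep the clones in lockstep.

def make_variant(padding):
    """Build a 'program clone' whose secret sits at a variant-specific offset."""
    def run(index):
        buf = [0] * (4 + padding) + ["SECRET"]  # secret lives past the buffer
        if index < len(buf):                     # toy out-of-bounds write
            buf[index] = 1
        return buf[-1]                           # the program's visible output
    return run

clone_a, clone_b = make_variant(2), make_variant(5)

benign, exploit = 2, 6
assert clone_a(benign) == clone_b(benign)        # clones agree: input passes
diverged = clone_a(exploit) != clone_b(exploit)  # monitor flags the attack
assert diverged
```

A real multi-variant system runs actual process clones under a kernel- or compiler-level monitor and compares their system-call streams, but the detection principle is the same divergence check.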
Nature continuously presents a huge number of complex and multi-scale phenomena which, in many cases, involve the presence of one or more fluids flowing, merging, and evolving around us. Since its appearance on the surface of the Earth, Mankind has tried to exploit and tame fluids for its purposes, probably starting with Hero's machinery to open the doors of the Temple of Serapis in Alexandria and arriving at modern propulsion systems and actuators. Today we know that fluid mechanics lies at the basis of countless scientific and technical applications, from the smallest physical scales (nanofluidics, bacterial motility, and diffusive flows in porous media) to the largest (from energy production in power plants to oceanography and meteorology). Deepening the understanding of fluid behaviour across scales is essential for the progress of Mankind and for a more sustainable and efficient future.
Since the very first years of the Third Millennium, the Lattice Boltzmann Method (LBM) has seen an exponential growth of applications, especially in the fields connected with the simulation of complex and soft matter flows. LBM, in fact, has shown a remarkable versatility in different fields of application, from nanoactive materials, free surface flows, and multiphase and reactive flows to the simulation of the processes inside engines and fluid machinery. LBM is based on an optimized formulation of Boltzmann's kinetic equation, which allows for the simulation of fluid particles, or rather quasi-particles, from a mesoscopic point of view, thus allowing the inclusion of more fundamental physical interactions than the standard schemes adopted in Navier-Stokes solvers, which are based on the continuum assumption.
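In its most common single-relaxation-time (BGK) form, the lattice Boltzmann scheme can be written as follows (standard notation, not specific to this book): discrete populations $f_i$ stream along a finite set of lattice velocities $\mathbf{c}_i$ and relax toward a local equilibrium,

```latex
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t)
  = f_i(\mathbf{x}, t)
  - \frac{\Delta t}{\tau}\,\bigl[\, f_i(\mathbf{x}, t) - f_i^{\text{eq}}(\mathbf{x}, t) \,\bigr],
```

with the macroscopic density and momentum recovered as moments, $\rho = \sum_i f_i$ and $\rho\mathbf{u} = \sum_i \mathbf{c}_i f_i$, and $\tau$ the relaxation time that sets the fluid viscosity.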
In this book, the authors present the most recent advances of the application of the LBM to complex flow phenomena of scientific and technical interest with particular focus on the multi-scale modeling of heterogeneous catalysis within nano-porous media and multiphase, multicomponent flows.
Gradiometry is a multidisciplinary area that combines theoretical and applied physics, ultra-low-noise electronics, precision engineering, and advanced signal processing. All physical fields have spatial gradients that fall with distance from their sources more rapidly than the field strength itself, which makes gradient measurements more difficult. However, there has been considerable investment, both in terms of time and money, in the development of various types of gradiometers, driven by the extremely valuable information contained in gradients. Applications include the search for oil, gas, and mineral resources, GPS-free navigation, defence, space missions, and medical research, among others.
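The faster falloff of gradients is easy to see for the simplest case (a textbook point-source example, not taken from the book): for a point mass $M$, the gravitational field strength and its radial gradient are

```latex
g(r) = \frac{GM}{r^{2}}, \qquad \frac{\partial g}{\partial r} = -\frac{2\,GM}{r^{3}},
```

so the gradient decays one power of $r$ faster than the field. This is precisely why gradient signals are weaker and harder to measure at range, yet more sharply localized around their sources.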
The author describes gravity gradiometers, magnetic gradiometers, and electromagnetic (EM) gradiometers. The first two types do not require any active sources of the primary physical fields whose gradients are measured, such as the gravity field and the ambient magnetic field. EM gradiometers do require a primary EM field, pulsed or sinusoidal, which propagates through media and creates a secondary EM field. The latter contains information about the non-uniformity of electromagnetically active media, such as conductivity and magnetic permeability contrasts. These anomalies mark the boundaries of mineral deposits, oil and gas traps, underground water reserves, buried artifacts, unexploded ordnance (UXO), nuclear submarines, and even cancerous human tissue.
This book provides readers with a comprehensive introduction to, history of, potential applications of, and current developments in some of the most advanced technologies of the 21st century. Most of the developments are strictly controlled by Defence Export Control rules and regulations, introduced in all developed countries, which typically require permission to transfer relevant information from one country to another. The book is therefore based on materials available in the public domain, such as scientific journals, conferences, extended abstracts, and online presentations. In addition, medical applications of EM gradiometers are exempt from such control, and some new results relevant to early breast cancer detection research are published in this book for the first time.
Our aim in this book is to present a bird's-eye view of microwave tubes (MWTs), which continue to be important despite competitive incursions from solid-state devices (SSDs). We present a broad, introductory survey that we hope readers will be encouraged to read, rather than working through lengthier books, before exploring the field of MWTs further in selected areas relevant to their respective interests. We hope the book will motivate newcomers to pursue research in MWTs and apprise them, as well as decision makers, of the salient features, prospects, and trends of progress in MWTs. The scope of ever-expanding applications of MWTs in the high-power, high-frequency regime will sustain and intensify research and development in MWTs in the coming years.
Computer vision has become increasingly important and effective in recent years due to its wide-ranging applications in areas as diverse as smart surveillance and monitoring, health and medicine, sports and recreation, robotics, drones, and self-driving cars. Visual recognition tasks, such as image classification, localization, and detection, are the core building blocks of many of these applications, and recent developments in Convolutional Neural Networks (CNNs) have led to outstanding performance in these state-of-the-art visual recognition tasks and systems. As a result, CNNs now form the crux of deep learning algorithms in computer vision.
This self-contained guide will benefit those who seek both to understand the theory behind CNNs and to gain hands-on experience applying CNNs in computer vision. It provides a comprehensive introduction to CNNs, starting with the essential concepts behind neural networks: training, regularization, and optimization. The book also discusses a wide range of loss functions, network layers, and popular CNN architectures; reviews different techniques for the evaluation of CNNs; and presents some popular CNN tools and libraries commonly used in computer vision. Further, it describes and discusses case studies related to the application of CNNs in computer vision, including image classification, object detection, semantic segmentation, scene understanding, and image generation.
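The core operation behind every convolutional layer can be sketched in a few lines (a textbook illustration, not code from the book): a small kernel slides over the input and produces a feature map. As in most deep-learning libraries, the operation implemented is technically cross-correlation.

```python
# Minimal sketch of the convolution at the heart of a CNN layer: a small
# kernel slides over a 2-D input, producing a "valid"-mode feature map.
# (Like most deep-learning libraries, this is technically cross-correlation.)

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A hand-crafted vertical-edge detector applied to an image with one edge;
# in a trained CNN, kernel weights like these are learned from data.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],
          [1, -1]]
feature_map = conv2d(image, kernel)   # responds only at the edge column
```

Real CNN layers add multiple input/output channels, padding, strides, and a nonlinearity, but the sliding weighted sum above is the basic building block.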
This book is ideal for undergraduate and graduate students, as no prior background knowledge in the field is required to follow the material, as well as for new researchers, developers, engineers, and practitioners who are interested in gaining a quick understanding of CNN models.
This book focuses on the methodologies, organization, and communication of digital image collection research that utilizes social media content. (Image is here understood as a cultural, conventional, and commercial—stock photo—representation.) The lecture offers expert views that provide different interpretations of images and their potential implementations. Linguistic and semiotic methodologies as well as eye-tracking research are employed to both analyze images and comprehend how humans consider them, including which salient features generally attract viewers' attention.
This literature review covers image—specifically photographic—research since 2005, when major social media platforms emerged. A citation analysis includes an overview of co-citation maps that demonstrate the nexus of image research literature and the journals in which they appear. Eye tracking tests whether scholarly templates focus on the proper features of an image, such as people, objects, time, etc., and if a prescribed theme affects the eye movements of the observer. The results may point to renewed requirements for building image search engines. As it stands, image management already requires new algorithms and a new understanding that involves text recognition and very large database processing.
The aim of this book is to present different image research areas and demonstrate the challenges image research faces. The book's scope is, by necessity, far from comprehensive: it does not cover fake news, image manipulation, mobile photos, etc.; these issues are very complex and need a publication of their own. This book should primarily be useful for students in library and information science, psychology, and computer science.