The example in rsf/tutorials/parameters reproduces the tutorial from Matt Hall on parameter testing. Madagascar users are encouraged to try improving the results.
In his blog post and in the discussion that follows, Matt raises an interesting question about how best to select parameters.
For lack of a better approach, parameter selection for seismic attributes remains an interactive game. In the Madagascar version, the key parameter for the Canny edge detector is the amount of prior anisotropic-diffusion smoothing, controlled by the smoothing radius (the rect= parameter). We can do different things with it: for example, make a movie that loops through different values of the radius or, by exposing the parameter on the SCons command line, build a simple GUI script for controlling it. The question posted by Matt still awaits a better answer.
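One way to scan the radius interactively is a plain shell loop over candidate values. This is only a sketch: horizon.rsf is a hypothetical input file, and sfsmooth (triangle smoothing, with rect1=/rect2= radii) stands in for whatever smoothing program the actual example uses:

```shell
#!/bin/sh
# Sketch: try several smoothing radii before edge detection.
# horizon.rsf is a hypothetical input; sfsmooth stands in for
# the anisotropic-diffusion smoothing program of the tutorial.
for r in 5 10 20 40; do
    sfsmooth < horizon.rsf rect1=$r rect2=$r > smooth$r.rsf
done
```

The resulting files can then be concatenated along a new axis with sfcat and viewed as a movie to pick the radius by eye.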
The example in rsf/tutorials/colormaps reproduces the tutorial from Matteo Niccoli on how to evaluate and compare color maps. The tutorial was published in the August 2014 issue of The Leading Edge. Madagascar users are encouraged to try improving the results.
Several new color palettes have been recently added to Madagascar (thanks to Aaron Stanton): color=seismic (red-yellow-white-black, popular among seismic interpreters), color=owb (orange-white-black), and color=rwb (red-white-black).
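To try one of the new palettes, pass it to sfgrey through its color= option. A minimal sketch, assuming a hypothetical input file data.rsf:

```shell
# Display a hypothetical dataset with the new seismic palette;
# sfpen (or another vplot pen) renders the resulting plot.
< data.rsf sfgrey color=seismic scalebar=y | sfpen
```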
The Software Sustainability Institute in the UK has created an online petition to “everyone in the research community”, which states “We must accept that software is fundamental to research, or we will lose our ability to make groundbreaking discoveries.”
1. We want software to be treated as a valuable research object that merits the same level of investment and effort as any other aspect of the research infrastructure.
2. We want researchers to be encouraged to spend time learning about software, because the value of that knowledge is understood to improve research.
3. We want the people who develop research software to be recognised and rewarded for their invaluable contribution to research.
4. We want a research environment in which software-reliant projects are encouraged to hire software developers, rather than having to hide these valuable staff members in anonymous postdoctoral positions.
5. Ultimately, we want the research community to recognise software's fundamental role in research.
You can sign the petition at Change.org.
sfsigmoid generates a 2-D synthetic reflectivity model, created by Jon Claerbout.
One of the first occurrences of this model is in the SEP-73 sponsor report from 1992, where it appeared in several papers:
- J. F. Claerbout, 1992, Introduction to Kirchhoff Migration Programs: SEP-73 report, 361-366, Stanford Exploration Project.
- J. F. Claerbout, 1992, Filling Data Gaps Using a Local Plane-Wave Model: SEP-73 report, 401-408, Stanford Exploration Project.
- J. F. Claerbout, 1992, Information from Smiles: Mono-Plane-Annihilator Weighted Regression: SEP-73 report, 409-420, Stanford Exploration Project.
- J. F. Claerbout, 1992, Crossline Regridding by Inversion: SEP-73 report, 421-428, Stanford Exploration Project.
The model was described as “a synthetic model that illustrates local variation in bedding. Notice dipping bedding, curved bedding, unconformity between them, and a fault in the curved bedding.” Later, the sigmoid model made an appearance in Claerbout’s book Basic Earth Imaging. The following example from bei/krch/sep73 illustrates the effect of aliasing on Kirchhoff modeling and migration:
The model has appeared in numerous other tests. The following example from tccs/flat/flat shows automatic flattening of the sigmoid model by predictive painting.
sfsigmoid has several parameters that control the model. The usual n1=, n2=, o1=, o2=, d1=, d2= parameters control the mesh size and sampling; taper= indicates whether to taper the sides of the model; large= controls the length of the synthetic reflectivity series. The program takes no input.
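A minimal invocation might look like the following sketch; the grid dimensions and the output file name are chosen only for illustration:

```shell
# Generate a sigmoid reflectivity model on a 400x100 grid
# (parameter values are illustrative) and display it.
sfsigmoid n1=400 n2=100 d2=0.008 > sigmoid.rsf
< sigmoid.rsf sfgrey title="Sigmoid model" | sfpen
```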
10 previous programs of the month:
A recent Report on High Performance Computing by the US Secretary of Energy Advisory Board contains a bizarre section on open source software, which states
There has been very little open source that has made its way into broad use within the HPC commercial community where great emphasis is placed on serviceability and security.
In his thoughtful blog post in response to this report, Will Schroeder, the CEO and co-founder of the legendary Kitware Inc., makes a number of strong points defending the role of open source in the past and future development of HPC. He concludes:
The basic point here is that issues of scale require us to remove inefficiencies in researching, deploying, funding, and commercializing technology, and to find ways to leverage the talents of the broader community. Open source is a vital, strategic tool to do this as has been borne out by the many OS software systems now being used in HPC application… It's easy to overlook open source as a vital tool to accomplish this important goal, but in a similar way that open source Linux has revolutionized commercial computing, open source HPC software will carry us forward to meet the demands of increasingly complex computing systems.
See also Will Schroeder’s presentation The New Scientific Publishers at SciPy-2013.